Hello,
I am trying to build a pipeline to copy a table from our Azure SQL Database into Fabric. One of the columns is varchar(MAX) and can be quite large (intentionally).
When I try to use the Copy activity in the pipeline, it fails, as it seems the warehouse has a limitation of varchar(8000). Have I understood this correctly? Why does this limitation exist? How can I solve this issue?
//Alexander
Hi @AlexanderPowBI ,
Thanks for using Fabric Community.
Currently, Fabric's data warehouse has a limitation on varchar data types. While varchar(MAX) is supported in Azure SQL Database, Fabric Warehouse enforces a maximum length of 8,000 characters for varchar columns.
This limitation is documented here: Data types - Microsoft Fabric | Microsoft Learn
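For illustration, this is roughly what the constraint looks like in T-SQL against a Warehouse (behaviour at the time of this thread; the table and column names are made up):

```sql
-- Fabric Warehouse T-SQL (illustrative only; dbo.Documents and its columns are hypothetical).
-- Declaring varchar(max) is rejected, so this statement fails:
-- CREATE TABLE dbo.Documents (Id int NOT NULL, Payload varchar(max));

-- The largest length you can declare is 8000:
CREATE TABLE dbo.Documents (Id int NOT NULL, Payload varchar(8000));
```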
Reasons for the Limitation:
There are a couple of reasons why Fabric might have this limitation:
Solutions for Handling Large Text Fields:
Here are some options for handling your large varchar(MAX) column:
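For example, if losing the tail of the value is acceptable, one option is to truncate it explicitly in the Copy activity's source query; a rough T-SQL sketch with hypothetical names:

```sql
-- Azure SQL side: a possible source query for the Copy activity
-- (dbo.SourceTable, Id and BigText are hypothetical names).
-- Truncates the value so it fits a varchar(8000) column in the Warehouse,
-- and keeps a flag so you can tell which rows were cut.
SELECT
    Id,
    LEFT(BigText, 8000)                             AS BigTextTruncated,
    CASE WHEN LEN(BigText) > 8000 THEN 1 ELSE 0 END AS WasTruncated
FROM dbo.SourceTable;
```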
Hope this is helpful. Please let me know in case of further queries.
Even though the reasons for not having varchar(max) are valid, they are a poor excuse for not supporting it. Every other SQL product in the Microsoft family supports it, but not Fabric. This will make migrations harder. Sometimes we don't have a choice. For example, the source database has a column storing JSON data, which can easily go over 8000 characters. But there's no option to get this data into Fabric...
Hi @Koen_Verbeeck ,
We would appreciate it if you could share this feedback on our feedback channel, where the user community can upvote and comment on it. This helps our product teams prioritize the request.
There's already an idea for this:
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=b3d67673-dffe-ed11-a81c-000d3ae62779
Hi,
Thank you for your response. As I cannot change how the data is set up in the app, I will need to handle it with one of your options. However, I thought of another option that might work: bringing the data into a Lakehouse delta table. Or does the Lakehouse have such a limitation as well?
thanks,
Alexander
Hi @AlexanderPowBI ,
Unfortunately, both the Warehouse and the Lakehouse SQL analytics endpoint have the same limitation.
But I suggest you give it a try. If that doesn't work, you can follow any of the above solutions.
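One way to check, once the copy has run, is to compare value lengths in the Delta table with what the SQL analytics endpoint returns; a small Spark SQL sketch with made-up names:

```sql
-- Spark SQL in a Fabric notebook (my_lakehouse.source_table and big_text are hypothetical).
-- If the Delta table kept the full value, this reports lengths well above 8000,
-- while querying the same table through the SQL analytics endpoint shows values
-- cut off at 8,000 characters.
SELECT max(length(big_text)) AS longest_value
FROM my_lakehouse.source_table;
```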
Hope this answers your query. Please let me know in case of further queries.
Hi,
So I found it possible to save records larger than the 8,000-character limit into a delta table in the Lakehouse, but as you wrote, the SQL analytics endpoint will truncate them.
I have since remodelled the column in a notebook to continue my work. Thank you for your answers!
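A minimal sketch of what such a remodelling step could look like (Spark SQL in a Fabric notebook; the splitting approach and all table and column names here are assumptions, not Alexander's actual code):

```sql
-- Spark SQL in a Fabric notebook (illustrative only; names are hypothetical).
-- Reads the Lakehouse Delta table, which keeps the full string, and splits the long
-- column into 8000-character pieces so nothing is lost when the table is queried
-- through the SQL analytics endpoint.
CREATE OR REPLACE TABLE my_lakehouse.remodelled_table AS
SELECT
    id,
    substring(big_text, 1,     8000) AS big_text_part1,
    substring(big_text, 8001,  8000) AS big_text_part2,
    substring(big_text, 16001, 8000) AS big_text_part3
FROM my_lakehouse.source_table;
```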
Also, would switching to Cosmos DB have taken care of this issue?
Could you please elaborate on what you mean by 'remodelled the column in a notebook', with a snapshot of code from your notebook? It would help us learn something new.
Hi @AlexanderPowBI ,
Thanks for your response. I hope you got some insights while using Fabric Community.
Please continue to use Fabric Community for your further queries.
Hi @AlexanderPowBI ,
We haven't heard from you since the last response and were just checking back to see if your query was answered.
If not, please reply with more details and we will try to help.
Thanks