
AlexanderPowBI
Advocate I

Ingesting varchar(MAX) into the data warehouse fails

Hello,

I am trying to build a pipeline to bring a table from our Azure SQL Database into Fabric. One of the columns is varchar(MAX) and can be quite large (intentionally).

When I try to use the Copy activity in the pipeline, it fails; the warehouse appears to have a varchar(8000) limit. Have I understood this correctly? Why does this limitation exist? How can I solve this issue?

//Alexander


9 REPLIES
v-gchenna-msft
Community Support

Hi @AlexanderPowBI ,

Thanks for using Fabric Community.

Currently, Fabric's data warehouse has a limitation on varchar data types. While varchar(MAX) is supported in Azure SQL Database, Fabric Warehouse caps varchar fields at a maximum length of 8,000 characters.

This limitation is documented in Data types - Microsoft Fabric | Microsoft Learn.

(Screenshot: the data types table from the Microsoft Fabric documentation.)
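To make the failure concrete, here is a minimal sketch against a Fabric Warehouse; the table and column names (dbo.Documents, Body) are hypothetical stand-ins:

-- Fabric Warehouse rejects varchar(MAX) at CREATE TABLE time;
-- the declared length must be 8000 or less.
CREATE TABLE dbo.Documents
(
    Id   int,
    Body varchar(8000)  -- declaring varchar(MAX) here fails with a data type error
);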


Reasons for the Limitation:

There are a couple of reasons why Fabric might have this limitation:

  • Performance: Storing and processing very large strings can be resource-intensive. Limiting the size helps maintain optimal performance within the warehouse.
  • Data Model Design: Extremely large strings might indicate a poorly designed data model. Splitting the data into smaller chunks or using a separate table for large text fields can improve overall data management.

Solutions for Handling Large Text Fields:

Here are some options for handling your large varchar(MAX) column:

  1. Truncate the Data: If the data exceeding 8,000 characters isn't crucial, consider truncating it before loading into Fabric. You can achieve this during the data extraction process using functions like LEFT or SUBSTRING in your SQL query (options 1 and 2 are sketched in T-SQL after this list).
  2. Split the Data: If the entire content is important, you can split the large string into smaller chunks and store them in separate columns. This might require some adjustments to your data model and querying logic.
  3. Use a Separate Table: Consider storing the large text field in a separate table with a foreign key relationship to the main table. This approach keeps your main table size manageable and simplifies querying for frequently used data points.
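As a rough illustration of options 1 and 2, here is a T-SQL sketch for the source-side query of the Copy activity; the table dbo.Documents and its Id and Body columns are hypothetical:

-- Option 1: truncate at extraction time so the value fits varchar(8000).
SELECT
    Id,
    LEFT(Body, 8000) AS Body
FROM dbo.Documents;

-- Option 2: split the value into fixed-size chunks across several columns
-- (extend the pattern to cover the longest expected value).
SELECT
    Id,
    SUBSTRING(Body, 1, 8000)     AS Body_1,
    SUBSTRING(Body, 8001, 8000)  AS Body_2,
    SUBSTRING(Body, 16001, 8000) AS Body_3
FROM dbo.Documents;

Option 3 would instead move the large column into its own table keyed by Id, so the wide values live apart from the frequently queried columns.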

Hope this is helpful. Please let me know in case of further queries.

Koen_Verbeeck

Even though the reasons for not having varchar(MAX) are valid, they are a poor excuse for not supporting it. Every other SQL product in the Microsoft family supports it, but not Fabric. This will make migrations harder. Sometimes we don't have a choice. For example, the source database has a column storing JSON data, which can easily go over 8,000 characters. But there's no option to get this data into Fabric...

Hi @Koen_Verbeeck ,

We would appreciate it if you could share this feedback on our feedback channel, where the user community can upvote and comment on it. This helps our product teams prioritize the request.

Hi,

Thank you for your response. As I cannot change how the data is set up in the app, I will need to handle it with one of your options. However, I thought of another option that might work, and that is to bring the data into a lakehouse delta table. Or does the lakehouse have such a limitation as well?

thanks,

Alexander

Hi @AlexanderPowBI ,

Unfortunately, both have the same limitation:

(Screenshot: documentation showing the same varchar limitation for the lakehouse.)


But I suggest you give it a try. If that doesn't work, you can follow any of the solutions above.

Hope this answers your query. Please let me know in case of further queries.
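If you do try the lakehouse route, a quick Spark SQL probe in a notebook (table and column names hypothetical) can show what actually landed; running the same query through the SQL analytics endpoint would report at most 8,000 if the endpoint truncates the value, as confirmed later in this thread:

-- Spark SQL in a Fabric notebook: check whether the delta table
-- itself keeps the full value.
SELECT max(length(body)) AS longest_value
FROM documents;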

ACCEPTED SOLUTION

Hi,

So I found that it is possible to save records larger than the 8,000 limit into a delta table in the lakehouse, but, as you wrote, the SQL analytics endpoint will truncate them.

I have since remodelled the column in a notebook to continue my work. Thank you for your answers!
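For reference, the remodelling step might look something like this Spark SQL sketch in a notebook (table and column names are hypothetical; the 8,000-character chunk width matches the endpoint's varchar limit):

-- Spark SQL in a Fabric notebook: rebuild the wide column as fixed-size
-- chunks so the SQL analytics endpoint can expose the full content.
CREATE OR REPLACE TABLE documents_split AS
SELECT
    id,
    substring(body, 1, 8000)     AS body_1,
    substring(body, 8001, 8000)  AS body_2,
    substring(body, 16001, 8000) AS body_3
FROM documents;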

Hi @AlexanderPowBI ,

Thanks for your response. I hope you got some insights while using the Fabric Community.
Please continue using the Fabric Community for your further queries.

Hi @AlexanderPowBI ,

We haven't heard from you since the last response and wanted to check back to see if your query was answered.
If not, please reply and we will try to help with more details.

Thanks
