
AlexanderPowBI
Resolver I

Ingesting varchar(MAX) into the data warehouse fails

Hello,

 

I am trying to build a pipeline into Fabric for a table we have in our Azure SQL Database. One of the columns is varchar(MAX) and can be quite large (intentionally).

When I try to use the copy activity in a pipeline, it fails; it seems the warehouse has a varchar(8000) limit. Have I understood this correctly? Why does this limitation exist? How can I solve it?

 

 

//Alexander


11 REPLIES
Anonymous
Not applicable

Hi @AlexanderPowBI ,

Thanks for using Fabric Community.

Currently, Fabric's data warehouse has a limitation on varchar data types: while varchar(MAX) is supported in Azure SQL Database, Fabric Warehouse enforces a maximum varchar length of 8,000 characters.

This limitation is documented in Data types - Microsoft Fabric | Microsoft Learn.



Reasons for the Limitation:

There are a couple of reasons why Fabric might have this limitation:

  • Performance: Storing and processing very large strings can be resource-intensive. Limiting the size helps maintain optimal performance within the warehouse.
  • Data Model Design: Extremely large strings might indicate a poorly designed data model. Splitting the data into smaller chunks or using a separate table for large text fields can improve overall data management.

Solutions for Handling Large Text Fields:

Here are some options for handling your large varchar(MAX) column:

  1. Truncate the Data: If the data exceeding 8,000 characters isn't crucial, consider truncating it before loading into Fabric. You can achieve this during extraction with functions like LEFT or SUBSTRING in your SQL source query (see the sketch after this list).
  2. Split the Data: If the entire content is important, you can split the large string into smaller chunks and store them in separate columns. This might require some adjustments to your data model and querying logic.
  3. Use a Separate Table: Consider storing the large text field in a separate table with a foreign key relationship to the main table. This approach keeps your main table size manageable and simplifies querying for frequently used data points.
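To make option 1 concrete, here is a minimal sketch of truncating during extraction, written as PySpark for a Fabric notebook. The server, database, credentials, and the dbo.MyTable / BigText names are placeholders, not anything from the original thread, and the same LEFT(...) query could equally serve as the source query of a pipeline copy activity:

```python
# Minimal sketch (PySpark in a Fabric notebook, where `spark` is the ambient
# SparkSession): push the truncation down to Azure SQL so the column arrives
# within the warehouse's varchar(8000) limit. All connection details below
# are placeholders.
truncate_query = "SELECT Id, LEFT(BigText, 8000) AS BigText FROM dbo.MyTable"

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net;databaseName=<db>")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .option("query", truncate_query)
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
)
```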

Hope this is helpful. Please let me know in case of further queries.

Koen_Verbeeck

Even though the reasons for not having varchar(MAX) are valid, they are a poor excuse for not supporting it. Every other SQL product in the Microsoft family supports it, but not Fabric. This will make migrations harder. Sometimes we don't have a choice: for example, the source database has a column storing JSON data, which can easily exceed 8,000 characters, but there's no option to get this data into Fabric...

Anonymous
Not applicable

Hi @Koen_Verbeeck ,

We'd appreciate it if you could share this feedback on our feedback channel, where the user community can upvote and comment on it. This helps our product teams prioritize the request.

Hi,

Thank you for your response. As I cannot change how the data is set up in the app, I will need to handle it with one of your options. However, I thought of another option that might work: bringing the data into a lakehouse delta table. Or does the lakehouse have such a limitation as well?

 

thanks,

Alexander

Anonymous
Not applicable

Hi @AlexanderPowBI ,

Unfortunately, both have the same limitation; the lakehouse's SQL analytics endpoint also caps varchar at 8,000 characters.



But I suggest you give it a try. If that doesn't work, you can follow any of the solutions above.
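For reference, a minimal sketch of that try, assuming a Fabric notebook attached to the target lakehouse and a DataFrame df holding the full-length column (e.g. the JDBC read sketched earlier with SELECT Id, BigText instead of the LEFT(...) truncation); the table name is a placeholder:

```python
# Minimal sketch: land the source DataFrame in a lakehouse delta table.
# Delta itself does not enforce varchar lengths, so long strings survive
# the write; truncation only appears when the table is read through the
# SQL analytics endpoint.
df.write.format("delta").mode("overwrite").saveAsTable("big_text_table")
```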

Hope this answers your query. Please let me know in case of further queries.

Hi,

So I found it is possible to save records larger than the 8,000-character limit into a delta table in the lakehouse, but as you wrote, the SQL analytics endpoint will truncate them.

I went on and remodelled the column in a notebook to continue my work. Thank you for your answers!
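The post doesn't include the notebook code, but here is a minimal sketch of what such remodelling might look like; it is not Alexander's actual code, and it assumes a PySpark notebook, a delta table named big_text_table (from the write above), a column named BigText, and that no value exceeds three 8,000-character chunks:

```python
from pyspark.sql import functions as F

CHUNK = 8000   # varchar limit enforced by the SQL analytics endpoint
MAX_PARTS = 3  # assumption: the longest value fits in three chunks

# Read the delta table holding the full-length strings, then split the
# oversized column into fixed-size parts the endpoint can expose intact.
df = spark.read.table("big_text_table")
for i in range(MAX_PARTS):
    df = df.withColumn(
        f"BigText_part{i + 1}",
        F.substring("BigText", i * CHUNK + 1, CHUNK),
    )
df.drop("BigText").write.format("delta").mode("overwrite").saveAsTable(
    "big_text_table_split"
)
```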

Also, would switching to Cosmos DB have taken care of this issue?

Could you please elaborate on what you mean by 'remodelled the column in a notebook' with a snapshot of code from your notebook? That would help us learn something new.

Anonymous
Not applicable

Hi @AlexanderPowBI ,

Thanks for your response. I hope you got some insights while using the Fabric Community.
Please continue to use the Fabric Community for your further queries.

Anonymous
Not applicable

Hi @AlexanderPowBI ,

We haven't heard from you since the last response and wanted to check whether your query was answered.
If not, please reply and we will follow up with more details to help.

Thanks
