NicholasJackson
Frequent Visitor

"The current row is too large to write" when copying to LakeHouse

I have a simple Gen2 Dataflow that copies data from an on-prem SQL server via the gateway to a lakehouse. 
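
For context, each query is essentially the default SQL connector output with no extra transformations, roughly the following (server, database, and table names are just placeholders), with the lakehouse set as the data destination:

let
    // Connect to the on-prem SQL Server instance through the gateway
    Source = Sql.Database("MyOnPremServer", "MyDatabase"),
    // Navigate to a single table; no further transformation steps
    dbo_MyTable = Source{[Schema = "dbo", Item = "MyTable"]}[Data]
in
    dbo_MyTable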

 

When running the dataflow, I get this error message for all tables: 

 

Error Details: Error: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: The current row is too large to write. Details: GatewayObjectId: 7f52dde7-e378-4c36-8bcd-3edd76ac1a9e. Request ID: fb4b7dc8-c61c-9ae8-ad30-cf818125c796. (Request ID: 18fa9c00-6122-47c7-8a8a-50c7e88c457c).

 

The data I'm trying to pull is very small, sometimes only 3 columns and 2 rows. 

 

What could be causing this issue? 

10 REPLIES
Alex_Rajkov
Advocate III

Same issue here, in my case with a fresh workspace (created after the GA release) and with the staging artifacts untouched.

 

Since nearly 5-6 months have passed, is there a solution?

JonathanFlint
Frequent Visitor

Same issue here: the "current row is too large to write" error when loading data to a lakehouse via a Gen2 dataflow from an on-prem SQL Server (through the on-premises data gateway).

 

1. Set up a Gen2 dataflow with multiple queries against an on-prem SQL database.

2. Loaded the single output table to the lakehouse.

3. Added a second set of queries from a different on-prem SQL database to the same Gen2 dataflow; previews load fine, but the refresh fails when loading the newly added table to the lakehouse.

4. Removed the destination of the new table; the refresh still fails with the same error message.

 

None of the staging artifacts have been touched since creating the Gen 2 DF

 

I highly encourage you to raise a support ticket so an engineer can take a closer look at your scenario. You can use the link below:

https://support.fabric.microsoft.com/support

bcdobbs
Super User

Did you find a solution to this?




I did not; I still have the issue.

miguel
Community Admin

This could be a symptom of a known issue that we're tracking. Could you please check this article? 

Known issue - Staging artifacts aren't available or are misconfigured - Microsoft Fabric | Microsoft...

I initially had this issue for that reason and had to recreate the workspace once I realized what was happening. However, I'm now encountering this issue without having made any changes to the default staging artifacts.

I have not modified or touched any of the staging artifacts. Is that a prerequisite for the issue? 

We're investigating this case internally. In the meantime, please go ahead and raise a support ticket with the support team.

I get the same error on a query that takes about 2 minutes to run in Excel and about 10 seconds in Power Query Online to preview the first 100 rows. None of the tables are staged, but when I try writing to a Warehouse it says:

 

Mashup Exception Data Source Error Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Error in replacing table's content with new data in a version: #{0}. Details: Message = The current row is too large to write.;Message.Format = The current row is too large to write.
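
If it helps anyone narrow this down, here is a rough Power Query sketch (server, database, and table names are placeholders) that estimates how wide each row is when all of its columns are rendered as text, just to rule out an unexpectedly large row coming from the source:

let
    Source = Sql.Database("MyServer", "MyDatabase"),
    MyTable = Source{[Schema = "dbo", Item = "MyTable"]}[Data],
    // Rough per-row size estimate: total characters across all columns when converted to text
    AddApproxSize = Table.AddColumn(
        MyTable,
        "ApproxRowChars",
        each List.Sum(
            List.Transform(
                Record.FieldValues(_),
                (v) => if v = null then 0 else Text.Length(Text.From(v, "en-US"))
            )
        ),
        Int64.Type
    ),
    // Show the widest rows first
    Sorted = Table.Sort(AddApproxSize, {{"ApproxRowChars", Order.Descending}})
in
    Sorted

If nothing here is unusually large, that at least suggests the error is coming from the staging/destination side rather than from the data itself.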
