March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
I have a simple Gen2 Dataflow that copies data from an on-prem SQL server via the gateway to a lakehouse.
When running the dataflow, I get this error message for all tables:
Error Details: Error: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: The current row is too large to write. Details: GatewayObjectId: 7f52dde7-e378-4c36-8bcd-3edd76ac1a9e. Request ID: fb4b7dc8-c61c-9ae8-ad30-cf818125c796. (Request ID: 18fa9c00-6122-47c7-8a8a-50c7e88c457c).
The data I'm trying to pull is very small, sometimes only 3 columns and 2 rows.
What could be causing this issue?
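One hedged hypothesis (not confirmed by the error text itself) is that the failure depends on the *declared* maximum width of the row rather than the actual data: types like NVARCHAR(MAX) may be treated as maximally wide by whatever intermediate writer produces the staged output. A minimal sketch of that idea, using hypothetical per-type byte widths and an illustrative 4 MB row limit (neither is a documented Fabric internal):

```python
# Sketch: estimate the *declared* byte width of a row from its column
# definitions, to show how a "small" table (3 columns, 2 rows) can still
# exceed a writer's row-size limit. All widths and the 4 MB limit are
# illustrative assumptions, not documented Fabric behavior.

# Declared max bytes per SQL Server type (illustrative values;
# nvarchar uses 2 bytes per character).
DECLARED_WIDTH = {
    "int": 4,
    "bigint": 8,
    "nvarchar(100)": 200,
    "nvarchar(max)": 2**30,  # effectively unbounded; assume a 1 GiB cap
}

ROW_LIMIT_BYTES = 4 * 1024 * 1024  # hypothetical 4 MB writer limit

def declared_row_width(column_types):
    """Sum the declared maximum width of each column in the row."""
    return sum(DECLARED_WIDTH[t] for t in column_types)

# A 3-column table whose *actual* rows are tiny can still be declared huge:
small_table = ["int", "nvarchar(100)", "nvarchar(max)"]
print(declared_row_width(small_table) > ROW_LIMIT_BYTES)  # True
```

If this hypothesis holds, casting wide source columns to a bounded length (e.g. NVARCHAR(4000)) in the source query would be one thing worth trying.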
Same issue here, in my case with a fresh workspace (created after the GA release); the staging artifacts are untouched.
Since nearly 5-6 months have passed, is there a solution?
Same issue here: "current row is too large to write" error when loading data to a lakehouse via a Gen 2 dataflow from on-prem SQL (through the on-premises gateway).
1. Set up a Gen 2 dataflow with multiple queries against an on-prem SQL server
2. Loaded the single output table to the lakehouse successfully
3. Added a second set of queries from a different on-prem SQL server to the same Gen 2 dataflow; previews load fine, but the refresh fails to load the newly added table to the lakehouse
4. Removing the destination of the new table doesn't help; the refresh still fails with the same error message
None of the staging artifacts have been touched since creating the Gen 2 DF
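When triaging this, it can help to flag source columns with very large declared lengths (in SQL Server, `INFORMATION_SCHEMA.COLUMNS` reports `CHARACTER_MAXIMUM_LENGTH` as `-1` for `(MAX)` types). A small sketch over metadata rows you might have pulled from the source; the sample table and the 4000-character threshold are hypothetical:

```python
# Sketch: flag suspiciously wide columns from SQL Server metadata.
# Input tuples mimic (COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH)
# from INFORMATION_SCHEMA.COLUMNS, where -1 denotes (MAX) types.

def flag_wide_columns(columns, max_chars=4000):
    """Return (name, type) for columns declared wider than max_chars,
    or declared unbounded (-1)."""
    wide = []
    for name, type_name, char_max in columns:
        if char_max is not None and (char_max == -1 or char_max > max_chars):
            wide.append((name, type_name))
    return wide

# Hypothetical metadata for one source table:
metadata = [
    ("Id", "int", None),
    ("Name", "nvarchar", 100),
    ("Notes", "nvarchar", -1),     # nvarchar(max)
    ("Payload", "varbinary", -1),  # varbinary(max)
]
print(flag_wide_columns(metadata))  # [('Notes', 'nvarchar'), ('Payload', 'varbinary')]
```

Any columns flagged this way are candidates to cast to a bounded type in the source query before the dataflow writes to the lakehouse.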
I highly encourage you to raise a support ticket so an engineer can take a closer look at your scenario. You can use the link below to raise a new support ticket:
I did not; I still have the issue.
This could be a symptom of a known issue that we're tracking. Could you please check this article?
I had this issue initially for that reason, and I had to recreate the workspace once I realized what was happening. However, I'm now encountering the issue again without having made any changes to the default staging artifacts.
I have not modified or touched any of the staging artifacts. Is that a prerequisite for the issue?
Internally we're investigating this case. At this moment, please go ahead and raise a support ticket by reaching out to the support team.
I get the same error on a query that takes 2 minutes in Excel, and about 10 seconds in Power Query Online to preview the first 100 rows. None of the tables are staged, but when I try writing to a Warehouse it says:
Mashup Exception Data Source Error Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Error in replacing table's content with new data in a version: #{0}. Details: Message = The current row is too large to write.;Message.Format = The current row is too large to write.