Bhargava05
Regular Visitor

Error While Refreshing Dataflow with Lakehouse Delta Tables - Stack Overflow Error / Mashup Exception

I am experiencing a recurring issue with Dataflow Gen2 in Microsoft Fabric. When the data source is Azure Data Lake Storage (ADLS) or Lakehouse flat files, the Dataflow refreshes fine. However, when the source is a Delta table in the Lakehouse, the refresh fails with the following error:

"There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Evaluation resulted in a stack overflow and cannot continue. Details: '. Error code: Mashup Exception Error"

I also see a second error:

"_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Evaluation ran out of memory and can't continue. Details: '. Error code: EvaluationOutOfMemoryError"

 

Please let me know if you have any ideas on this.

1 ACCEPTED SOLUTION
v-yilong-msft
Community Support

Hi @Bhargava05 ,

Your error report contains two error messages: a Stack Overflow Error and an Out of Memory Error.

 

For the first error, the Stack Overflow Error: this usually happens when a query uses too much recursion or builds too deep a call stack. You may need to simplify the query or break it into smaller, more manageable parts.
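For illustration, here is a minimal Power Query M sketch of one way to avoid deep recursion: replacing a recursive helper with List.Accumulate, which iterates without growing the call stack. The table, column, and value names are made up for the example and only stand in for whatever your own query does.

// Iterative running total with List.Accumulate instead of a recursive function,
// so the mashup engine never builds a deep call stack.
let
    Source = #table({"Amount"}, {{10}, {25}, {5}}),   // hypothetical input table
    Amounts = Source[Amount],
    RunningTotals = List.Accumulate(
        Amounts,
        {},   // start with an empty list of totals
        (state, current) =>
            state & {(if List.IsEmpty(state) then 0 else List.Last(state)) + current}
    ),
    Result = Table.FromColumns({Amounts, RunningTotals}, {"Amount", "RunningTotal"})
in
    Result

The same idea applies to long step chains: instead of one query with a very long sequence of dependent steps, split it into a staged query plus a second query that references it.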

 

For the second error, the Out of Memory Error: this happens when the dataflow runs out of memory during evaluation. Try to reduce the amount of data processed at each step: filter out unnecessary columns and rows early in the query, break the transformation into smaller steps and stage the intermediate results, and, if possible, allocate more memory to the dataflow process.
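As a concrete (but hedged) example of filtering early, here is a small M sketch. FactSales, OrderDate, Region, and Amount are hypothetical names; FactSales stands in for whatever navigation steps the Lakehouse connector generates for your Delta table.

let
    Source = FactSales,   // hypothetical query over the Lakehouse Delta table
    // Drop columns the destination does not need, as early as possible
    KeptColumns = Table.SelectColumns(Source, {"OrderDate", "Region", "Amount"}),
    // Push row filters up front so every later step works on a smaller table
    RecentRows = Table.SelectRows(KeptColumns, each [OrderDate] >= #date(2024, 1, 1)),
    // Set explicit types so downstream steps do not have to re-infer them
    Typed = Table.TransformColumnTypes(
        RecentRows,
        {{"OrderDate", type date}, {"Region", type text}, {"Amount", type number}}
    )
in
    Typed

In Dataflow Gen2 you can also enable staging on a reduced query like this and have a second query reference it, so the heavy reduction work is materialized once instead of being re-evaluated in memory.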

 

In addition, you can check the following:

1. Make sure the schema of the Delta table matches the schema the Dataflow expects (see the schema-check sketch after this list).

2. Make sure your Data Gateway is up to date.
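For point 1, here is a hedged M sketch of a quick schema check you could put in front of the destination step; the query name DeltaCustomerTable and the expected column names are hypothetical and should be adjusted to your own schema.

let
    Source = DeltaCustomerTable,   // hypothetical query over the Lakehouse Delta table
    // Column names the Dataflow destination expects (adjust to your own schema)
    Expected = {"CustomerId", "CustomerName", "SignupDate"},
    Actual = Table.ColumnNames(Source),
    Missing = List.Difference(Expected, Actual),
    // Fail with a readable message instead of an opaque refresh-time mashup error
    Result =
        if List.IsEmpty(Missing)
        then Table.SelectColumns(Source, Expected)
        else error Error.Record("SchemaMismatch", "Missing columns: " & Text.Combine(Missing, ", "))
in
    Result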

 

 

Best Regards

Yilong Zhou

If this post helps, please consider accepting it as the solution to help other members find it more quickly.

 

 


3 REPLIES
Bhargava05
Regular Visitor

Hi @v-yilong-msft,

Thanks for your reply! I appreciate your suggestions regarding the stack overflow and memory issues. However, I noticed that the Dataflow refresh works perfectly fine when the source is Azure Data Lake Storage (ADLS) or Lakehouse flat files, but the issue only arises when the source is a Delta table in the Lakehouse.

Could this behavior suggest a specific incompatibility or an additional configuration needed when using Delta tables as data sources in Dataflows? Is there any difference in how Dataflows handle Delta tables versus flat files, particularly in how they interact with the refresh process?

Any further insight into this would be greatly appreciated!

Best regards,

Bhargava05
Regular Visitor

@v-yilong-msft  Thank you for the response. I understand the possible reasons you’ve mentioned for the errors, such as Stack Overflow and Out of Memory issues. However, I still have some concerns and questions:

  • The Dataflow works perfectly when the data source is either ADLS flat files or Lakehouse flat files. Why does the process fail specifically when switching to Lakehouse Delta tables as the source?
  • The query complexity and the schema remain the same across these scenarios. Does Delta introduce additional overhead or incompatibility in Dataflow Gen2 compared to flat file sources?
  • Also, the Delta table schema is already aligned with the expected schema in Dataflow, and my Data Gateway is up-to-date, so these factors shouldn’t be contributing to the errors.

I’m still curious why such issues occur only with Delta tables in Lakehouse, despite other sources working smoothly. Could this be a specific limitation or a compatibility issue between Dataflow Gen2 and Lakehouse Delta tables?

Looking forward to your insights on this!

