I am experiencing a recurring issue with Dataflow Gen2 in Microsoft Fabric. When the data source is Azure Data Lake Storage (ADLS) or Lakehouse flat files, the Dataflow works fine. However, when the source is a Delta table in the Lakehouse, the refresh fails with the following error: "There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Evaluation resulted in a stack overflow and cannot continue. Details: '. Error code: Mashup Exception Error".
I also see a second error: _WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Evaluation ran out of memory and can't continue. Details: '. Error code: EvaluationOutOfMemoryError.
Please let me know if you have any ideas on this.
Hi @Bhargava05 ,
From your error report there are two distinct errors: a stack overflow error and an out-of-memory error.
For the first error, the stack overflow, this usually happens when a query involves too much recursion or too deep a call stack, so you may need to simplify the query or break it into smaller, more manageable parts.
For the second error, the out-of-memory failure, this happens when the dataflow runs out of memory during evaluation. Try to reduce the amount of data processed at each step: filter out unnecessary columns and rows early in the query, break the transformation into smaller steps and stage the intermediate results, and, if possible, allocate more memory to the dataflow process. A rough idea of what early filtering can look like is shown in the sketch below.
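For reference, here is a minimal Power Query M sketch of early column and row filtering against a Lakehouse Delta table. The workspace, lakehouse, table, and column names are placeholders, and the navigation keys may differ from what the Lakehouse connector generates in your environment:

```
let
    // Connect to the Fabric Lakehouses available to the dataflow.
    Source = Lakehouse.Contents(null),
    // Navigate to the workspace, lakehouse, and Delta table.
    // "MyWorkspace", "MyLakehouse", and "MyDeltaTable" are placeholder names.
    Workspace = Source{[Name = "MyWorkspace"]}[Data],
    MyLakehouse = Workspace{[Name = "MyLakehouse"]}[Data],
    DeltaTable = MyLakehouse{[Name = "MyDeltaTable"]}[Data],
    // Keep only the columns that are actually needed, as early as possible.
    KeepColumns = Table.SelectColumns(DeltaTable, {"OrderId", "OrderDate", "Amount"}),
    // Filter rows early so that later steps evaluate less data.
    RecentRows = Table.SelectRows(KeepColumns, each [OrderDate] >= #date(2024, 1, 1))
in
    RecentRows
```

The idea is simply to push column and row reduction to the very first steps of the query, before any joins, merges, or other heavy transformations.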
In addition to this, I think you can do the following:
1. Make sure the schema of the Delta table matches the schema the Dataflow expects (a schema-check sketch follows this list).
2. Make sure your Data Gateway is up to date.
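For point 1, one rough way to inspect the Delta table's schema from inside the dataflow is Table.Schema. Again, the names below are placeholders reusing the hypothetical navigation from the earlier sketch:

```
let
    // Same placeholder navigation as in the sketch above.
    Source = Lakehouse.Contents(null),
    Workspace = Source{[Name = "MyWorkspace"]}[Data],
    MyLakehouse = Workspace{[Name = "MyLakehouse"]}[Data],
    DeltaTable = MyLakehouse{[Name = "MyDeltaTable"]}[Data],
    // Table.Schema lists each column's name and type, which you can compare
    // against the schema the dataflow (or its data destination) expects.
    Schema = Table.Schema(DeltaTable),
    NameAndType = Table.SelectColumns(Schema, {"Name", "TypeName", "Kind"})
in
    NameAndType
```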
Best Regards
Yilong Zhou
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Thanks for your reply! I appreciate your suggestions regarding the stack overflow and memory issues. However, I noticed that the Dataflow refresh works perfectly fine when the source is Azure Data Lake Storage (ADLS) or Lakehouse flat files, but the issue only arises when the source is a Delta table in the Lakehouse.
Could this behavior point to a specific incompatibility, or to additional configuration needed when using Delta tables as data sources in Dataflows? Is there any difference in how Dataflows handle Delta tables versus flat files, particularly in how they interact with the refresh process?
Any further insight into this would be greatly appreciated!
Best regards,
@v-yilong-msft Thank you for the response. I understand the possible reasons you’ve mentioned for the errors, such as Stack Overflow and Out of Memory issues. However, I still have some concerns and questions:
I'm still curious why these issues occur only with Delta tables in the Lakehouse while other sources work smoothly. Could this be a specific limitation or compatibility issue between Dataflow Gen2 and Lakehouse Delta tables?
Looking forward to your insights on this!