Dataflow Gen2 crashes with "Evaluation ran out of memory and can't continue", "We're sorry, an error occurred during evaluation", or "Evaluation resulted in a stack overflow and cannot continue" while processing a fairly simple transformation on a large data set. The transformation works fine on 300,000 records but crashes with 290,000,000 records, so I tried reducing the data down to just the last 30 days, which still crashes. This is running on an F64 capacity, which I would expect to have plenty of horsepower to handle this transformation.
Complete errors:
There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Evaluation ran out of memory and can't continue. Details: '. Error code: EvaluationOutOfMemoryError. (Request ID: aea5a699-20ae-447e-9bfe-bdba15ce991b).
There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: We're sorry, an error occurred during evaluation. Details: '. Error code: Mashup Exception Error. (Request ID: 589068d9-72d5-4755-a86e-d4dd254bc3a7).
There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Evaluation resulted in a stack overflow and cannot continue. Details: '. Error code: Mashup Exception Error. (Request ID: f6f49f03-f5c8-4d2e-8ea4-5afb186377db).
There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Evaluation resulted in a stack overflow and cannot continue. Details: '. Error code: Mashup Exception Error. (Request ID: a4dc2f6a-d64b-401d-bc30-09af26f3c5c9).
There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: We're sorry, an error occurred during evaluation. Details: '. Error code: Mashup Exception Error. (Request ID: 07bfe4b0-dd05-4c38-aa79-ca16fad251d9).
There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Evaluation resulted in a stack overflow and cannot continue. Details: '. Error code: Mashup Exception Error. (Request ID: 428f9a33-e9e1-4452-b91a-dca4d83527b8).
Query:
let
    Source = Lakehouse.Contents(null){[workspaceId = "<removed>"]}[Data]{[lakehouseId = "<removed>"]}[Data],
    GetVisitorTable = Source{[Id = "<removed>", ItemKind = "Table"]}[Data],
    FilterDays = Table.SelectRows(GetVisitorTable, each Date.IsInPreviousNDays([VisitDate], 30)),
    CreateDateNoTime = Table.AddColumn(
        FilterDays,
        "VisitDateNoTime",
        each DateTime.Date([VisitDate]),
        Date.Type
    ),
    GroupByALP = Table.Group(
        CreateDateNoTime,
        {"VisitorID", "VisitDateNoTime", "AccountID", "LocationID", "ProgramID"},
        {{"VisitCount", each List.Sum([VisitCount]), type nullable number}}
    )
in
    GroupByALP
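For reference, the query above is just a filter-then-group-by-sum. A minimal Python sketch of the same logic (plain dicts standing in for table rows; field names taken from the query, the date semantics are an approximation of Date.IsInPreviousNDays):

```python
from collections import defaultdict
from datetime import date, datetime, timedelta

def aggregate_visits(rows, today, days=30):
    """Filter rows to the previous `days` days (excluding today), then sum
    VisitCount per (VisitorID, visit date, AccountID, LocationID, ProgramID),
    mirroring the FilterDays -> CreateDateNoTime -> GroupByALP steps."""
    cutoff = today - timedelta(days=days)
    totals = defaultdict(int)
    for r in rows:
        d = r["VisitDate"].date()   # VisitDateNoTime: drop the time part
        if cutoff <= d < today:     # rough Date.IsInPreviousNDays equivalent
            key = (r["VisitorID"], d, r["AccountID"], r["LocationID"], r["ProgramID"])
            totals[key] += r["VisitCount"]
    return dict(totals)

sample = [
    {"VisitorID": 1, "VisitDate": datetime(2025, 6, 1, 9, 30),
     "AccountID": "A", "LocationID": "L1", "ProgramID": "P", "VisitCount": 2},
    {"VisitorID": 1, "VisitDate": datetime(2025, 6, 1, 14, 0),
     "AccountID": "A", "LocationID": "L1", "ProgramID": "P", "VisitCount": 3},
    {"VisitorID": 2, "VisitDate": datetime(2025, 4, 1, 8, 0),
     "AccountID": "A", "LocationID": "L1", "ProgramID": "P", "VisitCount": 1},
]
print(aggregate_visits(sample, date(2025, 6, 10)))
# -> {(1, datetime.date(2025, 6, 1), 'A', 'L1', 'P'): 5}
```

Because the shape is a simple aggregation, it can also be pushed down to the Lakehouse SQL endpoint or a notebook instead of the Power Query engine, which is usually the better fit at hundreds of millions of rows.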
Hi @EricDupuis,
Thank you for reaching out on the Microsoft Community Forum.
Please refer to the documents below to address your issue:
https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-create
https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-azure-data-lake-stora...
Please consider the following approaches to process larger data sets in Fabric beyond traditional ETL:
1. Use DirectQuery to run live queries on large data without loading it into memory.
2. Use incremental refresh to process only new or updated data, reducing the workload.
3. Aggregate the data before loading it into Fabric, which reduces the amount of data being processed.
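The incremental idea in point 2 can be sketched in plain Python. In a real dataflow this is configured declaratively, so the code below is only an illustration (all names hypothetical) of the core mechanic: replace just the changed partition of a running aggregate instead of reprocessing all history.

```python
from datetime import date

def incremental_merge(existing, new_partition, day):
    """Drop the old aggregate rows for `day` and splice in the freshly
    computed ones -- only the new or changed partition is reprocessed,
    never the full history."""
    merged = {k: v for k, v in existing.items() if k[0] != day}
    merged.update(new_partition)
    return merged

# Running daily aggregate keyed by (date, AccountID)
existing = {(date(2025, 6, 1), "A"): 10, (date(2025, 6, 2), "A"): 7}
# Late-arriving data: June 2 was recomputed from source
recomputed = {(date(2025, 6, 2), "A"): 9}
print(incremental_merge(existing, recomputed, date(2025, 6, 2)))
# -> {(datetime.date(2025, 6, 1), 'A'): 10, (datetime.date(2025, 6, 2), 'A'): 9}
```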
Please continue using the Microsoft Community Forum.
If you found this post helpful, please consider marking it as "Accept as Solution" and selecting "Yes" to help other members find it more easily.
Regards,
Pavan.
Hi @EricDupuis,
I wanted to follow up since we haven't heard back from you regarding our last response. We hope your issue has been resolved.
If the community member's answer resolved your query, please mark it as "Accept as Solution" and select "Yes" if it was helpful.
If you need any further assistance, feel free to reach out.
Please continue using the Microsoft Community Forum.
Thank you,
Pavan.
Hi @EricDupuis,
Thank you for reaching out on the Microsoft Community Forum.
I trust @nilendraFabric's response is accurate and will address your issue.
If you have any further questions or updates regarding your issue, feel free to ask, and we will look into it.
If the Super User's answer meets your requirements, please consider marking it as "Accept as Solution" and adding kudos. This recognition helps other members find solutions to related queries.
Regards,
Pavan.
Thank you for your moderation @v-pbandela-msft. The responses below explain the problems but don't offer a solution. Is there documentation that explains the data limitations of Fabric or Dataflow Gen2 in terms of row count, data size, or other limits? What are the alternatives for processing larger data sets in Fabric other than traditional ETL?
Thank you for your quick response. We have Fast Copy set to enabled (but not required), and we are performing only small transformations, so Fast Copy won't take effect.
I'm new to the Fabric world and continue to be baffled about its limitations. Was this whole Fabric ecosystem another marketing spin on a product that doesn't actually support anything beyond traditional ETL?
Hi @EricDupuis,
I am one of the very early adopters of Fabric, and I can confidently say it's way beyond traditional ETL.
Everything is SaaS-ified in Fabric, which makes life simple for companies that want a platform ready for development without any setup hassle.
It's a comprehensive platform designed to simplify and unify data analytics.
As a relatively new product, Fabric is still evolving. Some features may not be as polished or fully featured as older, more established tools like ADF, Synapse, Power BI, etc.
PS: this is my personal opinion :)
Hello @EricDupuis,
Dataflow Gen2 crashes with memory or stack-overflow errors when processing large datasets are often caused by limitations in the underlying Power Query engine.
Enable Fast Copy for supported connectors to bypass the Power Query engine and use the more scalable pipeline Copy activity backend. This can yield a significant performance improvement.
Interesting read:
Fast Copy is currently only supported for the following Dataflow Gen2 connectors: ADLS Gen2, Blob Storage, Azure SQL DB, Lakehouse, and PostgreSQL.
The Copy activity only supports a few transformations when connecting to a file source: combine files, select columns, change data types, rename a column, and remove a column.
Please see if this helps.
Thanks