Hi,
We migrated from a P1 capacity to Fabric on an F32. I am aware of the memory thresholds under Fabric, and I have enabled the large dataset storage format. I find the memory error surprising because the dataset seems small (1102 MB) in comparison to the memory limit.
Error:
Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 9460 MB, memory limit 9137 MB, database size before command execution 1102 MB.
I checked the Fabric capacity metrics app: the operation consumed around 70,000 capacity units and did not push the capacity above ~30%.
Can anyone help us understand what is going on?
Thanks in advance,
HM
Solved! Go to Solution.
Your dataset (about 1.1 GB compressed) expands to a much larger size in memory during a refresh, and on F32 the per-dataset memory cap is roughly 9 GB (the 9137 MB limit shown in the error). The refresh spiked above that cap, so it was cancelled.
Fix: optimize the model (fewer columns, lower cardinality, better data types, use dataflows/partitioning) or scale up to F64 or higher.
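If reworking the model is not enough on its own, refreshing the model in smaller pieces keeps the peak memory lower. A rough sketch (not specific to your tenant: it assumes the enhanced refresh REST API is reachable, you already have an Azure AD access token, and the workspace, dataset and table names below are placeholders):

import requests

ACCESS_TOKEN = "<aad-access-token>"   # assumption: acquired elsewhere, e.g. via MSAL
GROUP_ID = "<workspace-id>"           # placeholder
DATASET_ID = "<dataset-id>"           # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)

# Enhanced refresh body: process one large table per call, sequentially,
# so the memory spike stays smaller than a single full-model refresh.
body = {
    "type": "Full",
    "commitMode": "transactional",
    "maxParallelism": 1,               # fewer parallel partitions -> lower peak memory
    "objects": [
        {"table": "FactSales"},        # hypothetical table name
    ],
}

resp = requests.post(
    url,
    json=body,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print("Refresh accepted:", resp.status_code, resp.headers.get("Location"))

Refreshing the largest tables in separate, sequential calls (or lowering maxParallelism) trades a longer refresh for a smaller memory spike, which is what matters on a smaller SKU like F32.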
Hi @HMVGGM,
Just following up to see if the responses provided by community members were helpful in addressing the issue.
If one of the responses helped resolve your query, please consider marking it as the Accepted Solution. Feel free to reach out if you need any further clarification or assistance.
Best regards,
Prasanna Kumar
Hi @HMVGGM,
Just following up to see if the response provided was helpful in resolving your issue. Please feel free to let us know if you need any further assistance.
Best regards,
Prasanna Kumar
Hi @HMVGGM,
Thank you for reaching out to the Microsoft Fabric Forum Community, and special thanks to @pallavi_r and @Shahid12523 for their prompt and helpful responses.
Just following up to see if the responses provided by community members were helpful in addressing the issue.
If one of the responses helped resolve your query, please consider marking it as the Accepted Solution. Feel free to reach out if you need any further clarification or assistance.
Best regards,
Prasanna Kumar
Hi @HMVGGM ,
Are you using any DAX calculated columns? In addition to @Shahid12523's suggestion, please change calculated columns to measures wherever possible. During a refresh the calculated columns are recomputed as well, and that takes a lot of memory. A quick way to check for them is sketched below.
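A rough sketch (assuming you can run a Fabric notebook with semantic-link/sempy installed; the dataset name is a placeholder and the exact columns returned may differ by version) to see which columns are calculated:

import sempy.fabric as fabric

# List the model's columns as a dataframe.
cols = fabric.list_columns(dataset="YourSemanticModel")  # placeholder dataset name
print(cols.head())

# Assumption: the frame exposes a "Type" field that distinguishes calculated
# columns from regular data columns -- verify against the semantic-link docs.
if "Type" in cols.columns:
    print(cols[cols["Type"] == "Calculated"])

Anything flagged as calculated is recomputed and stored during refresh, so converting it to a measure removes that work (and memory) from the refresh.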
If this post helps, please accept this as a solution. Appreciate your kudos.
Thanks,
Pallavi
Hi Jaineshp,
Thanks for the suggestions; I will try refreshing during off-peak hours.
I have also read previously that memory roughly doubles during a refresh, but with a dataset that is 1102 MB before command execution I would expect around 2204 MB. Am I missing something?
Hello @HMVGGM, the response you are referring to has been removed. I hope you are able to get a solution to your questions.
Best,
Natalie H.
Community Manager