Dear Fabric Community,
I am reaching out to seek your expertise and advice regarding an issue I've encountered while refreshing data in Power BI service. Despite having a Fabric capacity in F64 and working with a dataset that is not significantly large, I am facing a memory limitation error.
The error message is as follows: "Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 25478 MB, memory limit 25478 MB, database size before command execution 121 MB."
I am using an Azure Database for PostgreSQL flexible server as my source and refreshing the dataset in the Power BI service. I have also enabled the large semantic model storage format to optimize performance.
I would greatly appreciate any insights or recommendations you may have to resolve this issue. Has anyone else encountered a similar challenge, and if so, how did you address it?
Thank you in advance for your time and assistance.
Solved! Go to Solution.
Hi, @Dan_86
Thanks for the replies from @SaiTejaTalasila and @pallavi_r; please allow me to add a few more suggestions:
Simplify your data model by removing unnecessary columns, tables, or relationships. This helps reduce the memory footprint of the dataset. You can refer to the following link:
Optimization guide for Power BI - Power BI | Microsoft Learn
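As a minimal Power Query (M) sketch, removing unneeded columns as early as possible keeps them out of the model entirely. The server, schema, table, and column names below are made up for illustration, so substitute your own:

    let
        // Hypothetical Azure Database for PostgreSQL source - replace the server and database names
        Source = PostgreSQL.Database("myserver.postgres.database.azure.com", "salesdb"),
        // Navigate to the source table
        Orders = Source{[Schema = "public", Item = "orders"]}[Data],
        // Keep only the columns the report actually needs; dropped columns never reach the model
        Trimmed = Table.SelectColumns(Orders, {"order_id", "order_date", "customer_id", "amount"})
    in
        Trimmed

Doing this right after the source step also gives the transformation the best chance of folding back to PostgreSQL.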
Consider reducing data granularity. When working with large datasets, you can do this by summarizing the data or by using aggregate tables. Here is the link:
Weird Problem: Why Does Power BI Use So Much Memory? (quicklylearnpowerbi.com)
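As an illustrative sketch (again with made-up table and column names), Table.Group can summarize row-level data to a coarser grain before it is loaded, so the detail rows never land in the model:

    let
        Source = PostgreSQL.Database("myserver.postgres.database.azure.com", "salesdb"),
        Orders = Source{[Schema = "public", Item = "orders"]}[Data],
        // Summarize to one row per day and customer instead of importing every order line
        Daily = Table.Group(
            Orders,
            {"order_date", "customer_id"},
            {
                {"total_amount", each List.Sum([amount]), type nullable number},
                {"order_count", each Table.RowCount(_), Int64.Type}
            }
        )
    in
        Daily

If the grouping does not fold to the source, an alternative is to create a pre-aggregated view in PostgreSQL and import that instead.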
If you're using Power BI Premium, you can increase the memory of the Premium capacity that hosts your dataset. You can do this by selecting the capacity you want to change in the Power BI admin portal, and then selecting the Change size option.
Manage your Fabric capacity - Microsoft Fabric | Microsoft Learn
Power BI provides a feature called semantic model scale-out, which helps provide fast performance when reports and dashboards are used by a large audience. It uses Premium capacity to host one or more read replicas of the main semantic model.
Power BI semantic model scale-out - Power BI | Microsoft Learn
Power BI's incremental refresh feature is a useful tool for managing large datasets. It allows you to refresh only the data that has changed and not the entire data set.
Large semantic models in Power BI Premium - Power BI | Microsoft Learn
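Incremental refresh requires two datetime parameters named RangeStart and RangeEnd plus a filter on them in the query, roughly like the sketch below (the column name is hypothetical); the refresh policy itself is then configured on the table in Power BI Desktop:

    let
        Source = PostgreSQL.Database("myserver.postgres.database.azure.com", "salesdb"),
        Orders = Source{[Schema = "public", Item = "orders"]}[Data],
        // RangeStart and RangeEnd are the datetime parameters Power BI requires for incremental refresh;
        // only rows inside the current refresh window are loaded, which keeps peak refresh memory down.
        // Assumes order_timestamp is a datetime column - convert it first if it is stored as a date.
        Filtered = Table.SelectRows(Orders, each [order_timestamp] >= RangeStart and [order_timestamp] < RangeEnd)
    in
        Filtered

Ideally this filter folds to PostgreSQL so each partition only pulls its own slice of rows from the source.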
How to Get Your Question Answered Quickly
Best Regards
Yongkang Hua
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @Dan_86 ,
Copy the Power Query of some tables into a dataflow, complete as many of your transformations there as possible, and then load that data into the semantic model. This will help reduce some of the load on the semantic model. If the dataset refreshes successfully at another time of day, you can also change the scheduled refresh time accordingly.
Thanks,
Sai Teja
Hi @Dan_86
Here are a couple of points to take care of:
1. Size of the dataset - An F64 capacity (equivalent to P1) can handle a maximum of 25 GB of memory per semantic model, which is the 25478 MB limit shown in your error message. I used to get this error once the dataset size crossed roughly 13 GB, because memory usage roughly doubles during a full refresh.
2. Incremental refresh - I hope you have set up incremental refresh for the large dataset; otherwise this error is almost inevitable.
3. Calculated columns - Calculated columns make the model size bigger, so prefer measures for aggregated values.
Could you please let us know about these points so we can identify the root cause of this error?
Thanks,
Pallavi