Florcita
Regular Visitor

Refreshing Failure

Hi everyone, I need some help with this error message when refreshing a semantic model:

 

Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 22427 MB, memory limit 22327 MB, database size before command execution 3272 MB. See https://go.microsoft.com/fwlink/?linkid=2159753 to learn more. Table: TradeOrder.

What can I do to solve this problem?

Thanks

1 ACCEPTED SOLUTION
v-kpoloju-msft
Community Support

Hi @Florcita,
Thank you for reaching out to the Microsoft Fabric community forum, and thank you @pallavi_r and @Deku for your inputs on this issue.

 

After reviewing the details you provided, I have identified a few workarounds that may help resolve the issue. Please follow these steps:

 

If you are using Power BI Pro, consider upgrading to Power BI Premium (P1 or higher). For Premium users, monitor capacity usage in the Power BI Admin Portal and scale up if necessary. To check capacity: Go to Power BI Service → Admin Portal → Capacity Settings → Review memory consumption.
What is Power BI Premium? - Power BI | Microsoft Learn

Enable Large Dataset Storage Format if it is not already enabled, as the dataset might be hitting memory limits. To enable this option in Power BI Service, go to Dataset Settings → Enable Large Dataset Storage Format.

Large semantic models in Power BI Premium - Power BI | Microsoft Learn

  • Use aggregations: pre-aggregate data in the TradeOrder table to reduce granularity.
  • Avoid calculated columns: create calculations in Power Query or use DAX measures instead.
  • Remove unused tables and relationships to free up memory.
  • Reduce cardinality: for high-cardinality columns (e.g., timestamps), consider rounding them to date level instead of datetime.
User-defined aggregations - Power BI | Microsoft Learn
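As a sketch of the cardinality and column reductions above, in Power Query (M). The server, database, and column names here are hypothetical placeholders; only the TradeOrder table name comes from the error message:

```
// Hypothetical Power Query (M) steps for the TradeOrder table:
// round a datetime column down to date level and drop unused columns.
let
    Source = Sql.Database("myserver", "mydb"),   // placeholder connection
    TradeOrder = Source{[Schema = "dbo", Item = "TradeOrder"]}[Data],
    // Reduce cardinality: keep only the date part of the timestamp
    DateOnly = Table.TransformColumns(
        TradeOrder,
        {{"TradeTimestamp", DateTime.Date, type date}}
    ),
    // Drop columns the report never uses
    Trimmed = Table.RemoveColumns(DateOnly, {"RawPayload", "AuditComment"})
in
    Trimmed
```

Dropping the time component of a timestamp typically shrinks a column from millions of distinct values to a few thousand, which compresses far better in the VertiPaq engine.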

For Premium capacity users, Hybrid Tables allow a mix of Import and DirectQuery, reducing memory usage. Ensure filters and transformations are pushed to the source using query folding in Power Query. Use native SQL queries in Power Query when dealing with large datasets from SQL Server.
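For the native-SQL suggestion, a minimal sketch using Value.NativeQuery (connection details and query are placeholders); the EnableFolding option asks Power Query to keep folding subsequent steps into the source query:

```
let
    Source = Sql.Database("myserver", "mydb"),   // placeholder connection
    // Push the heavy aggregation to SQL Server instead of importing raw rows
    Result = Value.NativeQuery(
        Source,
        "SELECT TradeDate, Symbol, SUM(Quantity) AS TotalQty
         FROM dbo.TradeOrder
         GROUP BY TradeDate, Symbol",
        null,
        [EnableFolding = true]
    )
in
    Result
```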


I hope this resolves your issue; if you need any further assistance, feel free to reach out.
If this post helps, please give us 'Kudos' and consider accepting it as a solution to help other members find it more quickly.

 

Thank you for using Microsoft Community Forum.

View solution in original post

5 REPLIES 5
v-kpoloju-msft
Community Support

Hi @Florcita,

 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.

 

Thank you.

Hi @Florcita,


I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.


Thank you.

pallavi_r
Super User

Hi @Florcita ,

 

There are multiple factors involved, as listed below. We need to find out whether this is your real data size or whether there is an issue in the model that needs to be fixed. The points below help ensure the model is optimized.

1. Size of the dataset: an F64 (equivalent to P1) capacity can handle a maximum of 25 GB of memory. I used to get this error once the dataset size crossed 13+ GB, because memory usage roughly doubles during refresh.
2. Ensure the dataset uses the large dataset storage format.
3. Incremental refresh: make sure you have configured incremental refresh for large datasets, otherwise this error is inevitable.
4. Calculated columns: calculated columns make the model size bigger, so prefer measures for aggregated values.
5. Reduce the granularity of data wherever possible.
6. Check for unnecessary columns and tables to reduce the memory footprint of the dataset.
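To illustrate point 4, a hypothetical DAX example (the TradeOrder column names are placeholders): a calculated column is materialized for every row and adds to model size, while an equivalent measure is evaluated at query time and consumes no storage.

```
-- Instead of a calculated column such as:
--   LineValue = TradeOrder[Quantity] * TradeOrder[Price]
-- define a measure that computes the same total on demand:
Total Trade Value :=
SUMX ( TradeOrder, TradeOrder[Quantity] * TradeOrder[Price] )
```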

 

Can you please share more details, such as which SKU you are using and whether you have tried incremental refresh?

 

Thanks,

Pallavi

Deku
Super User

  • Incremental refresh 
  • Filter data
  • Remove unused columns
  • Optimize model size and compression
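The incremental refresh bullet follows a standard pattern: filter the source query on the reserved RangeStart and RangeEnd datetime parameters so Power BI only refreshes the recent partitions. A sketch in Power Query (M), with placeholder connection and column names:

```
// Power BI substitutes RangeStart/RangeEnd at refresh time so that
// only the changed window of data is reloaded, not the whole table.
let
    Source = Sql.Database("myserver", "mydb"),   // placeholder connection
    TradeOrder = Source{[Schema = "dbo", Item = "TradeOrder"]}[Data],
    Filtered = Table.SelectRows(
        TradeOrder,
        each [TradeTimestamp] >= RangeStart and [TradeTimestamp] < RangeEnd
    )
in
    Filtered
```

Note the half-open interval (>= RangeStart, < RangeEnd), which prevents rows on a partition boundary from being loaded twice.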

Hard to say without more context. How big is the semantic model and is this premium or pro?

 


Did I answer your question?
Please help by clicking the thumbs up button and mark my post as a solution!
