JasW
Frequent Visitor

Issue with PBI data model refresh

Hi 

 

At the moment, we are on an A4 Power BI Embedded capacity.

Our data model size is around 10.4 GB, leaving around 14.6 GB of the capacity for the model refresh.

However, the data model refresh still fails, and the error messages indicate insufficient memory.

 

We know there is a rule of thumb that a data model's memory footprint roughly doubles during a refresh (i.e. a 12.5 GB model would need a further 12.5 GB reserved for refreshing). However, that rule does not seem to apply to our scenario, and it seems to me it is impossible to understand and measure the temporary memory actually required for the model refresh.
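To make the doubling rule of thumb concrete, here is a minimal sketch. The 25 GB capacity limit and the 2× multiplier are assumptions taken from the rule described above, not guaranteed service behavior:

```python
# Rough headroom check for a Power BI Embedded capacity refresh.
# Assumptions: the A-SKU memory limit is a hard cap (25 GB assumed for A4),
# and a full refresh temporarily needs roughly 2x the resident model size.

A4_MEMORY_GB = 25.0       # assumed A4 capacity memory limit
MODEL_SIZE_GB = 10.4      # resident model size from the post
REFRESH_MULTIPLIER = 2.0  # rule-of-thumb peak usage during a full refresh

def refresh_fits(model_gb: float, capacity_gb: float, multiplier: float = 2.0) -> bool:
    """Return True if the rule-of-thumb peak refresh memory fits in the capacity."""
    return model_gb * multiplier <= capacity_gb

peak = MODEL_SIZE_GB * REFRESH_MULTIPLIER
print(f"Estimated peak during refresh: {peak:.1f} GB")
print("Fits in A4?", refresh_fits(MODEL_SIZE_GB, A4_MEMORY_GB))
```

By this rule a full refresh of a 10.4 GB model should peak near 20.8 GB and fit within 25 GB, which is exactly why the observed out-of-memory failure is confusing: the true peak (dictionary rebuilds, uncompressed column buffers, concurrent queries) can exceed the simple 2× estimate.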

 

Could anyone provide more insight and help for this situation? 

 

Many thanks

2 REPLIES
Anonymous
Not applicable

Hi, @JasW 

 

Here are a few suggestions that might help:

  1. Optimize your data model: make sure your data model is optimized by removing unnecessary columns, creating proper relationships between tables, and preferring measures over calculated columns where possible (calculated columns are materialized in the model and add to its size). You can also try incremental refresh to load only the new or updated data.
  2. Use a dedicated data source: if possible, use a dedicated data source for Power BI, such as a SQL Server database, a data warehouse, or a cloud-based source like Azure SQL Database or Amazon Redshift.
  3. Increase the memory allocation: if none of the above helps, you may want to consider increasing the memory available for the data model refresh by upgrading to a higher Power BI Embedded SKU.
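To see exactly why each refresh attempt failed (including out-of-memory errors), you can query the dataset's refresh history through the Power BI REST API. A minimal sketch, assuming you already have a valid Azure AD access token; the workspace and dataset IDs below are hypothetical placeholders:

```python
# Hypothetical placeholders -- substitute your own values.
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
TOKEN = "<azure-ad-access-token>"

def refresh_history_url(group_id: str, dataset_id: str, top: int = 5) -> str:
    """Build the Power BI REST API URL for a dataset's refresh history."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes?$top={top}")

def last_refreshes(group_id: str, dataset_id: str, token: str) -> list:
    """Return recent refresh attempts as (status, error-payload) pairs."""
    import requests  # imported lazily so the URL helper has no dependencies
    resp = requests.get(refresh_history_url(group_id, dataset_id),
                        headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return [(r["status"], r.get("serviceExceptionJson"))
            for r in resp.json()["value"]]

if __name__ == "__main__":
    for status, error in last_refreshes(GROUP_ID, DATASET_ID, TOKEN):
        print(status, error)
```

A failed entry's `serviceExceptionJson` usually carries the underlying error code, which can confirm whether the failures are genuinely memory-related before you commit to a larger SKU.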

 

Best Regards,

Community Support Team _Charlotte

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Hi Charlotte,

 

Thanks for your response.

 

May I know what the implications of using incremental refresh are? We did try incremental refresh, but it was not successful, because incremental refresh requires the source to tell it which data is new or updated.

 

We are using Azure Data Factory, which is already part of a dedicated data source as you have suggested.

 

I think increasing the memory allocation is not an option, as it would mean upgrading to the next Power BI Embedded SKU and the cost would double.

 

The issue is that the memory expansion during the model refresh is not transparent, so it is unclear to us whether the A4 SKU is enough to complete the refresh. It would be really costly to keep paying for more memory just to complete the refresh when we still have at least half of the memory available (our data model is around 10.4 GB, leaving around 14.6 GB for the refresh on the A4 SKU).
