ACSpotlight
Regular Visitor

Exceed Memory on Refresh

Hi, I have a P1 Power BI Premium capacity, and the semantic model I am importing has a fact table coming from Snowflake that is 5 GB, with about 230 million rows and 16 columns (3 years of data). The six dimension tables in the model are much smaller; the largest is about 120 MB, with roughly 9 million rows and only 4 columns. I built the model in Power BI Desktop with parameters set to only 1 month of data, just for development and testing purposes. Once published to the service, I adjusted the parameters to 3 years and get the error below. The longest period of data that has successfully refreshed is only 13 months' worth. I have developed similar-sized models for another organisation, also on a P1, and haven't had these issues.

(Screenshot of the refresh error message.)

 

v-sdhruv
Community Support

Hi @ACSpotlight ,

Please consider raising a support ticket; this might help you resolve the issue.
Support Ticket

Hope it helps!

v-sdhruv
Community Support

Hi @ACSpotlight ,

Just wanted to check if you had the opportunity to review the suggestions provided?
If the response has addressed your query, please accept it as a solution so that other members can easily find it.

If you are still facing any issues, please consider raising a support ticket:
Support Ticket

Thank You!

I have implemented incremental refresh and I am still getting the error when trying to get the initial 3 years of data imported. I have been researching the use of Tabular Editor to get this initial load done.
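
For reference, one way to do that initial load without Tabular Editor is to refresh the incremental-refresh partitions one at a time through the Power BI enhanced refresh REST API, so only a slice of the fact table is processed in memory at once. Below is a minimal sketch, assuming an Azure AD access token with write access to the dataset is already in hand; the workspace/dataset IDs, table name and partition names are placeholders, not values from this thread.

import time
import requests  # assumption: the requests package is installed

# Placeholders - substitute your own values.
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
TOKEN = "<aad-access-token>"            # e.g. acquired with MSAL for a service principal
FACT_TABLE = "FactSales"                # placeholder table name
PARTITIONS = ["2022", "2023", "2024"]   # placeholder partition names created by the refresh policy

BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

for partition in PARTITIONS:
    # Enhanced refresh request: process a single partition, and do not re-apply
    # the incremental refresh policy while bootstrapping the historical partitions.
    body = {
        "type": "full",
        "commitMode": "transactional",
        "applyRefreshPolicy": False,
        "objects": [{"table": FACT_TABLE, "partition": partition}],
    }
    resp = requests.post(BASE, headers=HEADERS, json=body)
    resp.raise_for_status()
    poll_url = resp.headers["Location"]  # URL of the asynchronous refresh operation

    # Wait for this partition to finish before starting the next one, so only
    # one slice of the fact table is being processed at any time.
    while True:
        status = requests.get(poll_url, headers=HEADERS).json().get("status")
        if status not in ("Unknown", "InProgress"):
            print(partition, status)
            break
        time.sleep(60)

Refreshing and committing one partition before starting the next keeps the processing working set closer to the size of a single slice rather than the full 3 years.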

v-sdhruv
Community Support

Hi @ACSpotlight ,

I just wanted to check in and ask if the issue you were facing has been resolved. If not, we'd be happy to assist further; please let us know how we can help.

Looking forward to your update!

Thank You!

v-sdhruv
Community Support

Hi @ACSpotlight ,

Even if your dataset is trimmed to 2.8 GB, additional memory is consumed by:

  • DAX calculations
  • Relationships and model metadata
  • Query execution overhead
  • Temporary storage during refresh

As pointed out in this thread,

https://community.fabric.microsoft.com/t5/Service/Power-BI-premium-capacity-memory-limitation-during...

"You need twice as much memory as the size of the semantic model. The correct workaround is to implement incremental refresh and/or selective table and partition refresh."

This means you can try to:

1. Implement incremental refresh

2. Simplify DAX calculations

3. Consider Gen2 Autoscale or upgrading the capacity

You can try one of these methods to avoid hitting the error.
Hope this helps!

If the response has addressed your query, please accept it as a solution so that other members can easily find it.
Thank you.

pankajnamekar25
Super User

Hello @ACSpotlight 

You're hitting a memory error on refresh in the Power BI service (P1 capacity) after increasing your dataset from 1 month to 3 years. The model exceeds the 25 GB per-dataset memory limit during processing.

Suggestions

Optimize Power Query steps (remove unused columns early)

Push aggregations to Snowflake

Implement Incremental Refresh to limit data loaded

Monitor memory usage via the Premium metrics app (a quick REST API check of refresh failure details is sketched after this list)

Split large model into smaller parts or use dataflows
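
On the monitoring point, the dataset's refresh history can also be pulled from the REST API to confirm which refreshes failed and read the error payload the service recorded. A minimal sketch, assuming an access token is already available; the IDs below are placeholders.

import requests  # assumption: the requests package is installed

GROUP_ID = "<workspace-guid>"   # placeholder
DATASET_ID = "<dataset-guid>"   # placeholder
TOKEN = "<aad-access-token>"    # placeholder

url = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes?$top=5"
history = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"}).json()["value"]

for r in history:
    # serviceExceptionJson carries the error details (e.g. the memory error) for failed refreshes.
    print(r.get("startTime"), r.get("status"), r.get("serviceExceptionJson", ""))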


Thanks

 Pankaj Namekar | LinkedIn

If this solution helps, please accept it and give a kudos (Like); it would be greatly appreciated.

Hi Pankaj, thank you for your suggestions. I removed a couple of columns from the fact table that were only nice-to-haves, which reduced the table from 5 GB to 4.2 GB. I then filtered the date period down to 2 years, which reduced it to 2.8 GB. Finally, I created a copy of this and removed all my DAX calculations so only the source tables are in the model file. I then published that and refreshed it and am still getting the same error. I don't understand how I am hitting the 25 GB limit when I am only refreshing about 2.8 GB of data.

djurecicK2
Super User

Hi @ACSpotlight ,

Keep in mind that with import mode, the model's memory footprint roughly doubles during the refresh process. This is because a copy of the existing data is kept available for queries while the new copy of the data is being refreshed.
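
As a rough, back-of-envelope illustration of why the working set during a full refresh can sit well above the published model size (the overhead factor below is an assumption for illustration, not a measured value):

# Illustration only - the overhead multiplier is an assumption, not a measured value.
model_size_gb = 2.8        # compressed size of the published model (from this thread)
copies_during_refresh = 2  # old data kept for queries + new data being processed
processing_overhead = 1.5  # assumed allowance for dictionaries, buffers and temporary structures

estimated_peak_gb = model_size_gb * copies_during_refresh * processing_overhead
print(f"Estimated peak memory during a full refresh: ~{estimated_peak_gb:.1f} GB")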
