PBI Model refresh failed with memory issue
Hello -
I encountered the error below during one refresh, and all the visuals appeared broken with the same error:
"Memory error: You have reached the maximum allowable memory allocation for your tier. Consider upgrading to a tier with more available memory"
How can I troubleshoot or resolve this error? What could have caused this memory issue during that particular run, given that we had successful refreshes afterwards? Appreciate any insights.
Solved! Go to Solution.
Broken visuals can be caused by inefficient measures or by other semantic models consuming all the available memory in your capacity. Use the metrics app to check.
Hi @lbendlin @SaiTejaTalasila - Would a model refresh failure cause the visuals on the report to break as well? My assumption was that a failed refresh would not surface on the report - that unless processing completes, reports continue to display the previously refreshed data. We are performing the full refresh type on the PBI models in Premium capacity.
It's much more nuanced than that. Don't forget that after the semantic model refresh (the mashup) you still have to update the calculated columns and calculated tables when the model is loaded into memory.
And on top of that, inefficient measures can finish the job with a nice big Cartesian join or two.
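The data-load and recalculation phases described above map onto the refresh types of the enhanced refresh REST API. Below is a minimal Python sketch of building a refresh request body, assuming the documented enhanced-refresh request schema: "dataOnly" loads data without recalculating, "calculate" recomputes calculated columns and tables, and "full" does both. The endpoint URL and IDs shown in the comment are placeholders, not real values.

```python
import json

# Enhanced-refresh request bodies for the phases discussed above.
# "dataOnly" runs just the data load (the mashup), "calculate"
# recomputes calculated columns and tables, "full" does both.
VALID_TYPES = {"full", "dataOnly", "calculate",
               "clearValues", "automatic", "defragment"}

def refresh_body(refresh_type="full", max_parallelism=2):
    if refresh_type not in VALID_TYPES:
        raise ValueError(f"unknown refresh type: {refresh_type}")
    return json.dumps({
        "type": refresh_type,
        "commitMode": "transactional",   # roll back the whole refresh on failure
        "maxParallelism": max_parallelism,
    })

# POST this body, with an AAD bearer token, to
#   https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes
# ({groupId} and {datasetId} are placeholders for your workspace and dataset IDs).
print(refresh_body("calculate"))
```

Running "dataOnly" and "calculate" separately can help isolate which phase runs out of memory.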
Hi @lbendlin - The PBI model is in Premium capacity. A full refresh is performed daily, and it failed once; the recalc operation then failed as well. All the visuals in the report are broken with the error message 'visuals have exceeded the available resources'. I am trying to understand whether the visuals are broken because inefficient DAX measures are exceeding memory, or because of the recalc operation failure. Any thoughts?
Is a recalc of the model required even when the refresh is performed as the "full" refresh type?
What Power BI or Fabric SKU are you using?
It is a PBI model in Power BI Premium.
What SKU? P1?
I do not know whether it is P1 or another SKU. I am looking for a general process to troubleshoot and analyse the memory issues and consumption, and to come up with resolution ideas for memory-related refresh errors on Premium capacity PBI models.
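One hedged starting point for that troubleshooting process is to pull the dataset's refresh history from the Power BI REST API and summarize the failures. The sketch below assumes the documented shape of the GET .../datasets/{datasetId}/refreshes response; the sample entry and its error code are illustrative only, not real service output.

```python
import json

def summarize_refresh(entry):
    """Summarize one entry from the Power BI refresh-history payload.

    `entry` is a dict shaped like the items returned by
    GET https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}/refreshes
    (shape assumed from the documented response; verify against your tenant).
    """
    summary = {
        "type": entry.get("refreshType"),
        "status": entry.get("status"),
        "error": None,
    }
    # Failed refreshes carry the error details as a JSON string.
    raw = entry.get("serviceExceptionJson")
    if raw:
        try:
            summary["error"] = json.loads(raw).get("errorCode")
        except ValueError:
            summary["error"] = raw  # keep the raw text if it is not JSON
    return summary

# Illustrative entry (not real service output):
sample = {
    "refreshType": "ViaEnhancedApi",
    "status": "Failed",
    "serviceExceptionJson": '{"errorCode": "ConsumeMemoryQuotaExceeded"}',
}
print(summarize_refresh(sample))
```

Correlating the failed entries' timestamps with the Capacity Metrics app then shows whether your model or a neighbouring one was consuming the memory at the time.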
Thank you for sharing. I was able to find out that the SKU is F64. Would it be possible to face memory allocation failure errors in this case, with visuals failing to appear because of memory errors?
Yes, of course. An F64 can accommodate models up to about 13 GB in size. Make sure you have the large semantic model storage format enabled so you can benefit from paging and push that limit a bit.
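The large storage format can be enabled in the dataset settings, or programmatically. A minimal sketch, assuming the documented Datasets - Update Dataset REST endpoint and its `targetStorageMode` property; the dataset ID in the comment is a placeholder.

```python
import json

# Request body to switch a dataset to the large storage format
# ("PremiumFiles"), per the Datasets - Update Dataset REST API.
def large_format_body():
    return json.dumps({"targetStorageMode": "PremiumFiles"})

# PATCH this body, with an AAD bearer token, to
#   https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}
# ({datasetId} is a placeholder for your dataset's ID).
print(large_format_body())
```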
Hi @msft-cap ,
Maybe you can reduce some data in the dataset and see whether the report visuals load. You can also check the interactive CPU utilisation in the Fabric Capacity Metrics app.
Thanks,
Sai Teja
