Anonymous
Not applicable

Memory error: You have reached the maximum allowable memory allocation for your tier.

Hello, one of my reports is giving me the error below:

Memory error: You have reached the maximum allowable memory allocation for your tier. Consider upgrading to a tier with more available memory.

(screenshot: memory allocation error.png)

Earlier, the same report refreshed without any errors, but for the last few days I have been getting this error during on-demand refresh. I have tried a few suggestions given here in the community, but they did not work for me and I am still facing the error.
Please help me out with this.
Thanks.

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @Anonymous

You have used resources that exceed the limit of your shared capacity. To solve this, you either need to upgrade to a Premium capacity, or optimize the underlying data source so that queries return less data.

You can download the pbix file and refresh it locally to see how much memory it consumes during refresh. You can monitor this in Task Manager by looking for "msmdsrv.exe", which is the Analysis Services engine.

(screenshot: Ailsamsft_0-1672367222311.png)

If you exceed the allowable capacity, consider upgrading to a Premium capacity and purchasing the appropriate SKU.

What is Power BI Premium Gen2? - Power BI | Microsoft Learn

 

Best Regards,
Community Support Team _ Ailsa Tao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 

View solution in original post

6 REPLIES

Jeanxyz
Impactful Individual
Impactful Individual

I encountered the same error message a week ago, but I couldn't reproduce it. I even created a test data model of double the size, but got no error when I published it to the Power BI service. So I guess this is a temporary issue in the Power BI service. Could someone shed more light on this type of error?

Here is a similar thread on the issue: https://learn.microsoft.com/en-us/archive/msdn-technet-forums/94758523-b9fa-4567-b3d9-1759abb54a84

 

Jeanxyz
Impactful Individual
Impactful Individual

I encountered the same error message this morning. I would like to know which factors decide RAM consumption. Is it the size of the tables in the data model, or specific transformation queries in Power Query? I assume DAX measures don't impact RAM consumption, because I couldn't open any page in Power BI.

The data model itself is 77 MB, but Task Manager shows the report taking over 2 GB in Power BI Desktop. Is 77 MB already a large data model?

@Anonymous 

pjg
Frequent Visitor

Hi @Jeanxyz,
I am no expert and quite new to Power BI, but I have used a lot of other tools over the years.
The dataset is the main thing to look at: reducing the size of your file also reduces the RAM, and even the CPU, used as a result.
You can reduce the size of your report's dataset by understanding which fields use the most memory and which you can remove. You can see a lot of this information in DAX Studio, but I don't have a link to the guide I used.


A field with a few distinct values repeated over millions of rows will most likely be dictionary-encoded and should compress very well.
Our worst offender was a description field that could be any size and was totally free text.
Removing it and bringing in only the Short Description field, fixed at 80 characters, on all tables took hundreds of MB off the report file size and resolved our issue.
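As a rough sketch of that fix in Power Query (M) — the table and column names here are made up for illustration, and your source step will come from your own connector:

```m
let
    // Stand-in source table; in practice this comes from your data source.
    Source = Table.FromRecords({
        [ID = 1, Description = "long free text", ShortDescription = "short text"]
    }),
    // Drop the unbounded free-text column; its dictionary compresses poorly.
    RemovedLongText = Table.RemoveColumns(Source, {"Description"}),
    // Defensively cap the remaining text column at 80 characters.
    Capped = Table.TransformColumns(
        RemovedLongText,
        {{"ShortDescription", each Text.Start(_, 80), type text}}
    )
in
    Capped
```

Because the cap happens at refresh time, the oversized text never reaches the model at all.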

I did not design the report, and these are not choices I would have made myself, but when looking into the design of the report model and the data fields being pulled in, I removed further fields and even whole tables that had been duplicating data.

Another thing causing problems in the report was DAX calculated columns. From my reading, columns created in DAX tend to compress less efficiently than columns created in the Power Query transform stage. Some things can't easily be moved to Power Query, but most of the calculated columns I looked at in our report could have been done in Power Query, reducing the file size further.
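For example, a line-amount column that might otherwise be a DAX calculated column can be computed as a Power Query custom column instead (a minimal sketch — the table and column names are assumptions for illustration):

```m
let
    Source = Table.FromRecords({
        [Quantity = 2, UnitPrice = 9.5],
        [Quantity = 5, UnitPrice = 3.0]
    }),
    // Computed at refresh time, so the result is stored and compressed
    // like any other imported column.
    WithLineAmount = Table.AddColumn(
        Source, "LineAmount", each [Quantity] * [UnitPrice], type number
    )
in
    WithLineAmount
```

The equivalent DAX calculated column would be something like `LineAmount = Sales[Quantity] * Sales[UnitPrice]`, created after the data is already loaded.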

I'm not sure about DAX measures, but I would think they cost CPU more than memory. I tend to keep my DAX measures simple and do as much pre-work as possible in custom columns created in Power Query, so my DAX measures mainly sum, count, or filter the result of a precomputed column.

There is a lot of reading to do on the subject, and DAX Studio can help you see what is using the most space in your dataset.
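For high-cardinality ID columns that you can't simply remove, one trick I've read about in the VertiPaq optimization material is to split the ID into two lower-cardinality columns, so each dictionary stays small. A rough M sketch, assuming numeric IDs (the names and the divisor are illustrative; note you lose the ability to use the original column directly, e.g. for relationships):

```m
let
    Source = Table.FromRecords({
        [InvoiceID = 123456789],
        [InvoiceID = 123456790]
    }),
    // Split one high-cardinality column into two with far fewer
    // distinct values each, which typically compresses better.
    WithHigh = Table.AddColumn(
        Source, "InvoiceID_High",
        each Number.IntegerDivide([InvoiceID], 100000), Int64.Type
    ),
    WithLow = Table.AddColumn(
        WithHigh, "InvoiceID_Low",
        each Number.Mod([InvoiceID], 100000), Int64.Type
    ),
    Result = Table.RemoveColumns(WithLow, {"InvoiceID"})
in
    Result
```

Whether this actually saves memory depends on the data, so it's worth checking the column sizes in DAX Studio before and after.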

Hope this helps.

Jeanxyz
Impactful Individual
Impactful Individual

Thanks @pjg, I went through your comments. It seems there isn't much I can do to reduce the dataset size. According to DAX Studio, the fields that take the most memory are doc IDs and invoice IDs; unfortunately, those fields have to be kept for the current year (~1 million records). I have already aggregated the historical data.

Not sure if there are other solutions to reduce the data model size.

pjg
Frequent Visitor

We had an error like this in our test environment, but the report worked fine in our Premium live environment.
I'm not sure if it was the exact same message, as I didn't record it and it was a couple of months ago, but the report PBIX was just over 1 GB in size.
We reduced the dataset size by importing only the fields from each data table that are actually used in the report. Some fields had been imported by the developer but never used. A couple of them contained a lot of data, like ASCII-encoded attachments, so this worked well. As an extra step, we also excluded very old historical data and kept only a rolling 3 years in the report.
After we dropped the unused fields and limited the number of years of data, the report has been working fine ever since and is about a third of its previous size.
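A minimal M sketch of that rolling window, assuming an OrderDate column (the names and cutoff logic are illustrative):

```m
let
    Source = Table.FromRecords({
        [OrderDate = #date(2019, 6, 1)],
        [OrderDate = #date(2024, 6, 1)]
    }),
    // Keep only rows inside a rolling 3-year window ending today.
    Cutoff = Date.AddYears(Date.From(DateTime.LocalNow()), -3),
    Recent = Table.SelectRows(Source, each [OrderDate] >= Cutoff)
in
    Recent
```

Because the filter runs at refresh time, the old rows never land in the model, so both the PBIX size and the refresh memory footprint shrink.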
I'm guessing you have tried this, as the error message points you in this direction, but I'm posting it just in case, since you hadn't said what you had tried.
Hope it helps.
