Hello, one of my reports is giving me the error below:
Memory error: You have reached the maximum allowable memory allocation for your tier. Consider upgrading to a tier with more available memory.
Earlier the same report refreshed without any error, but for the last few days I have been seeing this error during on-demand refreshes.
I have also tried a few suggestions given here in the community, but they are not working for me and I am still facing the error.
Please help me out with this error.
Thanks.
Solved! Go to Solution.
Hi @Anonymous,
You need to move from shared capacity to Premium capacity to solve this problem, because you have used more resources than the upper limit allows. Alternatively, optimize the underlying data source so the query returns less data.
You can download the .pbix file and refresh it locally to see how much memory the refresh consumes. You can monitor this in Task Manager by looking for "msmdsrv.exe", which is the Analysis Services engine.
If you exceed the allowable capacity, you will want to consider upgrading to Premium capacity and purchasing the appropriate SKU.
What is Power BI Premium Gen2? - Power BI | Microsoft Learn
Best Regards,
Community Support Team _ Ailsa Tao
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
I encountered the same error message a week ago, but I couldn't reproduce it. I even created a test data model of double the size, but got no error when I published it to the Power BI service. So I guess this was a temporary issue in the Power BI service. Could someone shed more light on this type of error?
Here is a similar thread on the issue: https://learn.microsoft.com/en-us/archive/msdn-technet-forums/94758523-b9fa-4567-b3d9-1759abb54a84
I encountered the same error message this morning. I would like to know which factors decide the RAM consumption. Is it the table sizes in the data model, or specific transformation queries in Power Query? I assume DAX measures don't impact RAM consumption, because I couldn't open any page in Power BI.
The data model itself is 77 MB; in Task Manager, the report takes over 2 GB in Power BI Desktop. Is 77 MB already a large data model?
Hi @Jeanxyz,
I am no expert and quite new to Power BI, but I have used a lot of other tools over the years.
It would seem that the dataset is the main thing to look at: reducing the size of your file will also reduce the RAM, and even CPU, used as a result.
You can reduce your report's dataset size by understanding which fields use the most data and what to remove. You can see a lot of this information in DAX Studio, but I don't have a link to the guide I used.
A field with a few values repeated over millions of rows will most likely be dictionary-encoded, and this should compress very well.
Our worst offender was a description field that could be any size and was totally free text.
Removing this and only bringing in the Short Description field, fixed at 80 characters, on all tables took hundreds of MB off the report file size and resolved our issue.
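The dictionary-compression point can be sketched with a toy model. This is only an illustration of the idea, not the actual storage format; the real VertiPaq engine layers run-length encoding and other optimizations on top:

```python
# Toy model of dictionary encoding, the scheme that column stores such
# as Power BI's VertiPaq engine use: distinct values go into a
# dictionary, and each row stores only a bit-packed index into it.

def raw_size_bytes(values):
    """Every row stores its string verbatim (assume 1 byte per character)."""
    return sum(len(v) for v in values)

def dict_encoded_size_bytes(values):
    """Dictionary of distinct strings plus a bit-packed index per row."""
    distinct = set(values)
    bits_per_row = max(1, (len(distinct) - 1).bit_length())
    return sum(len(v) for v in distinct) + bits_per_row * len(values) // 8

# ~1 million rows, but only three distinct status labels.
column = ["Open", "Closed", "Pending"] * 333_334

print(raw_size_bytes(column))           # ~5.7 MB stored verbatim
print(dict_encoded_size_bytes(column))  # ~0.25 MB dictionary-encoded
```

With only three distinct values, each row needs just a 2-bit index, which is where the large saving comes from.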
I did not design the report and would not normally do these things myself, but when looking into the design of the report model and the data fields being pulled in, I removed further fields and even tables that had been duplicating data.
Another thing causing problems in the report was DAX calculated columns. From my reading, columns created in DAX do not compress as well, whereas columns created in the Power Query transform stage do. Some things can't easily be moved to Power Query, but most of the calculated columns I looked at in our report should have been done in Power Query, reducing the file size further.
I'm not sure about DAX measures, but I would think they cost more CPU than memory. I tend to keep my DAX measures simple and do as much pre-work as possible in columns created in Power Query, so my measures mainly sum, count, or filter the result of a precomputed column.
There is a lot of reading to do on the subject, and DAX Studio can help you see what is using the most space in your dataset.
Hope this helps.
Thanks @pjg, I went through your comments. It seems there isn't much I can do to reduce the dataset size. According to DAX Studio, the fields that take the most memory are doc IDs and invoice IDs; unfortunately, those fields have to be kept for the current year (~1 million records). I have already aggregated the historical data.
Not sure if there are other solutions to reduce the data model size.
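A rough sketch of why such ID columns resist compression, assuming a simple dictionary-encoding model (the real engine is more sophisticated, but the cardinality problem is the same): when every value is distinct, the dictionary duplicates the whole column and the per-row indexes are pure overhead. The ID format below is hypothetical:

```python
# Toy dictionary-encoding model: distinct values in a dictionary,
# plus a bit-packed index per row.

def dict_encoded_size_bytes(values):
    """Dictionary of distinct strings plus a bit-packed index per row."""
    distinct = set(values)
    bits_per_row = max(1, (len(distinct) - 1).bit_length())
    return sum(len(v) for v in distinct) + bits_per_row * len(values) // 8

# One million unique invoice IDs (hypothetical format).
ids = [f"INV{n:07d}" for n in range(1_000_000)]

raw = sum(len(v) for v in ids)          # 10 MB stored verbatim
encoded = dict_encoded_size_bytes(ids)  # larger: dictionary + 20-bit indexes
print(raw, encoded)
```

In this model the only ways to shrink such a column are to drop it, shorten the values, or reduce the row count, which matches the aggregation approach already taken here.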
We had an error like this in our test environment, but the report worked fine in our Premium live environment.
I'm not sure if it was the exact same message, as I didn't record it and it was a couple of months ago, but the report PBIX was just over 1 GB in size.
We reduced the dataset size by only importing the fields from the data tables that are actually used in the report. Some fields had been imported by the developer but never used. A couple of them contained a lot of data, such as ASCII-encoded attachments, so this worked well. As an extra step, we also excluded very old historical data and kept only a rolling three years in the report.
After we dropped the unused fields and limited the number of years of data, the report has been working fine since and is about a third of its former size.
I'm guessing you have tried this, as the error message would point you in this direction, but I'm posting it just in case, since you hadn't said what you have tried.
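The two trimming steps described above (drop unused fields, keep a rolling date window) are normally applied as Power Query steps, but the idea can be sketched on plain Python records. The field names and cutoff date here are made up for illustration:

```python
from datetime import date

KEEP_FIELDS = {"invoice_id", "amount", "posting_date"}  # drop everything else
CUTOFF = date(2022, 1, 1)                               # rolling ~3-year window

def trim(records):
    """Keep only recent rows, and only the fields the report actually uses."""
    return [
        {k: r[k] for k in KEEP_FIELDS}
        for r in records
        if r["posting_date"] >= CUTOFF
    ]

rows = [
    {"invoice_id": 1, "amount": 10.0, "posting_date": date(2024, 5, 1),
     "attachment": "large ASCII-encoded blob"},
    {"invoice_id": 2, "amount": 20.0, "posting_date": date(2019, 5, 1),
     "attachment": "large ASCII-encoded blob"},
]

trimmed = trim(rows)  # the 2019 row and the attachment field are gone
print(trimmed)
```

In Power Query the equivalents are a remove-columns step and a date filter step, applied as early as possible so the excluded data is never loaded into the model.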
Hope it helps.