Hi,
I'm having trouble adding historic data (incremental refresh is already implemented) to my data model.
Unfortunately it always returns some kind of memory error, e.g. "The operation was throttled by Power BI Premium because the operation was unable to reserve enough memory" in the Power BI service, or "Memory error: You have reached the maximum allowable memory allocation for your tier. Consider upgrading to a tier with more available memory" in SSMS.
I understand that the capacity's memory is somehow not enough, but I can't imagine why this is the case, as we are already paying for Premium. I have also already excluded tables created within Power BI (besides a simple calendar) and don't have many new columns, which could otherwise drive memory consumption after data is imported and the model is recalculated. What I do have are some measures, but as I understand it, they only consume memory while the report is being used, since they are evaluated on the fly. There are also many tables, some with many millions of rows.
As upgrading to a higher capacity is pretty expensive, I would like to manage without it. Can somebody give me tips on possible reasons for high memory consumption that I could investigate?
Best,
Moritz
What I would do is open your PBIX in Power BI Desktop and refresh it, and see how much memory it consumes while refreshing.
You can monitor this in Task Manager by looking for "msmdsrv.exe", which is the Analysis Services engine. In my example below it is consuming 1.2GB of memory. If this goes above 25GB, then you are over your capacity limit.
I would then suggest using DAX Studio and the VertiPaq Analyzer to see what is taking up all the space.
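To build intuition for what VertiPaq Analyzer will show you: VertiPaq is a column store that dictionary-encodes each column, so a column's memory footprint is driven largely by its number of *distinct* values, not just row count. The sketch below is a rough, made-up approximation of that effect (it is not the real engine's compression, and the column data is invented), but it illustrates why a high-cardinality column such as a per-row timestamp or ID can dwarf a low-cardinality status column of the same length.

```python
import math

def estimate_column_bytes(values, value_size=8):
    """Very rough footprint estimate for one dictionary-encoded column:
    a dictionary of distinct values plus a per-row index of
    ceil(log2(distinct_count)) bits. Illustrative only."""
    distinct = set(values)
    dict_bytes = len(distinct) * value_size
    bits_per_row = max(1, math.ceil(math.log2(max(2, len(distinct)))))
    index_bytes = len(values) * bits_per_row / 8
    return dict_bytes + index_bytes

rows = 1_000_000
low_card = [i % 50 for i in range(rows)]   # e.g. a status code: 50 distinct values
high_card = list(range(rows))              # e.g. a unique ID: one distinct value per row

# The high-cardinality column's dictionary is as large as the data itself,
# so its estimate comes out more than an order of magnitude bigger.
print(f"low cardinality:  ~{estimate_column_bytes(low_card)/1e6:.1f} MB")
print(f"high cardinality: ~{estimate_column_bytes(high_card)/1e6:.1f} MB")
```

In practice, this is why common advice for shrinking a model includes removing unused columns, lowering datetime precision, and splitting high-cardinality columns where possible.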
Hi,
thanks for your answer.
Unfortunately the error doesn't arise while importing data into Power BI Desktop, but in the service. Some of the tables are so big that I would not be able to load them into the desktop version at all, so I filter them beforehand via SQL views that are connected through dataflows.
After failing to solve this issue myself, the Power BI support team recommended using autoscale, but unfortunately I get the same error with that as well. Do you have an idea how this can happen, or what to do?
Thanks in advance!
Best,
Moritz
Yeah, that is correct.
I would suggest trying Gen2 to see if that solves your issue.