Hello,
We recently developed a new report that has been returning this error:

Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 2803 MB, memory limit 2803 MB, database size before command execution 268 MB. See https://go.microsoft.com/fwlink/?linkid=2159753 to learn more.

From what I have been able to read, and if I'm not mistaken, this is caused by some of the measures and calculations exceeding the memory limit. We currently run an Azure service with Power BI Embedded on the A1 SKU. Would upgrading the SKU to A2 resolve this error, or is there another way to solve it? We have already reduced the report to the bare minimum of fields used and it still returns the error when refreshing, so we are considering the upgrade.
Solved! Go to Solution.
I think an A2 has a maximum of 2 GB of memory available, so that might still not be enough either.
Do you know which query is causing this?
Try to reduce the number of visuals and pinpoint a DAX expression that might be heavy.
I would also suggest running the external tool Measure Killer to remove all unused columns and measures from your model and improve overall performance.
I have already run Measure Killer and I have 100% usage for all measures and fields in the query.
The only thing I can't seem to find is which are the heaviest measures, so I can try to optimize them. Is there a way to find the weight of each measure?
Yes, but "heaviest" is quite relative.
In Measure Killer we have the "What if Analysis", which shows you that. But you might also want to look at the query times of those measures via DAX Studio etc. to see how long they actually take to run. It would be a good idea to test one measure at a time like this.
I would do it in the browser in the Power BI Service, because there you will see when you hit the memory limit. So try to reduce the filters/measures in the visual that is producing this error and see when it happens, to pinpoint the bottleneck.
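To test one measure at a time in DAX Studio, a minimal query like the sketch below can help (run with Server Timings enabled to see duration and the storage engine vs. formula engine split). Note that `[YourMeasure]` and the `'Date'[Year]` grouping column are placeholders, not names from your model:

```
-- Run in DAX Studio against the model, with Server Timings on.
-- ROW evaluates a single measure in isolation, with no visual overhead.
EVALUATE
ROW ( "Result", [YourMeasure] )    -- [YourMeasure] is a placeholder

-- To approximate what a visual does, add a grouping with SUMMARIZECOLUMNS:
EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Year],                  -- placeholder grouping column
    "Result", [YourMeasure]        -- placeholder measure
)
```

Comparing the timings of each measure run this way should show which one is driving the memory spike.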