Hi All,
I am trying to refresh a large dataset on a Premium capacity. I am refreshing it incrementally, and the refresh had succeeded for the last 4 days. When I tried to refresh today, it ran for about 4 hours and then threw the error below.
"Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 27137 MB, memory limit 15897 MB, database size before command execution 9702 MB".
Can anyone assist me with this? Is there a limitation or setting that should be enabled? I have already enabled the Large dataset storage format for this dataset.
Thanks in advance.
Hi @Dhinesh,
This error occurs when a refresh needs more memory than the capacity allows. Here are a few steps you can try to mitigate the issue:
1. Reduce the amount of imported data. In the Power Query Editor in Power BI Desktop, remove any unnecessary columns and filter out rows you don't need.
2. Consider increasing the memory of the Premium capacity where your dataset is hosted, and try to refresh large datasets during off-peak hours.
3. If your dataset is close to half the capacity's memory size (for example, a 12-GB dataset on a 25-GB capacity), it may exceed the available memory during refreshes. Using the enhanced refresh REST API or the XMLA endpoint, you can perform fine-grained refreshes of individual tables or partitions, so that the memory needed by each refresh is reduced.
Large datasets in Power BI Premium - Power BI | Microsoft Learn
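To illustrate the enhanced refresh REST API route from step 3, here is a minimal Python sketch that builds a per-partition refresh request. The group/dataset IDs and the table/partition names are placeholders, and acquiring the Azure AD bearer token is out of scope here:

```python
"""Sketch: fine-grained dataset refresh via the Power BI enhanced
refresh REST API. IDs, table, and partition names are placeholders."""
import json
from urllib import request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id: str, dataset_id: str) -> str:
    # Enhanced refresh endpoint: POST .../datasets/{id}/refreshes
    return f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def build_refresh_body(objects, refresh_type="full",
                       commit_mode="transactional"):
    """Listing specific tables/partitions in 'objects' keeps each
    refresh small, so peak memory stays well below a full-model load."""
    return {"type": refresh_type,
            "commitMode": commit_mode,
            "objects": objects}

def start_refresh(group_id, dataset_id, objects, token):
    """Fire the refresh request (requires a valid bearer token)."""
    req = request.Request(
        refresh_url(group_id, dataset_id),
        data=json.dumps(build_refresh_body(objects)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    return request.urlopen(req)  # service returns 202 Accepted

# Example: refresh only one partition instead of the whole model.
body = build_refresh_body([{"table": "Sales", "partition": "Sales-2024"}])
print(json.dumps(body, indent=2))
```

You can loop over partitions one at a time this way, which is how incremental-refresh models are typically nursed through memory limits.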
Remember, even with these steps, datasets that are too large may still encounter issues during refresh. It’s always a good idea to keep your datasets as streamlined and efficient as possible.
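For the XMLA-endpoint option in step 3, the same fine-grained refresh can be expressed as a TMSL command and run from SQL Server Management Studio connected to the workspace's XMLA endpoint. The database, table, and partition names below are placeholders:

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyLargeDataset",
        "table": "Sales",
        "partition": "Sales-2024"
      }
    ]
  }
}
```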
Best regards.
Community Support Team_Caitlyn
Your issue can be addressed by limiting the amount of imported data or optimizing your dataset in other ways.
Alternatively, if you are using Power BI Premium, you may want to consider increasing the memory allocation of the Premium capacity where this dataset is hosted.
You could also remove unnecessary Power BI files and datasets from the workspace to free up capacity memory.