Hi All,
I am trying to refresh a large dataset in a Premium capacity. I am refreshing it incrementally; the refresh succeeded for the last 4 days, but when I tried today it ran for 4 hours and then threw the error below.
"Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 27137 MB, memory limit 15897 MB, database size before command execution 9702 MB".
Can anyone assist me with this? Is there a limit or setting that needs to be changed? I have already enabled 'Large dataset storage format' for the dataset.
Thanks in advance.
Solved! Go to Solution.
Hi @Dhinesh ,
This error typically occurs when the dataset consumes more memory than the available limit. Here are a few steps you can try to mitigate this issue:
1. Reduce the memory footprint of your dataset by limiting the amount of imported data. In the Power Query Editor in Power BI Desktop, remove unnecessary columns and filter out rows you don't need.
2. Consider increasing the memory of the Premium capacity where your dataset is hosted, and try to refresh large datasets during off-peak hours.
3. If your dataset is close to half the capacity's memory size (for example, a 12-GB dataset on a 25-GB capacity), it may exceed the available memory during refresh, because a refresh keeps the existing copy of the data in memory while the new data is loaded. Using the enhanced refresh REST API or the XMLA endpoint, you can perform fine-grained refreshes of individual tables or partitions, so that the memory needed by each refresh is reduced.
Large datasets in Power BI Premium - Power BI | Microsoft Learn
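To illustrate step 3, here is a minimal sketch of requesting a fine-grained refresh through the enhanced refresh REST API. The workspace ID, dataset ID, table/partition names, and access token below are placeholders, not real values; replace them with your own.

```python
# Minimal sketch of an enhanced refresh request against the Power BI REST API.
# Workspace/dataset IDs, table/partition names, and the token are placeholders.
import json
import urllib.request


def build_refresh_request(group_id, dataset_id, objects, token):
    """Build the URL, headers, and body for an enhanced refresh call.

    `objects` limits the refresh to specific tables/partitions, so less
    memory is needed than for a refresh of the whole model at once."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {
        "type": "full",               # refresh type for the listed objects
        "commitMode": "transactional",
        "objects": objects,           # only these tables/partitions refresh
    }
    return url, headers, body


# Refresh just one partition of a hypothetical Sales table.
url, headers, body = build_refresh_request(
    group_id="<workspace-guid>",
    dataset_id="<dataset-guid>",
    objects=[{"table": "Sales", "partition": "Sales-2023-11"}],
    token="<aad-access-token>",
)

# To actually send it (requires a valid Azure AD access token):
# req = urllib.request.Request(url, data=json.dumps(body).encode(),
#                              headers=headers, method="POST")
# urllib.request.urlopen(req)
```

Splitting the refresh into several such calls, one table or partition at a time, keeps the peak memory during refresh well below what a single refresh of the entire model would need.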
Remember, even with these steps, datasets that are too large may still encounter issues during refresh. It’s always a good idea to keep your datasets as streamlined and efficient as possible.
Best regards.
Community Support Team_Caitlyn
Your issue can be resolved by limiting the amount of imported data or optimizing your dataset in other ways.
Alternatively, if you are using Power BI Premium, you may want to consider increasing the memory allocation for the Premium capacity where this dataset is hosted.
Alternatively, remove unnecessary Power BI files and datasets from the workspace to free up capacity memory.