Hello everyone,
I have a question about semantic model refresh in the Power BI Service. I have an FT1 (trial) capacity and I am trying to refresh a semantic model that is connected to a data lake. The table in the data lake is more than 40 GB in size. However, I am getting the following error:
Data source error: Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 25726 MB, memory limit 25597 MB, database size before command execution 2 MB. See https://go.microsoft.com/fwlink/?linkid=2159753 to learn more. Table: sales.
That being said, how does refresh memory work? Can I only refresh semantic models whose size is less than the maximum memory available for the capacity?
Thanks
Hi, @p_da
While dataset size matters, it is the memory required during refresh that counts. Even if the source table is 40 GB, Power BI will not necessarily need the full 40 GB of model memory, because imported data is compressed. Refresh still needs substantial headroom, though: with the default transactional refresh, the old copy of the data stays in memory until the new copy is committed, and transformations and complex queries add further overhead. A trial capacity is equivalent to F64, whose per-model memory limit is 25 GB, which matches the 25,597 MB limit in your error message.
Reduce dataset size: Consider filtering your data to include only relevant periods (for example, focusing only on the last 12 months).
Incremental refresh: If applicable, implement incremental refresh so the entire dataset is not reloaded each time. Incremental refresh loads only new or changed data, reducing memory usage.
Simplify calculations: Complex calculated columns and measures can increase memory usage during refresh. Simplify these calculations as much as possible, or move them upstream to the data source.
DirectQuery: For very large datasets, consider DirectQuery mode, which keeps the data in the source system and queries it only when needed. This avoids refresh memory consumption entirely, but it has other trade-offs (for example, slower report performance during interactions). After applying any of these changes, you can check whether a refresh attempt succeeded, and see the full error if it did not, with the sketch below.
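Here is a minimal sketch (Python, using the requests library) for inspecting recent refresh attempts through the Power BI REST API. The workspace and dataset GUIDs are placeholders, and I'm assuming you already have an Azure AD access token with permission to read the dataset:

```python
# Minimal sketch: list recent refresh attempts for a semantic model.
# WORKSPACE_ID, DATASET_ID, and the token are placeholders.
import requests

ACCESS_TOKEN = "<azure-ad-token>"   # e.g. acquired via MSAL
WORKSPACE_ID = "<workspace-guid>"   # placeholder
DATASET_ID = "<dataset-guid>"       # placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes?$top=5"
)

resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

for refresh in resp.json()["value"]:
    # status is e.g. "Completed" or "Failed"; on a failed refresh,
    # serviceExceptionJson should carry the error detail (such as the
    # Resource Governing message above).
    print(refresh["refreshType"], refresh["status"],
          refresh.get("serviceExceptionJson", ""))
```

On a failed attempt, serviceExceptionJson should contain the same Resource Governing detail you posted, so you can compare outcomes across attempts as you apply the changes above.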
I would suggest checking out this article to understand memory consumption during data refresh operations.
In addition to @hackcrr's suggestion, I would like to suggest one more approach: you can use the enhanced refresh operation with the commit mode set to 'partialBatch', which can reduce memory consumption.
Please read the complete blog before testing.
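For reference, a minimal sketch of that call in Python with the requests library. The GUIDs and the token are placeholders (I'm assuming you acquire a token via MSAL or similar), and the full set of request body options is covered in the blog:

```python
# Minimal sketch: enhanced refresh with commitMode 'partialBatch' via the
# Power BI REST API. WORKSPACE_ID, DATASET_ID, and the token are placeholders.
import requests

ACCESS_TOKEN = "<azure-ad-token>"   # e.g. acquired via MSAL
WORKSPACE_ID = "<workspace-guid>"   # placeholder
DATASET_ID = "<dataset-guid>"       # placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes"
)

body = {
    "type": "full",
    # partialBatch commits data in batches instead of one transaction, so
    # the old and new copies of the whole model never coexist in memory.
    "commitMode": "partialBatch",
    "maxParallelism": 2,  # fewer parallel partition loads -> lower peak memory
    "retryCount": 1,
    # Optionally restrict the refresh to specific tables or partitions:
    "objects": [{"table": "sales"}],
}

resp = requests.post(
    url, json=body, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}
)
resp.raise_for_status()
# Enhanced refresh is asynchronous; the Location header should point at the
# new refresh entry, which you can poll for status.
print("Refresh accepted:", resp.headers.get("Location"))
```

Lowering maxParallelism and restricting the refresh to specific tables are both ways to trade refresh duration for a smaller memory peak.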
Thanks for sharing the links, I will have a look at them to better understand refresh memory consumption for Power BI semantic models.
Thanks, I really appreciate your feedback. I will run some tests.