I am running into memory issues (see the error message below). Dataset ABC uses incremental refresh for the bigger tables and has been working great for over a year. It keeps a rolling 6 months of data, so its size stays relatively stable. The same applies to another dataset (XYZ), which is still working fine and whose size has also been consistent for over a year. Both ABC and XYZ are roughly the same size (~3 GB).
So why, all of a sudden, has dataset ABC started consuming more memory during refresh when its metadata hasn't changed and the amount of data imported remains the same? Funny enough, it works 100% of the time when processed through ALM Toolkit or SSMS; no memory error there. We have analyzed the VertiPaq metrics and everything looks as expected. We are on a Gen2 Preview capacity with a P1 SKU. Any ideas on how to identify what is consuming that much memory when ABC starts refreshing? FYI, the Gen2 utilization metrics app is no help here and doesn't show what is consuming the memory. I don't know if DAX Studio can profile a Power BI refresh?
Data source error: Resource Governing: This operation was canceled because there wasn’t enough memory to finish running it. Either increase the memory of the Premium capacity where this dataset is hosted or reduce the memory footprint of your dataset by doing things like limiting the amount of imported data. More details: consumed memory 23883 MB, memory limit 23844 MB, database size before command execution 1755 MB. Learn more, see https://go.microsoft.com/fwlink/?linkid=2159753.
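For what it's worth, the SSMS refresh I run boils down to a TMSL command over the XMLA endpoint. My guess at why that route always succeeds is that it processes sequentially, while the Service refreshes partitions in parallel and drives up peak memory. A minimal sketch of such a refresh with parallelism explicitly capped, assuming XMLA read/write is enabled on the capacity (ABC is just the dataset name from above):

```json
{
  "sequence": {
    "maxParallelism": 1,
    "operations": [
      {
        "refresh": {
          "type": "full",
          "objects": [
            { "database": "ABC" }
          ]
        }
      }
    ]
  }
}
```

If parallelism really is the difference, capping it this way should hold the peak under the limit shown in the error.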
This article might be of some use to you: https://dax.tips/2021/02/15/visualise-your-power-bi-refresh/
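On the DAX Studio question: both DAX Studio and SSMS can connect to the workspace's XMLA endpoint, so one rough way to see where the memory goes is to poll the object memory DMV while ABC is refreshing. A sketch, assuming XMLA connectivity is enabled on your capacity:

```sql
SELECT *
FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE
```

Running that a few times during the refresh and comparing the per-object memory columns between runs should point at which tables or dictionaries balloon; the article above covers visualizing the refresh timeline itself.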