I'm a beginner with Power BI, so I'll try my best to explain the problem in detail. If more info or explanation is needed, please ask.
Recently at my company we had a problem refreshing one of our datasets; SOMETIMES it would fail with this error:
"Data source error: Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory MB, memory limit MB, database size before command execution MB. See https://go.microsoft.com/fwlink/?linkid=2159753 to learn more. Table: Facts."
Of course, it clearly states it ran out of memory. We host our datasets in Power BI Embedded, and this particular dataset sits on an A2-size capacity, which has 5 GB of RAM.
The dataset itself, according to Tabular Editor's VertiPaq Analyzer, takes 2.48 GB.
Here's question No. 1: does the dataset require "double" the space during a refresh, since it keeps the "old" copy while building the refreshed one? But even so, it should just about fit within the A2-size Power BI Embedded memory.
So the question was born: how much memory do we need during the refresh process? I searched for an answer and found that I can run SQL Server Profiler connected to the workspace via the XMLA endpoint to trace what is going on with the dataset.
I successfully connected and traced some smaller models, since the 2.48 GB one from above takes 35 minutes to refresh.
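(Side note: I believe the per-table sizes that VertiPaq Analyzer sums up can also be pulled over the same XMLA endpoint with a DMV query. The sketch below is just a rough cross-check under a few assumptions: the pyadomd package plus a locally installed ADOMD.NET client library, placeholder workspace/dataset names, and it only adds up column segments, not dictionaries, so it understates the full model size.)

```python
# Minimal sketch: sum per-table column-segment sizes over the XMLA endpoint.
# Assumes pyadomd is installed and the ADOMD.NET client DLLs exist locally;
# the workspace and dataset names are placeholders.
import sys
from collections import defaultdict

sys.path.append(r"C:\Program Files\Microsoft.NET\ADOMD.NET\160")  # ADOMD.NET path (may differ)
from pyadomd import Pyadomd

conn_str = (
    "Data Source=powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace;"  # workspace XMLA endpoint
    "Initial Catalog=MyDataset;"                                     # dataset (database) name
)

# DMVs don't support GROUP BY, so aggregate the segment sizes client-side.
query = "SELECT DIMENSION_NAME, USED_SIZE FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS"

sizes = defaultdict(int)
with Pyadomd(conn_str) as conn:
    with conn.cursor().execute(query) as cur:
        for table_name, used_size in cur.fetchall():
            sizes[table_name] += used_size

for table_name, total in sorted(sizes.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{table_name}: {total / 1024**2:.1f} MB")
```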
I took two models for tests:
For one model, SQL Profiler showed peak memory usage of 93,603 KB, which is about 93.6 MB! Is that possible? It's almost 20x more than the model itself! If that number is real, how was the 2.48 GB model able to refresh at all? In theory it would require around 50 GB of memory (20x more)...
For the other model, SQL Profiler reports a PeakMemory of 277,084 KB, i.e. about 277 MB! Now 10x more than the model itself according to VertiPaq Analyzer in Tabular Editor. Same question: how was the 2.48 GB model able to refresh, then?
To help with the above, I was using Chris Webb's blog:
So can someone explain to me whether I'm collecting the right stats, and if so, how the refresh works and how much memory it needs?
Hi @Kris_KB ,
Memory and storage are different. Memory utilisation increases with the number of transformation steps.
Always try to filter your data first, and apply your functions (grouping, sum, etc.) afterwards.
Thanks,
Sai Teja
question No. 1: does the dataset require "double" the space during a refresh, since it keeps the "old" copy while building the refreshed one? But even so, it should just about fit within the A2-size Power BI Embedded memory.
Yes, it needs twice as much space (plus a bit). And no, it will no longer fit if you refresh the entire dataset. Try refreshing individual partitions.
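If it helps, individual partitions can be refreshed through the enhanced refresh REST API instead of triggering a full dataset refresh. Below is a minimal Python sketch; the workspace/dataset GUIDs, the access token, and the partition name "Facts-2023" are placeholders you would replace with your own (only the table name "Facts" comes from your error message).

```python
# Minimal sketch of a partition-level refresh via the Power BI enhanced refresh REST API.
# The GUIDs, token and partition name below are placeholders.
import requests

group_id = "<workspace-guid>"    # workspace (group) ID
dataset_id = "<dataset-guid>"    # dataset ID
token = "<aad-access-token>"     # Azure AD token with dataset write permissions

url = f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes"

body = {
    "type": "full",
    "commitMode": "transactional",
    "maxParallelism": 1,  # fewer parallel jobs also keeps peak memory down
    "objects": [
        {"table": "Facts", "partition": "Facts-2023"}  # refresh one partition at a time
    ],
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
print(resp.status_code)  # 202 Accepted: the asynchronous refresh has been queued
```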