
New Member

monitor memory usage during dataset refresh / SQL Profiler / weird PeakMemory usage metrics

I'm a beginner with PBI, so I'll try my best to explain the problem in detail. If more info or explanation is needed, please ask.

So in my company we recently had a problem refreshing one of the datasets. SOMETIMES it failed with this error:

"Data source error: Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory MB, memory limit MB, database size before command execution MB. See to learn more. Table: Facts."


Of course, it clearly states it ran out of memory. We host our datasets in Power BI Embedded; this particular dataset sits on an A2 capacity, so 5 GB of RAM:



The dataset itself, according to Tabular Editor's VertiPaq Analyzer, takes 2.48 GB:



Here's question No. 1: does the dataset require "double" the space during a refresh, since it holds the "old" copy while creating the refreshed one? But even so, it should just fit in the A2-size Power BI Embedded memory.
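A quick back-of-the-envelope check of that "double copy" idea, assuming the engine keeps the old copy of the model in memory while it builds the new one (a sketch; the exact refresh overhead isn't documented here):

```python
# Rough memory estimate for a full refresh, assuming the engine keeps
# the old copy of the model in memory while building the new one.
DATASET_GB = 2.48      # size reported by VertiPaq Analyzer
A2_LIMIT_GB = 5.0      # A2 capacity memory limit

double_copy_gb = 2 * DATASET_GB              # old copy + new copy
headroom_gb = A2_LIMIT_GB - double_copy_gb   # what's left for working memory
print(f"two copies: {double_copy_gb:.2f} GB of {A2_LIMIT_GB} GB")
print(f"headroom:   {headroom_gb:.2f} GB")
```

Two copies alone come to 4.96 GB, leaving almost no headroom for refresh working memory (row buffers, dictionaries being rebuilt, concurrent queries), which would explain why the refresh only sometimes fails.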


So the question arose: how much memory do we need during the refresh process? Looking for the answer, I found that I can run SQL Server Profiler connected to the workspace via the XMLA endpoint to trace what is going on with the dataset.
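For anyone repeating this: the XMLA endpoint address for a Power BI workspace follows the pattern `powerbi://api.powerbi.com/v1.0/myorg/<WorkspaceName>`, which you paste as the server name in SQL Server Profiler, SSMS, or Tabular Editor. A tiny helper to build it (the workspace name here is a placeholder):

```python
# Build the XMLA endpoint address to point SQL Server Profiler at.
# "My Workspace Name" is a placeholder; spaces must be URL-encoded.
from urllib.parse import quote

def xmla_endpoint(workspace: str) -> str:
    return "powerbi://api.powerbi.com/v1.0/myorg/" + quote(workspace)

print(xmla_endpoint("My Workspace Name"))
# powerbi://api.powerbi.com/v1.0/myorg/My%20Workspace%20Name
```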




I successfully connected and traced some smaller models, since the 2.48 GB one from above takes 35 minutes to refresh.


I took two models for tests:

  • 1. A tiny 5.74 MB model.



SQL Profiler showed PeakMemory usage of 93603 KB, which is 93.6 MB! Is that possible? That's about 16x more than the dataset itself! If the number is real, how was the 2.48 GB model able to refresh? At the same ratio it would require around 40 GB of memory...




  • 2. Same thing with the second dataset I tested. Tabular Editor reports its size as 27.7 MB.



Here SQL Profiler reports PeakMemory: 277084 KB = 277 MB! Now 10x more than the dataset itself according to VertiPaq Analyzer from Tabular Editor. Same question: how was the 2.48 GB model able to refresh, then?
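Putting the two trace results side by side (just the arithmetic on the numbers above, with KB/MB taken as decimal, x1000, to match the quoted figures) shows the peak-to-size ratio already shrinking as the model grows, so extrapolating the tiny model's ratio to the 2.48 GB model likely overestimates:

```python
# Peak memory from the Profiler trace vs. model size from VertiPaq Analyzer.
# KB/MB are decimal (x1000) here, matching the figures quoted above.
models = [
    ("tiny model",   5.74,  93603),   # (name, size MB, PeakMemory KB)
    ("second model", 27.7, 277084),
]
ratios = {}
for name, size_mb, peak_kb in models:
    peak_mb = peak_kb / 1000
    ratios[name] = peak_mb / size_mb
    print(f"{name}: peak {peak_mb:.1f} MB, {ratios[name]:.1f}x model size")
```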




For help with the above I was using Chris Webb's blog.


So can someone explain to me: am I collecting the right stats, and if so, how does the refresh work and how much memory does it need?



Solution Sage

Hi @Kris_KB ,


Memory and storage are different. Memory utilisation increases with the number of transformation steps.

Always try to filter your data first, and apply your functions (grouping, sum, etc.) next.



Sai Teja 


Super User

"question No. 1: Does dataset during the refresh require "doubled" space as it holds "old" copy when creating refreshed one? But even so, it should just fit A2 size PBI Embedded memory."

Yes, it needs twice as much space (plus a bit). And no, it will no longer fit if you refresh the entire dataset. Try refreshing individual partitions.
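One way to refresh a single partition is to send a TMSL refresh command through the XMLA endpoint (e.g. from SSMS or a script). A minimal sketch of such a command, built here as a Python dict just to show the shape; the database, table, and partition names are placeholders for your own model:

```python
import json

# TMSL "refresh" command targeting one partition instead of the whole
# database. All names below are placeholders for your own model.
refresh_cmd = {
    "refresh": {
        "type": "full",
        "objects": [
            {
                "database": "MyDataset",
                "table": "Facts",
                "partition": "Facts 2024",
            }
        ],
    }
}
print(json.dumps(refresh_cmd, indent=2))
```

Refreshing partitions one at a time bounds the transient working memory by a single partition's data rather than the whole table's, at the cost of a longer overall refresh.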
