rks
Resolver II

Monitor memory during refresh

Hi Community,

 

I am looking for ways to gain more insights into what is happening on my capacity during refreshes in terms of memory usage. I have several refreshes that fail because of insufficient memory.

 

I know there are several tools, but I am not 100% sure how to use them.

 

1) The Capacity Metrics App gives great insights into CU usage at any given point in time. That's not my problem here: there is no CU peak during my failed refresh.

2) The Log Analytics Workspace Report. I can see lots of things here, but is there anything related to memory consumption at a given point in time?

3) The Log Analytics Workspace - thanks to a KQL query and Chris Webb's blog, I believe I found that my memory peak is 7 GB: 

 

[@"PowerBIDatasetsWorkspace"]
| where OperationName == "CommandEnd" and EventText contains "PeakMemory" and TimeGenerated >= (datetime(2023-10-24 10:51:03)) and TimeGenerated <= (datetime(2023-10-24 10:57:50)) and PowerBIWorkspaceName == "Selfservice Datasets"
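
To make the peak easier to read, my sketch of how the same query could be extended to pull the number out of EventText. The extract() pattern and the ArtifactName column are assumptions on my part; the exact wording inside EventText may differ, so please check it against your own logs:

PowerBIDatasetsWorkspace
| where OperationName == "CommandEnd"
    and EventText contains "PeakMemory"
    and TimeGenerated between (datetime(2023-10-24 10:51:03) .. datetime(2023-10-24 10:57:50))
    and PowerBIWorkspaceName == "Selfservice Datasets"
// EventText embeds the peak memory as text; grab the first number after "PeakMemory" (assumed to be reported in KB)
| extend PeakMemoryKB = extract(@"PeakMemory\D{0,5}(\d+)", 1, EventText, typeof(long))
| project TimeGenerated, ArtifactName, PeakMemoryKB, EventText
| order by PeakMemoryKB desc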

 

4) The VertiPaq Analyzer - shows my model is 70 GB in size.

 

My setting:

* I have a Premium P3 capacity. I understand that each dataset can use up to 100 GB of memory, which includes compressed data as well as uncompressed data during refresh and data caches for users.

* My refreshes don't process the whole model; they are more like an incremental refresh that only refreshes certain partitions.

 

The question for me as a dataset owner: Why does my refresh fail? I know it is because of insufficient memory, but why do I have insufficient memory? 70 GB of compressed data + 7 GB memory peak is 77 GB. That should still leave the user caches enough memory to work with.

But are 23 GB for queries too little? Which queries are consuming how much memory at a given point in time? In Azure Analysis Services there was an option to see the performance monitor with metrics such as memory usage, memory limit and others.
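
For context, the closest I can get with KQL is listing which queries were running during the refresh window; per-query memory does not seem to be exposed as a simple column. The QueryEnd, ApplicationName, DurationMs and CpuTimeMs names below are assumptions based on my table's schema, so treat this as a sketch only:

PowerBIDatasetsWorkspace
| where OperationName == "QueryEnd"
    and TimeGenerated between (datetime(2023-10-24 10:51:03) .. datetime(2023-10-24 10:57:50))
    and PowerBIWorkspaceName == "Selfservice Datasets"
// Duration and CPU time are only rough proxies for the memory pressure caused by user queries
| project TimeGenerated, ApplicationName, DurationMs, CpuTimeMs, EventText
| order by CpuTimeMs desc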

 

My question to you: Have you had similar issues? What tools do you have in your toolbox to navigate memory issues? Are there even some custom templates or KQL queries available that scan the logs or use the Capacity Metrics dataset to create more insights? 

 

 

 

9 REPLIES
arvindsingh802
Super User

The mashup memory limit is 50% of the capacity memory, which is 50 GB on a P3.


The same can be found in the Capacity Metrics app.


 

This could be the reason your refreshes are failing.
I would suggest creating smaller partitions in your dataset for incremental refresh; that could help improve the refresh success rate.


If this post helps, then please consider Accept it as the solution, Appreciate your Kudos!!
Proud to be a Super User!!

Hi @arvindsingh802 

Thank you for the insight! From what I know, the mashup memory is dedicated to serving Power Query operations. Would you know whether it also applies to legacy data sources?

Besides, incremental refresh is already applied. According to the logs, the memory peak is only 7 GB and thus below the threshold.

Mashup is responsible for all types of refreshes, so yes, it applies to legacy data sources as well.
That's weird; if the max memory during refresh is only 7 GB, then it should not cause an issue.
I would recommend raising a support request with Microsoft. They would be able to provide you the exact memory consumption during refresh, as well as anything else that might be consuming memory.


If this post helps, then please consider Accept it as the solution, Appreciate your Kudos!!
Proud to be a Super User!!

Thank you. I've raised an issue. I'll update this post with whatever comes up; for now, involving support is the solution.

Yes, please do share; it would be a learning opportunity for everyone here in the community.


If this post helps, then please consider Accept it as the solution, Appreciate your Kudos!!
Proud to be a Super User!!
rks
Resolver II

Hi @GilbertQ 

 

Thank you for your input. The refresh is scheduled every hour; it fails once or twice per day and succeeds all other times. Regarding memory, what is the capacity memory limit in Gen2? (I only know the 100 GB limit per dataset.) There are two other larger datasets on the capacity, but they don't seem to cause the problem.

 

We have >100 users that query the dataset, and quite a lot of them connect via Excel (according to the metrics app). They are my main suspects: I believe they cause high memory consumption with certain queries during the refresh.

 

Which leads me to the question: is there a setting that can be enabled to kill queries in order to give priority to refresh operations?

 

 

The 100 GB is for your whole capacity and not just one dataset. All the resources are shared across the capacity. That would also be in line with it hitting the limit only sometimes (when people are more active in your tenant, maybe?).

 

The whole P3 capacity has 100 GB of RAM, I think.

 

I don't know if you can programmatically kill those MDX queries based on consumption. You could turn it off altogether, which is not feasible I guess 😉

 

Did you think about cutting down the size of this dataset? Maybe not all of the 70 GB is actually what people need. Tools like Measure Killer can analyze all connected reports and make sure nothing breaks; maybe a big chunk of your data model is actually not needed...

------------------------------------
Brunner BI focusing on Microsoft Power BI development and consulting
Developers of external tool "Measure Killer"
My blog

Hi @rks 

 

You ideally do not want to kill sessions, as people are using the dataset.


What I would suggest is to look at whether you can reduce the dataset size. There are also more advanced settings you could use, such as IsAvailableInMDX, which can be set to false if a column is not shown in any visuals or is only used in a relationship.





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!







Power BI Blog

GilbertQ
Super User

Hi @rks 

 

It looks to me like even though there is 23 GB of free space, are there any other datasets consuming memory on this P3 capacity? They too would add to the total memory consumed.

 

If you refresh a smaller table, does it succeed?

If so, then it might be that some limit is being hit close to 75% overall memory usage.
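
If the other workspaces are also connected to Log Analytics, one rough way to check this is to summarize refresh peak memory per dataset over a day and see what else peaks at the same time. Same caveat as the extraction query earlier in the thread: the extract() pattern and the ArtifactName column are assumptions, so verify them against your EventText:

PowerBIDatasetsWorkspace
| where OperationName == "CommandEnd"
    and EventText contains "PeakMemory"
    and TimeGenerated >= ago(1d)
// Same assumption as before: the first number after "PeakMemory" is the peak in KB
| extend PeakMemoryKB = extract(@"PeakMemory\D{0,5}(\d+)", 1, EventText, typeof(long))
| summarize MaxPeakMemoryGB = max(PeakMemoryKB) / 1024.0 / 1024.0 by ArtifactName, bin(TimeGenerated, 1h)
| order by TimeGenerated asc, MaxPeakMemoryGB desc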





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!







Power BI Blog
