"Query Memory Limit (%)" is a setting on a Premium Capacity.
This page describes it as "The maximum percentage of available memory that can be used for temporary results in a query or DAX measure." And this page says, "Some queries/calculations can result in intermediate results that use a lot of memory on the capacity. This can cause other queries to execute very slowly, cause eviction of other datasets from the capacity, and lead to out of memory errors for other users of the capacity. Without this setting, capacity administrators would find it challenging to identify which report/query is causing the problem, so that they could work with the report author to improve the performance. With this new setting, admins can better control impact of bad or expensive report on others using the capacity."
We do have some memory-intensive reports, and our capacity can be over-used. I think this setting might be useful, but I cannot find any more information about it.
First of all, it defaults to '0' ... and I am wondering what '0' means. Does this mean no limit is applied to any given query?
And then, I am wondering what the community has discovered is a reasonable range of values to use. I feel like maybe 5 or 10 percent? I feel like a single query should not take up too many resources! But then I don't want to set an overly restrictive limit.
Oops, I just fixed a typo in my original post.
Another critical question about this which should be made clear, please, if anyone can answer it: is this a per-query limit, or is this a cap on the combined memory consumed by all queries?
E.g., say I have set "Query Memory Limit (%)" to 30, and multiple simultaneous queries/DAX measures are running as reports render. At a given time, is the combined total memory consumed for the temporary results of all those queries/DAX measures capped at 30%? Or is each query individually limited to a maximum of 30% of memory?
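For whatever it's worth, when weighing candidate percentages I've found it helps to translate them into absolute numbers. Here is a small sketch; the SKU memory sizes are my own assumption based on the published Premium capacity specs, so check the figures for your actual SKU:

```python
# Translate a "Query Memory Limit (%)" value into a rough absolute cap.
# SKU memory sizes below are assumptions from published capacity specs,
# not authoritative values -- verify against your own capacity.
SKU_MEMORY_GB = {"P1": 25, "P2": 50, "P3": 100}

def query_memory_cap_gb(sku: str, limit_pct: float) -> float:
    """Absolute memory cap (GB) implied by a percentage limit on a SKU."""
    return SKU_MEMORY_GB[sku] * limit_pct / 100

# Example: a 30% limit on a P1 capacity works out to 7.5 GB
print(query_memory_cap_gb("P1", 30))  # 7.5
```

Seeing the percentage as gigabytes makes it easier to judge whether a single query should ever plausibly need that much for temporary results.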
Thanks, @v-shex-msft ...
Our capacity does have one very large dataset with some complex reports/queries; plus several other medium/small-sized datasets. Capacity over-utilization is an issue.
Does anyone have any real-life experience using this setting? What values did you test and ultimately use?
Hi @kevhav,
#1, I also checked these Premium documents but could not find a clear description of this feature. I would guess '0' means 'no limit', or that some default limit amount is applied.
#2, It depends on your capacity usage (high resource usage versus low resource usage):
1. If only one dataset is assigned to the Premium capacity, you do not need to configure this parameter.
2. If other datasets have high resource usage, consider configuring a small value to free up the resources used by query operations/calculations.
3. If your tenant does not contain datasets that consume large amounts of memory, you can increase the value to speed up query processing. (If your dataset is not very large, you can set the parameter between 25% and 50%. If your dataset stores a very large number of records, consider configuring it below 45% to reduce the performance impact on other datasets.)
Regards,
Xiaoxin Sheng