Hello,
I hope you are doing well.
Let me begin by giving you the context of my client's environment:
- P1 Premium Capacity
- Query Memory Limit (%) set to 10%
I have a semantic model B that is connected via DirectQuery to semantic model A. One of our key users connects to semantic model B (Live Connection) to create his visuals. When doing so, he managed to bypass our Query Memory Limit of 10% (he even reached 45% of the capacity over a 30-second window).
So I have two questions:
- How is this possible?
- How can we make sure, for this use case, that our key users do not bypass the query limit?
Thank you very much for your help!
Hey @RG01,
A few things to check/consider:
- The Query Memory Limit applies at the semantic model level (here, model A), but when models are chained (B connected to A via DirectQuery), queries can be pushed down differently and may not fully respect the limit.
- Composite/chained models often generate more complex queries, which can exceed the configured threshold.
- To keep this under control, review the report design (limit unrestricted visuals, avoid high-cardinality fields).
- Apply row-level security filters or perspectives to reduce query size for end users.
- Monitor with the Capacity Metrics App to track which queries are spiking usage (see the sketch after this list for one way to pull related activity programmatically).
- If it is critical, review the capacity's workload settings or restrict which users can build unrestricted Live Connection visuals.
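To complement the Capacity Metrics App, here is a minimal Python sketch that pulls Power BI activity events from the admin REST API so you can see who was querying around the time of a spike. It assumes you already have an admin-scoped Azure AD access token (acquisition not shown), and the activity names you filter on are only examples; the Capacity Metrics App remains the authoritative source for per-query memory figures.

```python
# Hedged sketch: pull one UTC day of Power BI activity events to correlate with
# spikes seen in the Capacity Metrics App. Assumes an access token with Power BI
# admin permissions is already available (e.g. obtained via MSAL; not shown here).
import requests

ACCESS_TOKEN = "<admin-scoped access token>"  # assumption: obtained separately
BASE = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

# The API expects quoted ISO timestamps within the same UTC day.
params = {
    "startDateTime": "'2025-01-15T00:00:00'",
    "endDateTime": "'2025-01-15T23:59:59'",
}
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

events = []
url, first = BASE, True
while url:
    resp = requests.get(url, headers=headers, params=params if first else None)
    resp.raise_for_status()
    payload = resp.json()
    events.extend(payload.get("activityEventEntities", []))
    # Follow the continuation URI until the result set is exhausted.
    url, first = payload.get("continuationUri"), False

# Rough filter: report-viewing activity that could be hitting the chained model.
for e in events:
    if e.get("Activity") == "ViewReport":  # adjust to the activities you care about
        print(e.get("UserId"), e.get("Activity"), e.get("ItemName"), e.get("CreationTime"))
```

This only tells you who was active and when; cross-reference it with the Capacity Metrics App timeline to identify the specific visuals or users driving the memory spikes.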
If this fixed your issue, please mark it as the solution and share it to help others!
Best Regards,
Jainesh Poojara | Power BI Developer
Hi @RG01,
May I ask if you have resolved this issue? Please let us know if you have any further issues; we are happy to help.
Thank you.
Hi @RG01,
We’d like to follow up regarding the recent concern. Kindly confirm whether the issue has been resolved, or if further assistance is still required. We are available to support you and are committed to helping you reach a resolution.
Best Regards,
Chaithra E.
Hi @RG01
From my understanding of the Query Memory Limit, it only applies to semantic models where the data has been imported, so all of the querying happens inside that semantic model and the limit can be enforced there. When you use DirectQuery, the query is not actually executed in your semantic model but in the DirectQuery source (here, the upstream model), which is why it bypasses the query limit. Unfortunately, you will still consume capacity for the usage of the reports.
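One way to see this routing concretely: model B's only data source is model A, so any query issued against B is ultimately executed by A's engine. The sketch below (Python, using the Power BI REST API's Get Datasources In Group call) lists model B's data sources. The workspace ID, dataset ID, and token are placeholders, and exactly how the upstream model is reported (typically as an Analysis Services connection to a powerbi:// endpoint) may vary by setup.

```python
# Hedged sketch: list the data sources of the downstream model (B) to confirm that
# its source is the upstream semantic model (A), i.e. queries against B are executed
# by A's engine. IDs and the token are placeholders (assumptions).
import requests

ACCESS_TOKEN = "<access token with Dataset.Read.All>"  # assumption: obtained separately
WORKSPACE_ID = "<workspace id of model B>"             # placeholder
DATASET_ID = "<dataset id of model B>"                 # placeholder

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/datasources"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

for ds in resp.json().get("value", []):
    # A DirectQuery connection to another semantic model usually shows up as an
    # Analysis Services source pointing at a powerbi:// endpoint (may vary).
    print(ds.get("datasourceType"), ds.get("connectionDetails"))
```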