I have created a report using Direct Query for 8 SQL queries, which are related to each other in a star schema. When I run the individual queries in SSMS they're all done within a second (returning no more than about 3,000 rows per query, nothing too fancy or large). However, when the query runs from Power BI in a matrix visual it takes several minutes to complete, causing the report in the Service to "exceed the available resources".
In the Desktop version the query manages to complete, but it also takes far too long, about 7 minutes(!). I used the Performance Analyzer to see what's happening under the hood of the visual and noticed that the SQL query Power BI sends to the database looks like this:
SELECT TOP (1000001) *
FROM (
    SELECT date, location, SUM(output)
    FROM ( query 1 - my fact table )
    LEFT JOIN ( query 2 - my dimension table )
    WHERE condition
    GROUP BY date, location
)
Running this particular query directly on the database, perhaps unsurprisingly, takes as long as it does in the Service, so I'm inclined to believe that this Power BI-generated query is the main cause of the slowness.
What I don't understand is how my fast-running SQL query turns into this slow monster and how I can fix this. Does anyone have any suggestions to speed things up or clarifications about the inner workings of Direct Query models?
Solved! Go to Solution.
Hi @Anonymous ,
Has the advice given by @bcdobbs resolved your issue? If so, please mark that reply as the accepted solution to help other members find it more quickly. If not, please let us know.
Looking forward to your feedback.
Best Regards,
Henry
Hi @Anonymous,
Am I correct that your tables are themselves generated by SQL queries, rather than the model pointing directly at physical tables?
There's nothing you can do Power BI side to change the above. The query it's generating is "reasonable" based on what it's provided with. Have a read of: DirectQuery model guidance in Power BI Desktop - Power BI | Microsoft Docs
I think key point here is:
"Ensure required data transformations are materialized".
It sounds like you're transforming on the fly in native SQL statements?
Probably the best thing to look at is creating indexed views to effectively materialise the data.
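As a rough illustration of what that could look like (all table, view, and column names here are invented, so adapt them to your schema):

```sql
-- Hypothetical sketch: materialise an aggregated fact query as an indexed view.
-- WITH SCHEMABINDING and two-part names are required before a view can be indexed.
CREATE VIEW dbo.vw_FactOutput
WITH SCHEMABINDING
AS
SELECT f.date,
       f.location,
       SUM(f.output) AS total_output,
       COUNT_BIG(*)  AS row_count   -- required in an indexed view that uses GROUP BY
FROM dbo.FactTable AS f
GROUP BY f.date, f.location;
GO

-- The unique clustered index is what physically materialises the view's result set.
CREATE UNIQUE CLUSTERED INDEX IX_vw_FactOutput
ON dbo.vw_FactOutput (date, location);
```

Once the clustered index exists, SQL Server stores the view's rows on disk like a table, so joins against it no longer re-run the underlying transformation.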
Hello @bcdobbs ,
Thanks so much for your reply! You are spot on that the tables that make up my report are generated by SQL queries, consisting of several joins of tables in the native database.
I've read that particular article before I posted this here, but it looks like I didn't fully understand what they mean by "materializing" data transformations. In my view I was doing that by transforming the data in my native SQL queries, ensuring that the data was all ready when loaded to Power BI. Is that the way to go or is that not what they mean by that statement?
So if you were importing the data it would be fine. However, in Direct Query everything is still sent back to SQL Server to process.
Have a look at the query plan that SQL Server uses when executing the join across the two sub queries (Display an Actual Execution Plan - SQL Server | Microsoft Docs). It might give you a clue.
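If it helps, you can also capture I/O and timing statistics from T-SQL while running the generated query in SSMS; a minimal sketch (the query itself is a placeholder):

```sql
-- Report per-table logical reads and CPU/elapsed time in the Messages tab.
SET STATISTICS IO, TIME ON;

-- ... paste and run the Power BI-generated query here ...

SET STATISTICS IO, TIME OFF;
```

A table showing a very high logical-read count is usually the one being repeatedly scanned in the join.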
There is some good advice in here: sql server - SQL Query performance that use subquery and joins - Stack Overflow
It's possible that adding some indexes to the base tables might help the joins. E.g. the individual queries are fast, but in order to join them SQL Server has to do a full scan of the right-hand table for every row of the left (it certainly sounds like that's what's happening).
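As a hypothetical example (index, table, and column names invented), an index on the dimension table's join key can turn those repeated scans into seeks:

```sql
-- Hypothetical: index the dimension table on the column used in the join,
-- INCLUDE-ing the columns the query actually selects so the index covers it.
CREATE NONCLUSTERED INDEX IX_DimLocation_LocationKey
ON dbo.DimLocation (location_key)
INCLUDE (location);
```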
An indexed view lets you save each query as a view and also add indexes to assist in joining those views, which makes it more explicit what you want SQL Server to do.
Sorry if that's vague. Without seeing your setup it's a hard one to diagnose.
Awesome advice! I really wasn't sure where to look (our Service capacity, DAX query or something else), but this has given me some great pointers to investigate.
Your reply is not in the least bit vague! I'm actually impressed, because I think that you understood my problem so well, even without too many details 🙂
I will look into your links tomorrow and let you know if I can make it work. Thanks a lot for now and I'll get back to you.
Great!
A few other "non-SQL" things to consider: