I’m seeking advice on improving the performance of my Power BI reports, which currently take 30 to 50 seconds to load or refresh. Here's the detailed context of my setup:
Your guidance on how to address these challenges would be invaluable. Thank you in advance for your insights!
Hi @MicrosoftMaster,
I think this is a normal loading time given the amount of data in your report, the calculations, and the design.
Is it possible to reduce the amount of stored data and the number of calculated fields? Alternatively, you could switch to a live connection and push the DAX expressions to the database side.
Regards,
Xiaoxin Sheng
Hi,
Thank you for your suggestion. Unfortunately, I’m unable to reduce the stored data amount or move the DAX expressions to the database side. As per our team's procedures, all measures must remain in the dataset, and we use a live connection for our reporting setup. This approach ensures consistency across multiple reports and datasets.
If you have any alternative recommendations that work within these constraints, I’d be happy to explore them!
1. Install DAX Studio.
2. Identify the visuals that load the slowest.
3. Use Performance Analyzer to grab the DAX queries.
4. Evaluate those queries in DAX Studio.
5. Examine the query plan and server timings.
6. Refactor the queries.
7. Go back to step 2.
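To make steps 3–5 concrete, here is a minimal sketch of a captured visual query pasted into DAX Studio; every name in it (FactCashflow, 'Date'[Year], [Gross IRR]) is a placeholder, and the query Performance Analyzer actually captures from your slow visual will look different:

// Query sketch for DAX Studio; every name here is a placeholder.
DEFINE
    // A query-scoped measure lets you experiment with a rewrite and compare
    // server timings without touching the published dataset.
    MEASURE FactCashflow[Gross IRR (test)] =
        [Gross IRR]    // paste the real measure body here and refactor it
EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Year],
    "Gross IRR (test)", [Gross IRR (test)]
)

Running the captured query this way lets you compare the server timings of the original and the refactored measure side by side before changing the dataset itself.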
Hi @lbendlin
Thank you for the guidance. I had already used DAX Studio to analyze performance. Specifically, I identified that the measures for financial metrics such as Gross IRR and PME, and Net IRR and PME, are the main contributors to the load time. When calculating differences (e.g., Previous - Current), these metrics add significant overhead, around 7 seconds.
Here are some observations:
I’ve attempted optimizations to the best of my knowledge, but I’m unsure whether further improvements are possible. Since I’m relatively new to Power BI, I’d like to confirm whether a total load time of 50 seconds is typical for reports of this scale and complexity, or whether there are additional steps I can take.
Additionally, I need to be able to tell my team whether this performance is reasonable or whether significant improvements are still achievable. Your insights on this would be greatly appreciated!
Additionally, is there any official assistance available for Power BI users in such situations to help identify and address performance issues?
Anything above 5 seconds leads to a bad user experience. Continue learning how to use DAX Studio to optimize your measures (SQLBI.com has tons of training videos on that topic), and refactor your queries so they use fewer resources.
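As one illustration of the kind of refactor meant here (the measure names are hypothetical and the real IRR/PME logic is not shown), a common pattern is to store the result of an expensive measure in variables so it is evaluated only once per cell when computing a Previous - Current difference:

// Hypothetical measure; [Net IRR] and [Net IRR Previous] are placeholders
// standing in for your expensive base measures.
Net IRR Change % :=
VAR CurrentValue  = [Net IRR]
VAR PreviousValue = [Net IRR Previous]
RETURN
    // Without the variables, [Net IRR Previous] would appear twice
    // (numerator and denominator) and could be evaluated twice per cell.
    DIVIDE ( CurrentValue - PreviousValue, PreviousValue )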
Thank you for your feedback! I’ll definitely explore SQLBI.com.
I wanted to ask for clarification regarding backend solutions. For instance, in some applications, performance can be improved by adding resources like CPU, memory, or storage. In the case of Power BI, are performance issues typically resolved solely by query optimization, or is there scope to address them by adjusting capacity or increasing resources?
Currently, I’m using a Fabric capacity for Power BI.
It depends on the connection type. For import mode data sources, your only option is to optimize the DAX code. For DirectQuery data sources, you can also look at the SQL code generated by the queries and apply optimizations at the source (indexes, statistics, etc.).