MicrosoftMaster
Frequent Visitor

Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures

I’m seeking advice on improving the performance of my Power BI reports, which currently take 30 to 50 seconds to load or refresh. Here's the detailed context of my setup:


Report Details:

  • The report contains 40 columns:
    • 12 metrics, each with 4 associated columns (e.g., Prior Value, Current Value, Change in $, Change in %).
    • Example: For the metric "Committed," we calculate the prior value (the minimum within the selected period), the current value (the maximum within the selected period), and their difference in both $ and % (see the simplified sketch after this list).
    • Most calculations are straightforward, except for 4 metrics (e.g., Gross IRR, PME, Net IRR), which involve 40–50 lines of DAX code with complex logic.
  • It uses a Matrix visual with:
    • 4 row-level groupings.
    • 60 rows of data displayed in the Matrix.
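
For reference, the four columns for one of the simpler metrics follow roughly this pattern (a simplified sketch with hypothetical table and measure names, not our actual code):

    Committed Prior    := MIN ( Fact[Committed] )    // value at the start of the selected period
    Committed Current  := MAX ( Fact[Committed] )    // value at the end of the selected period
    Committed Change $ := [Committed Current] - [Committed Prior]
    Committed Change % := DIVIDE ( [Committed Current] - [Committed Prior], [Committed Prior] )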

Dataset Details:

  • The report connects live to a shared semantic model hosted on the Power BI Service:
    • Dataset size: 1.75 GB.
    • It includes 37 tables and 140 measures.
  • The data model is common across many reports, with multiple reports taking 30–50 seconds to load.

Key Observations:

  1. When I connect the report to a local dataset (imported version), performance improves slightly.
  2. Converting the Matrix visual to a Table visual reduces the load time by ~15 seconds, but I cannot change this visual for our use case.
  3. Data volume will continue to grow over time, and optimizing measures further is not feasible at the moment.

Questions:

  1. Given the size of the dataset (1.75 GB) and the complexity of the measures, is the current performance (30–50 seconds) reasonable? Is this load time generally acceptable for reports?
  2. Is there room for significant improvement (e.g., reducing load time to 1–5 seconds) without altering the core dataset or report structure?
  3. Are there capacity or memory-related adjustments I can explore to improve performance, such as better leveraging Power BI Service capacities?
  4. Are there any modeling or visualization best practices for handling such large datasets in live connection scenarios?

Your guidance on how to address these challenges would be invaluable. Thank you in advance for your insights!


7 REPLIES
v-shex-msft
Community Support

Hi @MicrosoftMaster,

I think this is a normal loading time given the amount of data, the calculations, and the report design.
Is it possible to reduce the amount of stored data and the number of calculated fields? Alternatively, you could try switching to a live connection mode and pushing the DAX expressions to the database side.

Regards,

Xiaoxin Sheng

Community Support Team _ Xiaoxin
If this post helps, please consider accepting it as a solution to help other members find it more quickly.

Hi,

Thank you for your suggestion. Unfortunately, I’m unable to reduce the stored data amount or move the DAX expressions to the database side. As per our team's procedures, all measures must remain in the dataset, and we use a live connection for our reporting setup. This approach ensures consistency across multiple reports and datasets.

If you have any alternative recommendations that work within these constraints, I’d be happy to explore them!

lbendlin
Super User

Install DAX Studio.  Identify the visuals that load the slowest. Use Performance Analyzer to grab the DAX queries.  Evaluate those queries in DAX Studio. Examine the query plan and server timings. Refactor the queries. Go back to step 2.
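
As a rough illustration (hypothetical measure and column names, not from this model), a query captured with Performance Analyzer has this general shape and can be pasted straight into DAX Studio with Server Timings enabled:

    // Performance Analyzer copies the exact query behind each visual; this is only the typical shape
    EVALUATE
    SUMMARIZECOLUMNS (
        'Fund'[Fund Name],
        "Gross IRR Prior",   [Gross IRR Prior],
        "Gross IRR Current", [Gross IRR Current]
    )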

Hi @lbendlin 

Thank you for the guidance. I had already used DAX Studio to analyze performance. Specifically, I identified that the measures for the financial metrics (Gross IRR and PME, Net IRR and PME) are the main contributors to the load time. When calculating differences (e.g., Previous - Current), these metrics add significant overhead, on the order of 7 seconds (a simplified sketch of the measure shape follows the observations below).

Here are some observations:

  • Each of the 4 complex metrics (Gross IRR, PME, etc.) has three variants: Previous, Current, and Change/Difference, which collectively take about 38 seconds to load.
  • Each individual set of calculations takes around 9 seconds.
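
For illustration, the Change variants follow roughly this shape (hypothetical, heavily simplified names; the real measures run 40–50 lines), using variables so each expensive base measure is evaluated only once per cell:

    // Simplified sketch, not the actual measure
    Gross IRR Change % :=
        VAR _Prior   = [Gross IRR Prior]      // expensive base measure, evaluated once
        VAR _Current = [Gross IRR Current]    // expensive base measure, evaluated once
        RETURN
            DIVIDE ( _Current - _Prior, _Prior )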

I’ve optimized these measures as far as my current knowledge allows, but I’m unsure if further improvements are possible. Since I’m relatively new to Power BI, I’d like to confirm whether a total load time of 50 seconds is typical for reports of this scale and complexity, or whether there are additional steps I can take.

Additionally, I need to be able to tell my team whether this performance is reasonable or whether significant improvements are still achievable. Your insights on this would be greatly appreciated!


Additionally, is there any official assistance available for Power BI users in such situations to help identify and address performance issues? 

Anything above 5 seconds leads to a bad user experience. Continue learning about how to use DAX Studio to optimize your measures (SQLBI.com has tons of training videos on that topic). Refactor your queries so they use fewer resources.

Thank you for your feedback! I’ll definitely explore SQLBI.com.

I wanted to ask for clarification regarding backend solutions. For instance, in some applications, performance can be improved by adding resources like CPU, memory, or storage. In the case of Power BI, are performance issues typically resolved solely by query optimization, or is there scope to address them by adjusting capacity or increasing resources?

Currently, I’m using a Fabric capacity for Power BI.

It depends on the connection type. For Import mode data sources, your only option is to optimize the DAX code. For DirectQuery data sources, you can also look at the SQL code generated by the queries and apply optimizations at the source (indexes, statistics, etc.).
