Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures
I’m seeking advice on improving the performance of my Power BI reports, which currently take 30 to 50 seconds to load or refresh. Here's the detailed context of my setup:
Report Details:
- The report contains 40 columns:
- 12 metrics, each with 4 associated columns (e.g., Prior Value, Current Value, Change in $, Change in %).
- Example: For the metric "Committed," we calculate the prior value (min of the selected period), the current value (max of the selected period), and their difference in both $ and % (a simplified sketch of this pattern appears right after this list).
- Most calculations are straightforward, except for 4 metrics (e.g., Gross IRR, PME, Net IRR), which involve 40–50 lines of DAX code with complex logic.
- It uses a Matrix visual with:
- 4 row-level groupings.
- 60 rows of data displayed in the Matrix.
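A simplified sketch of that prior/current/change pattern (the table and column names here are placeholders, not our actual model):

```dax
-- Placeholder names: Fact[Committed] and 'Date'[Period] stand in for the real model.
Committed Prior =
VAR FirstPeriod = MIN ( 'Date'[Period] )    -- earliest period in the current selection
RETURN
    CALCULATE ( SUM ( Fact[Committed] ), 'Date'[Period] = FirstPeriod )

Committed Current =
VAR LastPeriod = MAX ( 'Date'[Period] )     -- latest period in the current selection
RETURN
    CALCULATE ( SUM ( Fact[Committed] ), 'Date'[Period] = LastPeriod )

Committed Change $ = [Committed Current] - [Committed Prior]

Committed Change % = DIVIDE ( [Committed Change $], [Committed Prior] )    -- DIVIDE avoids divide-by-zero errors
```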
Dataset Details:
- The report connects live to a shared semantic model hosted on the Power BI Service:
- Dataset size: 1.75 GB.
- It includes 37 tables and 140 measures.
- The data model is common across many reports, with multiple reports taking 30–50 seconds to load.
Key Observations:
- When I connect the report to a local dataset (imported version), performance improves slightly.
- Converting the Matrix visual to a Table visual reduces the load time by ~15 seconds, but I cannot change this visual for our use case.
- Data volume will continue to grow over time, and optimizing measures further is not feasible at the moment.
Questions:
- Given the size of the dataset (1.75 GB) and the complexity of the measures, is the current performance (30–50 seconds) reasonable? Is this load time generally acceptable for reports?
- Is there room for significant improvement (e.g., reducing load time to 1–5 seconds) without altering the core dataset or report structure?
- Are there capacity or memory-related adjustments I can explore to improve performance, such as better leveraging Power BI Service capacities?
- Are there any modeling or visualization best practices for handling such large datasets in live connection scenarios?
Your guidance on how to address these challenges would be invaluable. Thank you in advance for your insights!
Hi @MicrosoftMaster,
I think this is a normal loading time given your report's data volume, calculations, and design.
Is it possible to reduce the amount of stored data and the number of calculated fields? Or you could try switching to live connection mode and moving the DAX expressions to the database side.
Regards,
Xiaoxin Sheng
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Hi,
Thank you for your suggestion. Unfortunately, I’m unable to reduce the stored data amount or move the DAX expressions to the database side. As per our team's procedures, all measures must remain in the dataset, and we use a live connection for our reporting setup. This approach ensures consistency across multiple reports and datasets.
If you have any alternative recommendations that work within these constraints, I’d be happy to explore them!
1. Install DAX Studio.
2. Identify the visuals that load the slowest.
3. Use Performance Analyzer to grab the DAX queries.
4. Evaluate those queries in DAX Studio.
5. Examine the query plan and server timings.
6. Refactor the queries.
7. Go back to step 2.
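For reference, a query copied from Performance Analyzer and pasted into DAX Studio has roughly this shape (all names below are placeholders, not your actual measures). Overriding a measure in the DEFINE block lets you benchmark a refactored version against Server Timings without publishing anything:

```dax
DEFINE
    -- Local override of the slow measure, used only for this test run
    MEASURE FactCashFlow[Gross IRR Change] =
        [Gross IRR Current] - [Gross IRR Prior]
EVALUATE
SUMMARIZECOLUMNS (
    'Fund'[Fund Name],
    "Gross IRR Prior",   [Gross IRR Prior],
    "Gross IRR Current", [Gross IRR Current],
    "Gross IRR Change",  [Gross IRR Change]
)
```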
Hi @lbendlin
Thank you for the guidance. I had already used DAX Studio to analyze performance. Specifically, I identified that the measures for financial metrics such as Gross IRR, Net IRR, and PME are the main contributors to the load time. When calculating differences (e.g., Previous - Current), these metrics add significant overhead of roughly 7 seconds.
Here are some observations:
- Each of the 4 complex metrics (Gross IRR, PME, etc.) has three variants: Previous, Current, and Change/Difference, which collectively take about 38 seconds to load.
- Each individual set of calculations takes around 9 seconds.
I’ve attempted optimization within my knowledge, but I’m unsure if further improvements are possible. Since I’m relatively new to Power BI, I’d like to confirm if a total load time of 50 seconds is typical for reports of this scale and complexity or if there are additional steps I can take.
Additionally, I need to be able to tell my team whether this performance is reasonable or whether significant improvements are still achievable. Your insights on this would be greatly appreciated!
Finally, is there any official assistance available to Power BI users in situations like this to help identify and address performance issues?
Anything above 5 seconds leads to bad user experience. Continue learning about how to use DAX Studio to optimize your measures (SQLBI.com has tons of training videos on that topic). Refactor your queries so they use fewer resources.
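As one illustration of the kind of refactor I mean (placeholder table and column names; whether this applies to your IRR/PME measures depends on their actual code), replacing FILTER over a whole table with a column predicate often reduces the work the engine has to do:

```dax
-- Placeholder names: CashFlow[Type] and CashFlow[Amount] are not from your model.

-- Pattern that tends to be slow: FILTER scans the entire table row by row.
Distributions Slow =
SUMX (
    FILTER ( CashFlow, CashFlow[Type] = "Distribution" ),
    CashFlow[Amount]
)

-- Same result, but the filter is expressed on a single column,
-- which the storage engine can usually resolve in one pass.
Distributions Fast =
CALCULATE (
    SUM ( CashFlow[Amount] ),
    CashFlow[Type] = "Distribution"
)
```

DAX Studio's Server Timings will show whether a change like this actually moves the needle for your queries.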
Thank you for your feedback! I’ll definitely explore SQLBI.com.
I wanted to ask for clarification regarding backend solutions. For instance, in some applications, performance can be improved by adding resources like CPU, memory, or storage. In the case of Power BI, are performance issues typically resolved solely by query optimization, or is there scope to address them by adjusting capacity or increasing resources?
Currently, I’m using Fabric capacity on Power BI.
It depends on the connection type. For import mode data sources, your only option is to optimize the DAX code. For DirectQuery data sources, you can also look at the SQL code generated by the queries and apply optimizations at the source (indexes, statistics, etc.).
