<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures in Developer</title>
    <link>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4354012#M58920</link>
    <description>&lt;OL&gt;&lt;LI&gt;Install DAX Studio.&lt;/LI&gt;&lt;LI&gt;Identify the visuals that load the slowest.&lt;/LI&gt;&lt;LI&gt;Use Performance Analyzer to grab the DAX queries.&lt;/LI&gt;&lt;LI&gt;Evaluate those queries in DAX Studio.&lt;/LI&gt;&lt;LI&gt;Examine the query plan and server timings.&lt;/LI&gt;&lt;LI&gt;Refactor the queries.&lt;/LI&gt;&lt;LI&gt;Go back to step 2.&lt;/LI&gt;&lt;/OL&gt;</description>
    <pubDate>Tue, 07 Jan 2025 21:14:07 GMT</pubDate>
    <dc:creator>lbendlin</dc:creator>
    <dc:date>2025-01-07T21:14:07Z</dc:date>
    <item>
      <title>Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures</title>
      <link>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4353688#M58915</link>
      <description>&lt;P&gt;I’m seeking advice on improving the performance of my Power BI reports, which currently take &lt;STRONG&gt;30 to 50 seconds&lt;/STRONG&gt; to load or refresh. Here's the detailed context of my setup:&lt;/P&gt;&lt;HR /&gt;&lt;H4&gt;&lt;STRONG&gt;Report Details&lt;/STRONG&gt;:&lt;/H4&gt;&lt;UL&gt;&lt;LI&gt;The report contains &lt;STRONG&gt;40 columns&lt;/STRONG&gt;:&lt;UL&gt;&lt;LI&gt;&lt;STRONG&gt;12 metrics&lt;/STRONG&gt;, each with &lt;STRONG&gt;4 associated columns&lt;/STRONG&gt; (e.g., Prior Value, Current Value, Change in $, Change in %).&lt;/LI&gt;&lt;LI&gt;Example: For the metric "Committed," we calculate the prior value (min of selected period), current value (max of selected period), and their difference in both $ and %.&lt;/LI&gt;&lt;LI&gt;Most calculations are straightforward, except for 4 metrics (e.g., &lt;STRONG&gt;Gross IRR&lt;/STRONG&gt;, &lt;STRONG&gt;PME&lt;/STRONG&gt;, &lt;STRONG&gt;Net IRR&lt;/STRONG&gt;), which involve &lt;STRONG&gt;40–50 lines of DAX code&lt;/STRONG&gt; with complex logic.&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;It uses a &lt;STRONG&gt;Matrix visual&lt;/STRONG&gt; with:&lt;UL&gt;&lt;LI&gt;&lt;STRONG&gt;4 row-level groupings&lt;/STRONG&gt;.&lt;/LI&gt;&lt;LI&gt;&lt;STRONG&gt;60 rows of data&lt;/STRONG&gt; displayed in the Matrix.&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;HR /&gt;&lt;H4&gt;&lt;STRONG&gt;Dataset Details&lt;/STRONG&gt;:&lt;/H4&gt;&lt;UL&gt;&lt;LI&gt;The report connects live to a shared &lt;STRONG&gt;semantic model hosted on the Power BI Service&lt;/STRONG&gt;:&lt;UL&gt;&lt;LI&gt;Dataset size: &lt;STRONG&gt;1.75 GB&lt;/STRONG&gt;.&lt;/LI&gt;&lt;LI&gt;It includes &lt;STRONG&gt;37 tables&lt;/STRONG&gt; and &lt;STRONG&gt;140 measures&lt;/STRONG&gt;.&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;The data model is common across many reports, with multiple reports taking &lt;STRONG&gt;30–50 seconds&lt;/STRONG&gt; to load.&lt;/LI&gt;&lt;/UL&gt;&lt;HR /&gt;&lt;H4&gt;&lt;STRONG&gt;Key 
Observations&lt;/STRONG&gt;:&lt;/H4&gt;&lt;OL&gt;&lt;LI&gt;When I &lt;STRONG&gt;connect the report to a local dataset&lt;/STRONG&gt; (imported version), performance improves slightly.&lt;/LI&gt;&lt;LI&gt;Converting the &lt;STRONG&gt;Matrix visual to a Table visual&lt;/STRONG&gt; reduces the load time by ~15 seconds, but I cannot change this visual for our use case.&lt;/LI&gt;&lt;LI&gt;Data volume will continue to grow over time, and optimizing measures further is not feasible at the moment.&lt;/LI&gt;&lt;/OL&gt;&lt;HR /&gt;&lt;H3&gt;&lt;STRONG&gt;Questions&lt;/STRONG&gt;:&lt;/H3&gt;&lt;OL&gt;&lt;LI&gt;Given the size of the dataset (1.75 GB) and the complexity of the measures, is the current performance (30–50 seconds) reasonable? Is this load time generally acceptable for reports?&lt;/LI&gt;&lt;LI&gt;Is there room for significant improvement (e.g., reducing load time to &lt;STRONG&gt;1–5 seconds&lt;/STRONG&gt;) without altering the core dataset or report structure?&lt;/LI&gt;&lt;LI&gt;Are there &lt;STRONG&gt;capacity or memory-related adjustments&lt;/STRONG&gt; I can explore to improve performance, such as better leveraging Power BI Service capacities?&lt;/LI&gt;&lt;LI&gt;Are there any &lt;STRONG&gt;modeling or visualization best practices&lt;/STRONG&gt; for handling such large datasets in live connection scenarios?&lt;/LI&gt;&lt;/OL&gt;&lt;HR /&gt;&lt;P&gt;Your guidance on how to address these challenges would be invaluable. Thank you in advance for your insights!&lt;/P&gt;</description>
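The Prior/Current/Change pattern described in the question could be sketched in DAX roughly as follows. This is only an illustrative sketch: the table name ('Date'), the base measure ([Committed]), and the measure names are all hypothetical placeholders, not taken from the poster's actual model.

```dax
-- Hypothetical measure family for one metric ("Committed").
-- 'Date'[Date] and [Committed] are assumed names, not from the thread.
Committed Prior =
    CALCULATE ( [Committed], FIRSTDATE ( 'Date'[Date] ) )  -- value at the min of the selected period

Committed Current =
    CALCULATE ( [Committed], LASTDATE ( 'Date'[Date] ) )   -- value at the max of the selected period

Committed Change $ =
    [Committed Current] - [Committed Prior]

Committed Change % =
    DIVIDE ( [Committed Change $], [Committed Prior] )     -- DIVIDE handles division by zero safely
```

With 12 metrics, each such family contributes 4 of the 40 matrix columns the question describes.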
      <pubDate>Tue, 07 Jan 2025 15:52:11 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4353688#M58915</guid>
      <dc:creator>MicrosoftMaster</dc:creator>
      <dc:date>2025-01-07T15:52:11Z</dc:date>
    </item>
    <item>
      <title>Re: Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures</title>
      <link>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4354012#M58920</link>
      <description>&lt;OL&gt;&lt;LI&gt;Install DAX Studio.&lt;/LI&gt;&lt;LI&gt;Identify the visuals that load the slowest.&lt;/LI&gt;&lt;LI&gt;Use Performance Analyzer to grab the DAX queries.&lt;/LI&gt;&lt;LI&gt;Evaluate those queries in DAX Studio.&lt;/LI&gt;&lt;LI&gt;Examine the query plan and server timings.&lt;/LI&gt;&lt;LI&gt;Refactor the queries.&lt;/LI&gt;&lt;LI&gt;Go back to step 2.&lt;/LI&gt;&lt;/OL&gt;</description>
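As a rough illustration of the capture-and-replay step: a query copied from Performance Analyzer can be pasted into DAX Studio and run with the Server Timings and Query Plan traces enabled. A minimal probe for a single suspect measure might look like this (the measure name is a placeholder, not from the thread):

```dax
-- Minimal DAX Studio probe; [Gross IRR] is a hypothetical measure name.
-- Enable Server Timings before running to see Formula Engine vs
-- Storage Engine time for this one measure in isolation.
EVALUATE
    ROW ( "Gross IRR", [Gross IRR] )
```

Probing measures one at a time like this helps attribute the visual's total duration to individual calculations before refactoring.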
      <pubDate>Tue, 07 Jan 2025 21:14:07 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4354012#M58920</guid>
      <dc:creator>lbendlin</dc:creator>
      <dc:date>2025-01-07T21:14:07Z</dc:date>
    </item>
    <item>
      <title>Re: Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures</title>
      <link>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4354647#M58925</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/519953"&gt;@MicrosoftMaster&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Given your report's data volume, calculations, and design, I think this is a normal loading time.&lt;BR /&gt;Is it possible to reduce the amount of stored data or the number of calculated fields? Alternatively, you could try switching to a live connection mode and pushing the DAX logic down to the database side.&lt;/P&gt;
&lt;P&gt;Regards,&lt;/P&gt;
&lt;P&gt;Xiaoxin Sheng&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 07:22:17 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4354647#M58925</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2025-01-08T07:22:17Z</dc:date>
    </item>
    <item>
      <title>Re: Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures</title>
      <link>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355515#M58933</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;Thank you for your suggestion. Unfortunately, I’m unable to reduce the stored data amount or move the DAX expressions to the database side. As per our team's procedures, all measures must remain in the dataset, and we use a live connection for our reporting setup. This approach ensures consistency across multiple reports and datasets.&lt;/P&gt;&lt;P&gt;If you have any alternative recommendations that work within these constraints, I’d be happy to explore them!&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 15:52:33 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355515#M58933</guid>
      <dc:creator>MicrosoftMaster</dc:creator>
      <dc:date>2025-01-08T15:52:33Z</dc:date>
    </item>
    <item>
      <title>Re: Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures</title>
      <link>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355526#M58934</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/100342"&gt;@lbendlin&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you for the guidance. I had already used DAX Studio to analyze performance. Specifically, I identified that the measures for financial metrics such as &lt;STRONG&gt;Gross IRR&lt;/STRONG&gt;, &lt;STRONG&gt;Net IRR&lt;/STRONG&gt;, and &lt;STRONG&gt;PME&lt;/STRONG&gt; are the main contributors to the load time. &lt;STRONG&gt;When calculating differences (e.g., Previous - Current), these metrics add significant overhead, roughly 7 seconds.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Here are some observations:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Each of the 4 complex metrics (Gross IRR, PME, etc.) has three variants: Previous, Current, and Change/Difference, which collectively take about &lt;STRONG&gt;38 seconds&lt;/STRONG&gt; to load.&lt;/LI&gt;&lt;LI&gt;Each individual set of calculations takes around &lt;STRONG&gt;9 seconds&lt;/STRONG&gt;.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;I’ve attempted optimization within my knowledge, but I’m unsure whether further improvements are possible. Since I’m relatively new to Power BI, I’d like to confirm whether a total load time of 50 seconds is typical for reports of this scale and complexity, or whether there are additional steps I can take.&lt;/P&gt;&lt;P&gt;Additionally, I need to tell my team whether this performance is reasonable or whether significant improvements are still achievable. Your insights on this would be greatly appreciated!&lt;/P&gt;&lt;P&gt;Finally, is there any official assistance available for Power BI users in such situations to help identify and address performance issues?&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 15:58:41 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355526#M58934</guid>
      <dc:creator>MicrosoftMaster</dc:creator>
      <dc:date>2025-01-08T15:58:41Z</dc:date>
    </item>
    <item>
      <title>Re: Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures</title>
      <link>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355616#M58936</link>
      <description>&lt;P&gt;Anything above 5 seconds leads to a bad user experience. Continue learning how to use DAX Studio to optimize your measures (SQLBI.com has plenty of training videos on that topic). Refactor your queries so they use fewer resources.&lt;/P&gt;</description>
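One refactoring pattern often taught in the SQLBI material is to evaluate each expensive sub-expression exactly once in a variable instead of referencing it repeatedly. The sketch below uses hypothetical measure names mirroring the Previous/Current/Change variants discussed in the thread; it is an illustration of the technique, not the poster's actual code.

```dax
-- Before: [Net IRR Previous] appears twice and may be evaluated twice per cell.
-- (Measure names are placeholders, not from the thread's model.)
Net IRR Change % =
    DIVIDE ( [Net IRR Current] - [Net IRR Previous], [Net IRR Previous] )

-- After: each expensive variant is evaluated once, then reused.
Net IRR Change % (refactored) =
    VAR Previous = [Net IRR Previous]
    VAR Delta = [Net IRR Current] - Previous
    RETURN DIVIDE ( Delta, Previous )
```

Whether this helps in a given model depends on how well the engine already caches the repeated measure reference, so the before/after should be compared with Server Timings in DAX Studio.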
      <pubDate>Wed, 08 Jan 2025 16:55:22 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355616#M58936</guid>
      <dc:creator>lbendlin</dc:creator>
      <dc:date>2025-01-08T16:55:22Z</dc:date>
    </item>
    <item>
      <title>Re: Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures</title>
      <link>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355832#M58940</link>
      <description>&lt;P&gt;Thank you for your feedback! I’ll definitely explore SQLBI.com.&lt;/P&gt;&lt;P&gt;I wanted to ask for clarification regarding backend solutions. For instance, in some applications, performance can be improved by adding resources like CPU, memory, or storage. In the case of Power BI, are performance issues typically resolved solely by query optimization, or is there scope to address them by adjusting capacity or increasing resources?&lt;/P&gt;&lt;P&gt;Currently, I’m using &lt;STRONG&gt;Fabric capacity&lt;/STRONG&gt; on Power BI.&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 20:51:13 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355832#M58940</guid>
      <dc:creator>MicrosoftMaster</dc:creator>
      <dc:date>2025-01-08T20:51:13Z</dc:date>
    </item>
    <item>
      <title>Re: Optimizing Performance for Power BI Reports with Large Datasets and Complex Measures</title>
      <link>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355834#M58941</link>
      <description>&lt;P&gt;It depends on the connection type. For import-mode data sources, your only option is to optimize the DAX code. For DirectQuery data sources, you can also look at the SQL code generated by the queries and apply optimizations at the source (indexes, statistics, etc.).&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 20:57:16 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Developer/Optimizing-Performance-for-Power-BI-Reports-with-Large-Datasets/m-p/4355834#M58941</guid>
      <dc:creator>lbendlin</dc:creator>
      <dc:date>2025-01-08T20:57:16Z</dc:date>
    </item>
  </channel>
</rss>

