maziiw
Frequent Visitor

Improving performance for large data sets

I have a dashboard which is based on a large data set (approx 13 million rows).

The data is a daily view of transactions (one row per transaction per user per day, going back to 2023), and I cannot aggregate it further.

Data is then aggregated to weekly / monthly views.

 

Performance of this page is very poor. Loading the page takes a long time, as does changing any filter or aggregation level.

What can you recommend to improve the page load speed? I have already made the table as slim as possible, with fewer columns, and restricted the data to 2023 onwards.


6 REPLIES
v-kpoloju-msft
Community Support

Hi @maziiw,

Thank you for posting your query in the Microsoft Fabric Community Forum. Also, thanks to @Aburar_123 and @Sandip_Palit for their inputs on this thread. Below are some official Microsoft links that may help resolve your issue more quickly.

The main performance issue here is due to Power BI having to scan and aggregate a very large volume of detailed records (daily-level transactions) on the fly every time a visual is rendered or a filter is applied. While you have already minimized columns and filtered the dataset to 2023, the size and granularity of the fact table are still putting a significant load on the model engine, especially when summarizing to weekly or monthly views.

Use Pre-Aggregated Tables: Create a second version of your dataset that is aggregated to the weekly or monthly level using Power Query or in your source system. You can then configure this as an aggregation table in Power BI to improve speed:
Optimization guide for Power BI - Power BI | Microsoft Learn

Simplify DAX Logic: Heavy measures using FILTER, CALCULATE, SUMX, or ALL on large datasets can slow things down. Where possible, move logic to Power Query or use lighter DAX alternatives. Instead of showing raw data on the main page, show summarized views (monthly/weekly totals), and let users drill through to daily or transactional detail only when needed.

Optimize the Data Model: Follow a star schema structure. Remove any bi-directional relationships unless necessary. Disable Auto Date/Time to reduce hidden overhead. Use this built-in tool (View > Performance Analyzer) to check which visuals or queries are taking the longest time. Use Performance Analyzer to examine report element performance in Power BI Desktop - Power BI | Micr...

Consider Incremental Refresh (if you're using Premium or PPU): This allows Power BI to only load new data rather than processing all 13M+ rows every refresh.
Incremental refresh for semantic models in Power BI - Power BI | Microsoft Learn

Also, please refer to the documentation links below for a better understanding:
Understand star schema and the importance for Power BI - Power BI | Microsoft Learn
Troubleshoot report performance in Power BI - Power BI | Microsoft Learn

Hope this helps clarify things. Let me know what you find after giving these steps a try; happy to help you investigate this further.

Thank you for using the Microsoft Community Forum.

Hi @maziiw,

Just checking in to see if the issue has been resolved on your end. If the earlier suggestions helped, that's great to hear! And if you're still facing challenges, feel free to share more details; happy to assist further.

Thank you.

Hi @maziiw,

Hope you had a chance to try out the solution shared earlier. Let us know if anything needs further clarification or if there's an update from your side; we're always here to help.

Thank you.

Hi @maziiw,

Just wanted to follow up one last time. If the shared guidance worked for you, that's wonderful; hopefully it also helps others looking for similar answers. If there's anything else you'd like to explore or clarify, don't hesitate to reach out.

Thank you.

Aburar_123
Responsive Resident

Hi @maziiw ,

For slicers, don't use columns directly from the base table. Instead, write a separate query to maintain an individual dimension table for each slicer attribute and create a relationship to the base table as a star schema.

Example:

The Category slicer should come from a Category dimension table, built as below:

Dim_Category: SELECT DISTINCT Category_id, Category FROM <base_table>
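
If you prefer to build the dimension inside the model rather than at the source, roughly the same query can be written as a DAX calculated table. This is only a sketch, reusing the placeholder names from the example above:

Dim_Category =
DISTINCT (
    SELECTCOLUMNS (
        'base_table',                               -- placeholder for your base/fact table
        "Category_id", 'base_table'[Category_id],
        "Category", 'base_table'[Category]
    )
)

Either way, relate Dim_Category[Category_id] to the matching key in the base table and point the slicer at Dim_Category[Category].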

 

A star schema will improve the performance significantly.

 

Sandip_Palit
Resolver II

Here are the most effective strategies to significantly improve your dashboard's speed.

 

1. Implement a Star Schema Data Model
This is the single most important optimization you can make. Instead of having one large, wide table, you should structure your data into a star schema.

What it is: A central Fact Table (containing your numeric transaction data like Sales Amount, Quantity, and ID keys) surrounded by smaller Dimension Tables (containing descriptive attributes like User Details, Product Info, Calendar Dates, etc.).

Why it's faster:

Reduces Memory: The Power BI engine is highly optimized for this structure. It stores the repetitive text values from dimensions only once, dramatically reducing the file size and memory usage.

Faster Relationships: Queries become much more efficient as they operate on smaller dimension tables and leaner fact tables.

Action:

Identify repeating attribute columns in your main table (e.g., User Name, User Department, Product Category).

Create new tables for each distinct entity (e.g., a 'Users' table, a 'Products' table). Use Power Query to remove duplicates.

Replace the text columns in your fact table with integer ID keys that relate back to these new dimension tables.

Create a dedicated Calendar Table for all your date-based calculations instead of using Power BI's auto date/time.
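
For the calendar table in the last step, a minimal DAX calculated table is often enough. This is only a sketch; the date range and the extra columns are assumptions you should adjust to your data:

Calendar =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2023, 1, 1 ), DATE ( 2025, 12, 31 ) ),    -- assumed range: 2023 to date
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "MMM yyyy" ),
    "Week Number", WEEKNUM ( [Date], 2 ),                        -- weeks starting on Monday
    "Start of Week", [Date] - WEEKDAY ( [Date], 2 ) + 1
)

Mark it as a date table and relate Calendar[Date] to the transaction date in your fact table; all weekly and monthly grouping should then come from this table rather than from the fact table's own date column.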

 

2. Use Aggregation Tables
This is the most powerful technique for your specific scenario (aggregating daily data to weekly/monthly views). Aggregations allow Power BI to use a much smaller, pre-summarized table for high-level visuals and only query the massive 13-million-row detail table when a user drills down.

Action:

In Power Query, create a new query that references your main transaction table.

Use the Group By function to aggregate your data to the level of your visuals (e.g., group by Month, User ID, Product ID and sum the sales). This creates your aggregation table.

Load both the original detail table and this new aggregation table into your model.

In the Model view, right-click the aggregation table and select Manage aggregations. Map the summarized columns in your aggregation table to the corresponding columns in your detail table.

Power BI will now automatically use the small, fast aggregation table for monthly/weekly charts and seamlessly switch to the detail table only when necessary.
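
If the built-in aggregations feature is not available in your setup (it has storage-mode requirements, so check the documentation for your scenario), a manual alternative to the automatic switching is a measure that picks the table based on the level of detail in the filter context. A rough sketch with hypothetical names: 'Sales' is the 13-million-row detail table, 'Sales Agg' is the grouped table from the steps above, and both relate to a 'Calendar' dimension:

Total Sales :=
IF (
    ISFILTERED ( 'Calendar'[Date] ),        -- a day-level filter is active: read the detail table
    SUM ( 'Sales'[Sales Amount] ),
    SUM ( 'Sales Agg'[Sales Amount] )       -- week/month level: read the small pre-aggregated table
)

With a pattern like this, visuals at weekly or monthly grain never touch the detail table at all.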

 

3. Write More Efficient DAX
Inefficient DAX measures can be a major bottleneck.

Use Variables (VAR): Always use variables to store calculations that are used multiple times within a single measure. This ensures the calculation is performed only once, not repeatedly.

Avoid Filtering Whole Tables: Instead of writing FILTER('YourTable', ...) use FILTER(ALL('YourTable'[Column]), ...) or KEEPFILTERS(). Filtering specific columns is much faster than filtering entire tables.

Be Careful with Iterators (X functions): Functions like SUMX and FILTER can be slow if they have to iterate row by row over all 13 million rows. Where possible, express the logic as a simple aggregation modified by CALCULATE with column filters, which the storage engine can evaluate more efficiently.
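
As a concrete illustration of the first two points, here are two hedged sketches using hypothetical column names ('Sales'[Sales Amount], 'Sales'[Date]):

-- A variable is evaluated once and reused, instead of repeating the expression.
Avg Daily Sales :=
VAR TotalSales = SUM ( 'Sales'[Sales Amount] )
VAR DayCount   = DISTINCTCOUNT ( 'Sales'[Date] )
RETURN
    IF ( DayCount > 0, DIVIDE ( TotalSales, DayCount ) )

-- Filter a single column rather than the whole 13-million-row table.
Large Sales :=
CALCULATE (
    SUM ( 'Sales'[Sales Amount] ),
    KEEPFILTERS ( 'Sales'[Sales Amount] > 100 )    -- threshold is an arbitrary example value
)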

 

4. Optimize the Report Page
Finally, analyze what's happening on the report page itself.

Use the Performance Analyzer: Go to the View tab and open the Performance Analyzer. Click "Start recording" and then refresh your visuals. It will show you exactly how many milliseconds each visual takes to load and which part (DAX Query, Visual Display) is the slowest. This helps you pinpoint the exact bottleneck.

Reduce Visuals: Every visual on a page sends at least one query. The fewer visuals you have, the faster the page will load.

Limit High-Cardinality Fields: Avoid using fields with thousands of unique values (like Transaction ID) in slicers or table visuals. This forces Power BI to load and render a huge amount of data.

Edit Interactions: Reduce unnecessary cross-filtering. By default, every visual filters every other visual. Go to Format > Edit interactions to turn off filtering for visuals that don't need to be linked, preventing a cascade of queries every time a user clicks.

 

If this explanation and solution resolve your issue, please like and accept the solution.
