amilpbi
Frequent Visitor

Handling Big Data in Power BI

I have an on-premises SQL Server database containing 35+ tables, one per State, that all need to be visualised at once.
I am using UNION ALL to combine the tables, and all 45 million records have now been appended into one table.
I am using DirectQuery because Import mode failed at around 15-16 million records.
Even simple visuals take a long time to slice or filter.
Is there any way to solve this issue?

1 ACCEPTED SOLUTION
WishAskedSooner
Responsive Resident

Power BI is capable of handling billions of rows. So, if your data model is choking on 15-16 million rows, you need to take a close look at the data model architecture. Joining 35 tables into a single table is definitely NOT best practice. As a rule, fact tables should be normalized while dimension tables should be de-normalized. Star schemas are your friend.

 

All this takes work of course. That is why Data Modeling is so important. Get your Data Model right, and you will spend very little time worrying about anything other than building out your reports.
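To make the star-schema idea concrete, here is a minimal T-SQL sketch. All names (dim_state, dim_date, fact_measures, amount) are hypothetical stand-ins for whatever the real State tables contain; the point is the shape, not the specifics:

```sql
-- Illustrative sketch only: table and column names are hypothetical.
-- Small, descriptive dimension tables...
CREATE TABLE dim_state (
    state_key  INT IDENTITY(1,1) PRIMARY KEY,
    state_name NVARCHAR(100) NOT NULL
);

CREATE TABLE dim_date (
    date_key INT PRIMARY KEY,   -- e.g. 20250801
    [date]   DATE NOT NULL,
    [month]  TINYINT NOT NULL,
    [year]   SMALLINT NOT NULL
);

-- ...and one narrow fact table holding only surrogate keys and measures,
-- instead of one wide 45M-row denormalized table.
CREATE TABLE fact_measures (
    state_key INT NOT NULL REFERENCES dim_state(state_key),
    date_key  INT NOT NULL REFERENCES dim_date(date_key),
    amount    DECIMAL(18,2) NOT NULL
);
```

In Power BI this gives you one large fact table filtered through small dimension tables, which is the pattern both the engine and DAX are optimized for.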


5 REPLIES
suparnababu8
Super User

Hi @amilpbi 

 

I agree with @WishAskedSooner and @Kedar_Pande.

Handling large datasets in SQL Server and optimizing DirectQuery performance can be challenging. Here are some suggestions to improve the performance of your visualizations:

 

1. Optimize Queries:
- Indexing: Ensure that your tables are properly indexed, especially on columns used in joins and filters. This can significantly speed up query performance.

2. Data Aggregation:
- Pre-Aggregate Data: Instead of querying raw data, create summary tables that aggregate data at a higher level. This reduces the amount of data processed in real time.
- Partitioning: Partition your large table into smaller, more manageable pieces. This can improve query performance by reducing the amount of data scanned.

3. DirectQuery Optimization:
- Optimize Ribbon in Power BI: Use the Optimize ribbon to control when visuals refresh. This can help manage performance by pausing and refreshing visuals as needed.

4. Alternative Approaches:
- Hybrid Approach: Combine DirectQuery with Import mode. Use Import for smaller, frequently accessed data and DirectQuery for larger, less frequently accessed data.
- Data Warehousing: Consider using a data warehouse solution that is optimized for large-scale data processing and can handle complex queries more efficiently.
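A minimal T-SQL sketch of points 1 and 2, assuming a hypothetical appended table fact_measures with a numeric amount column and an integer date_key like 20250801 (your real names and grain will differ):

```sql
-- Sketch only: fact_measures, amount, and date_key are assumed names.
-- For DirectQuery scans over tens of millions of rows, a clustered
-- columnstore index is often the single biggest win.
CREATE CLUSTERED COLUMNSTORE INDEX cci_fact_measures
    ON fact_measures;

-- Pre-aggregate to a monthly summary table, which is small enough
-- for Power BI to import instead of querying the raw rows.
SELECT state_key,
       date_key / 100 AS year_month,   -- 20250815 -> 202508
       SUM(amount)    AS total_amount,
       COUNT(*)       AS row_count
INTO   fact_measures_monthly
FROM   fact_measures
GROUP  BY state_key, date_key / 100;
```

The summary table then serves most visuals, and only drill-through detail needs to hit the big table.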

 

If you still need additional info, please go through the URLs below:

https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-optimize-ribbon-scenarios

https://learn.microsoft.com/en-us/troubleshoot/sql/database-engine/performance/troubleshoot-slow-run... 

 

Thanks!


I see you are just a troll

It is the client's data; each table holds the data for one State in a country. Since the columns are the same, they append easily. I don't think the client is going to change the structure of his database just to fit my requirements.
I'm a beginner to Power BI, so everything helps.

Kedar_Pande
Super User

@amilpbi 

You can try:

  1. Create summarized tables in SQL Server for common views (e.g., aggregate at monthly or regional level), reducing data rows sent to Power BI.
  2. Use pre-defined SQL views or stored procedures in SQL Server to only bring necessary columns and filtered rows.
  3. Use Dual Mode (Import + DirectQuery) for frequently-used tables, importing only high-level aggregated data and keeping detail tables in DirectQuery.
  4. Limit visuals on each page, avoid complex visuals (e.g., heavy use of slicers and filters), and use filters to pre-limit data.
  5. If you can switch some tables to Import, use Incremental Refresh to manage dataset size.

These techniques reduce query time and improve interactivity.
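Point 2 above (pre-defined views) might look like the following T-SQL sketch; dbo.fact_measures, its columns, and the date cutoff are all hypothetical placeholders for your actual schema:

```sql
-- Sketch only: names and the cutoff date are illustrative.
-- A view that projects just the columns the report needs, pre-filtered
-- to the reporting window, keeps DirectQuery from scanning wide rows.
CREATE VIEW dbo.vw_report_measures AS
SELECT state_key,
       date_key,
       amount
FROM   dbo.fact_measures
WHERE  date_key >= 20230101;   -- pre-limit to the period the report shows
```

Pointing Power BI at the view instead of the base table means every generated query inherits the column projection and the row filter for free.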

 

💌 If this helped, a Kudos 👍 or Solution mark would be great! 🎉
Cheers,
Kedar
Connect on LinkedIn
