
OneWithQuestion
Post Prodigy

How to optimize importing large data from SQL Server tables into Power BI / SSAS Tabular?

I have several data tables whose reads take a rather long time for the data to transfer.

 

The query that runs against the source table is not complex (effectively SELECT (bunch of columns) FROM TableName).

 

The slowdown appears to be that a large amount of data is being moved, which takes time to pull off disk.

 

Has anyone worked with this issue, where your Tabular refresh time is heavily bottlenecked by how fast the SQL data source can send you the data?

 

Does setting up a DW in SQL Server using in-memory tables help with this? Is there any way to compress the data being transferred, or to pre-optimize the data before it is consumed by SSAS Tabular or Power BI?

 

 

1 REPLY
TeigeGao
Solution Sage

Hi OneWithQuestion,

The process of importing data from SQL Server to Power BI can be divided into three parts: first, SQL Server executes the query to produce the result; then the result set is transferred to Power BI Desktop; finally, Power BI Desktop generates the data model.

>> The slowdown appears to be that a large amount of data is being moved, which takes time to pull off disk.

Actually, only the first part involves moving data from disk to memory. We first need to check how long SQL Server takes to execute the query. We can use SQL Server Profiler to monitor the query duration, or run the query in SSMS and note the elapsed time: http://www.sqlserver.info/management-studio/show-query-execution-time/
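To isolate the query's own cost in SSMS, a minimal T-SQL sketch like the one below can be used. The column names are placeholders standing in for whatever the Tabular partition actually selects:

```sql
-- Report CPU/elapsed time and logical reads for the query alone,
-- separating SQL Server's cost from the transfer and model-build steps.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

SELECT Col1, Col2, Col3   -- placeholder: use the same column list as the refresh
FROM dbo.TableName;

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```

The elapsed-time figure in the Messages tab tells you whether the bottleneck is SQL Server reading from disk or the downstream transfer and compression.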

After SQL Server produces the result, the result set is held in memory and then transferred to Power BI; this part usually does not take much time. When Power BI receives the data, it generates the data model and compresses the data, and this step can take a long time. Please refer to the following blog to get better performance: https://www.sqlbi.com/articles/data-import-best-practices-in-power-bi/
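One common pre-optimization is to put a narrowing view between the source table and the import, so Power BI/SSAS only ever sees the columns it needs, cast to the smallest workable types. This is an illustrative sketch (the view name, columns, and types are hypothetical, not from the original post):

```sql
-- Hypothetical import-facing view: fewer columns and smaller types
-- mean less data on the wire and better VertiPaq compression.
CREATE VIEW dbo.vw_TableName_ForImport
AS
SELECT
    CAST(Id AS int)              AS Id,        -- avoid bigint when values fit in int
    CAST(Amount AS decimal(9,2)) AS Amount,    -- avoid float when fixed precision suffices
    CAST(OrderDate AS date)      AS OrderDate  -- drop time-of-day if it is never used
FROM dbo.TableName;
```

Pointing the Tabular partition or Power BI query at the view also keeps the import definition in one place on the SQL side.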

>> Does setting up a DW in SQL Server using in-memory tables help with this? Is there any way to compress the data being transferred, or to pre-optimize the data before it is consumed by SSAS Tabular or Power BI?

Setting up an in-memory (memory-optimized) table in SQL Server only improves query performance inside SQL Server. If SQL Server takes a long time to produce the result, it is worth considering; if the query itself is fast, there is no need. To reduce the amount of data being transferred, use the smallest suitable data type for each column, and split the data into fact and dimension tables if all the information is currently stored in a single wide table.
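If the query step does turn out to be the bottleneck, a memory-optimized table can be created as sketched below. The table and column names are illustrative, and the database must already have a MEMORY_OPTIMIZED_DATA filegroup:

```sql
-- Memory-optimized copy of the source table (SQL Server In-Memory OLTP).
-- Only helps when SQL Server's own query execution is slow, not the transfer.
CREATE TABLE dbo.TableName_InMemory
(
    Id        int           NOT NULL PRIMARY KEY NONCLUSTERED,
    Col1      varchar(50)   NULL,   -- placeholder columns
    Amount    decimal(9,2)  NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```

DURABILITY = SCHEMA_AND_DATA keeps the data recoverable after a restart; SCHEMA_ONLY is faster still but loses the rows on restart, which is rarely acceptable for a DW source.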

Best Regards,

Teige
