Hello Everyone,
I am currently working on a report in DirectQuery mode, connected to a relational SQL Server database with a large amount of data. Now that the report is complete, each page takes 3 to 7 seconds to open, which I understand is due to the substantial volume of data.
I have made several optimizations: removing unnecessary columns, writing SQL queries that do the processing instead of relying on Power BI to do it, and adding indexes to the table. However, none of this has significantly improved the report's response time.
My questions are as follows:
If I switch all of the data, or only some of it, to Import mode, could that improve the report's performance?
If I set it to Import mode and select a 2-hour refresh schedule in the Power BI service, will the data in the Import tables be updated?
What recommendations do you have to enhance the response time of the report?
Regards.
Hi, if DirectQuery is important to you, I would recommend switching to an in-memory DWH. We did this at my company after realizing that SQL Server was not fast enough for this purpose. We tried several in-memory databases and decided on Exasol. The good thing was that, in the beginning, we could use Exasol as an acceleration layer on top of SQL Server. After some time we were happy with the performance and decided to replace SQL Server with Exasol completely. However, if Import mode with a regular refresh is an option for you and you do not have a very large dataset, then you probably do not need an in-memory DB.
Hi @NoDataNoParty
Optimizing performance in Power BI, especially with large datasets, is a common challenge. The points below address your questions in turn.
Switching to Import Mode: Yes, moving tables to Import mode will usually improve page load times noticeably. Imported data is compressed and held in memory by the VertiPaq engine, so visuals no longer issue a round trip to SQL Server on every interaction.
Scheduled Refresh for Import Tables: Yes, Import tables are updated on whatever schedule you configure in the Power BI service. Keep in mind that a Pro workspace allows at most 8 scheduled refreshes per day, so an every-2-hours schedule (12 refreshes) requires the dataset to be on Premium/Fabric capacity, which allows up to 48.
Recommendations to Enhance Response Time:
Data Modeling: Pay careful attention to data modeling in Power BI. Ensure that relationships between tables are properly defined and optimized. Avoid many-to-many relationships, and use calculated columns and complex measures sparingly, since both can hurt performance (columns at refresh time and in model size, measures at query time).
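For example, a many-to-many relationship can often be resolved with a bridge table so that Power BI only sees one-to-many relationships. A minimal T-SQL sketch, assuming hypothetical DimCustomer and DimAccount tables with primary keys on the referenced columns:

    -- Hypothetical bridge table resolving a many-to-many
    -- between customer and account dimensions.
    CREATE TABLE dbo.CustomerAccountBridge (
        CustomerID int NOT NULL
            REFERENCES dbo.DimCustomer (CustomerID),
        AccountID  int NOT NULL
            REFERENCES dbo.DimAccount (AccountID),
        CONSTRAINT PK_CustomerAccountBridge
            PRIMARY KEY (CustomerID, AccountID)
    );
    -- In the Power BI model, relate DimCustomer -> bridge and
    -- DimAccount -> bridge as ordinary one-to-many relationships.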
Aggregations: For large datasets, consider creating and using aggregations. Aggregations store pre-aggregated data at a higher granularity, allowing for faster query performance while still retaining detailed data when needed.
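As a rough illustration, a daily pre-aggregated table built on the SQL side might look like the sketch below; the table and column names (FactSales, OrderDate, ProductID, SalesAmount) are hypothetical. In Power BI you would then map it to the detail table via the Manage aggregations feature:

    -- Hypothetical daily aggregation over a large fact table.
    SELECT
        CAST(OrderDate AS date) AS OrderDate,
        ProductID,
        SUM(SalesAmount)        AS SalesAmount,
        COUNT_BIG(*)            AS RowCnt
    INTO dbo.SalesAgg_Daily
    FROM dbo.FactSales
    GROUP BY CAST(OrderDate AS date), ProductID;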
Partitioning: If your source data supports it, consider partitioning large tables to reduce the amount of data scanned for each query.
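On SQL Server, partitioning a fact table by date is a common approach. A sketch with hypothetical names, assuming the same FactSales table as above (if the table already has a clustered index, rebuild it with WITH (DROP_EXISTING = ON)):

    -- Hypothetical yearly partitioning of a fact table by date.
    CREATE PARTITION FUNCTION pfOrderDate (date)
        AS RANGE RIGHT FOR VALUES ('2023-01-01', '2024-01-01', '2025-01-01');

    CREATE PARTITION SCHEME psOrderDate
        AS PARTITION pfOrderDate ALL TO ([PRIMARY]);

    -- Place the clustered index on the partition scheme so queries
    -- filtered on OrderDate scan only the relevant partitions.
    CREATE CLUSTERED INDEX CIX_FactSales_OrderDate
        ON dbo.FactSales (OrderDate)
        ON psOrderDate (OrderDate);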
Indexes: Continue optimizing your SQL server by adding appropriate indexes on columns used frequently in filtering and joining operations.
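For instance, a covering index for a page that filters on date and product and displays a sales amount might look like this (hypothetical names again):

    -- Hypothetical covering index: key columns match the filters
    -- and joins; INCLUDE avoids lookups for the displayed column.
    CREATE NONCLUSTERED INDEX IX_FactSales_OrderDate_ProductID
        ON dbo.FactSales (OrderDate, ProductID)
        INCLUDE (SalesAmount);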
Summary Tables: Precompute and store summary tables that contain aggregated data. These can significantly speed up query performance.
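On SQL Server, an indexed view is one way to keep such a summary automatically in sync with the base table. A sketch with the same hypothetical names; note the indexed-view restrictions (SCHEMABINDING, two-part names, COUNT_BIG(*), and SUM over a non-nullable expression, hence the ISNULL):

    -- Hypothetical indexed view: SQL Server maintains the
    -- aggregates as rows change in dbo.FactSales.
    CREATE VIEW dbo.vSalesByProduct
    WITH SCHEMABINDING
    AS
    SELECT
        ProductID,
        SUM(ISNULL(SalesAmount, 0)) AS TotalSales,
        COUNT_BIG(*)                AS RowCnt
    FROM dbo.FactSales
    GROUP BY ProductID;
    GO
    CREATE UNIQUE CLUSTERED INDEX IX_vSalesByProduct
        ON dbo.vSalesByProduct (ProductID);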
Query Folding: Make sure the queries Power BI sends to the source can take advantage of query folding, so that as much of the computation as possible is pushed back to the database.
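One practical pattern: pasting a native SQL statement into the SQL Server connector generally blocks folding of any later Power Query steps, whereas connecting to a view lets those steps fold. A sketch that wraps hand-written query logic in a view, with hypothetical names:

    -- Hypothetical view encapsulating the hand-written query logic.
    -- Point Power BI at the view instead of a pasted SQL statement
    -- so that subsequent Power Query steps can fold to the server.
    CREATE VIEW dbo.vReportSales
    AS
    SELECT
        s.OrderDate,
        s.ProductID,
        p.ProductName,
        s.SalesAmount
    FROM dbo.FactSales  AS s
    JOIN dbo.DimProduct AS p
        ON p.ProductID = s.ProductID;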
Performance Analyzer: Use the Performance Analyzer tool in Power BI Desktop to identify bottlenecks and areas that need optimization in your report.
Limit Visuals: Avoid using too many visuals on a single page. Limit the number of visuals and rows displayed in each visual to keep performance manageable.
Scheduled Refresh: Set an appropriate refresh schedule based on the frequency of data updates. Frequent data refreshes can ensure that your reports are always up-to-date.
Use DirectQuery for Real-Time Data: If your data needs to be real-time and Import mode doesn't meet your requirements, consider using DirectQuery, but be prepared for potential performance trade-offs.
Remember that the best approach may be a combination of techniques, for example a composite model that uses Import mode for frequently accessed data and DirectQuery for the rest.