abehrmann
Helper II

SQL Server performance issue: will dataflows help?

I have 5 reports using the Import method in production.

 

These reports are set to refresh hourly and connect to the same view in my Azure SQL database, which returns about 35k rows.

 

The query refresh times are horrible, roughly 30 minutes.

 

Would using a dataflow, with all reports using it as the single source, improve the refresh time, since everything would be connecting to one source?

 

I am currently using DirectQuery for many reports, but these few need to remain Import because the datetime column must be split to allow the use of a date table.
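For reference, the split is done in Power Query before load; a minimal sketch, with placeholder server, view, and column names (the real names will differ):

let
    // Connect to the Azure SQL view (all names here are placeholders)
    Source = Sql.Database("myserver.database.windows.net", "MyDatabase"),
    SalesView = Source{[Schema = "dbo", Item = "vw_Sales"]}[Data],

    // Split the datetime into a pure date column (to relate to the date table)
    // and a time column, then drop the original high-cardinality datetime
    AddDate = Table.AddColumn(SalesView, "OrderDate", each Date.From([OrderDateTime]), type date),
    AddTime = Table.AddColumn(AddDate, "OrderTime", each Time.From([OrderDateTime]), type time),
    Result = Table.RemoveColumns(AddTime, {"OrderDateTime"})
in
    Result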

 

 

 

1 ACCEPTED SOLUTION
v-xicai
Community Support

Hi @abehrmann ,

 

Using a dataflow may not improve the refresh time. When a refresh is slow or crashes, it can be due to several reasons:

  • Insufficient CPU (refresh can be very CPU-intensive).
  • Insufficient memory, resulting in the refresh pausing (which requires the refresh to start over when conditions allow it to resume).
  • Non-capacity reasons, including data source responsiveness, network latency, invalid permissions, or gateway throughput. You may increase the capacity assigned to the workspace to increase model refresh parallelism.
  • Data volume - a good reason to configure incremental refresh. Incremental refresh can significantly reduce refresh duration, especially for large model tables; a minimal sketch follows this list. See more: Incremental refresh in Power BI Premium.
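To illustrate the incremental refresh point above: the pattern is to filter the source query on two datetime parameters, RangeStart and RangeEnd, which Power BI substitutes when it creates and refreshes partitions. A minimal Power Query (M) sketch, assuming a dbo.vw_Sales view with an OrderDateTime column (all names are placeholders):

let
    // Placeholder Azure SQL source; RangeStart and RangeEnd must be defined
    // as datetime parameters in Power BI Desktop (Manage Parameters)
    Source = Sql.Database("myserver.database.windows.net", "MyDatabase"),
    SalesView = Source{[Schema = "dbo", Item = "vw_Sales"]}[Data],

    // Load only the rows in the current partition's window; use >= on one
    // boundary and < on the other so rows are not duplicated across partitions
    Filtered = Table.SelectRows(
        SalesView,
        each [OrderDateTime] >= RangeStart and [OrderDateTime] < RangeEnd
    )
in
    Filtered

With a filter like this in place, the incremental refresh policy itself is configured on the table in Power BI Desktop.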

 

See more details: https://docs.microsoft.com/en-us/power-bi/whitepaper-powerbi-premium-deployment#why-are-refreshes-sl....

 

If you use DirectQuery mode to connect data, your report performance depends largely on the performance of the underlying data source.

 

You can optimize your data model using the following tips:

 

  • Remove unused tables or columns, where possible. 
  • Avoid distinct counts on fields with high cardinality – that is, millions of distinct values.  
  • Take steps to avoid fields with unnecessary precision and high cardinality. For example, you could split highly unique datetime values into separate columns (month, year, date, and so on). Or, where possible, use rounding on high-precision fields to lower cardinality (for example, 13.29889 -> 13.3); a short Power Query sketch of this follows the list.
  • Use integers instead of strings, where possible.
  • Be wary of DAX functions that need to test every row in a table (for example, RANKX); in the worst case, these functions can exponentially increase run time and memory requirements given linear increases in table size.
  • When connecting to data sources via DirectQuery, consider indexing columns that are commonly filtered or sliced on. Indexing greatly improves report responsiveness.
  • Enable Row-Level Security (RLS) where applicable.
  • Use Microsoft AppSource certified custom visuals where applicable.
  • Do not use hierarchical filters.
  • Provide data categorization for Power BI reports (HBI, MBI, LBI).
  • Use the On-premises data gateway instead of Personal Gateway.
  • Use slicers sparingly.
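As a concrete example of the cardinality tips above, here is a small Power Query sketch that rounds a high-precision value and stores a numeric code as a whole number; the inline table and column names are illustrative only:

let
    // Illustrative inline data; in practice Source would be the SQL view
    Source = #table(
        type table [Amount = number, ProductCode = text],
        {{13.29889, "1001"}, {7.51234, "1002"}}
    ),
    // Round to one decimal place to lower the number of distinct values
    Rounded = Table.TransformColumns(Source, {{"Amount", each Number.Round(_, 1), type number}}),
    // Store the code as a whole number instead of text
    Typed = Table.TransformColumnTypes(Rounded, {{"ProductCode", Int64.Type}})
in
    Typed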

 

For more information on optimizing data sources for DirectQuery, see DirectQuery in SQL Server 2016 Analysis Services.

 

To minimize the impact of network latency, strive to keep data sources, gateways, and your Power BI cluster as close as possible. If network latency is an issue, try locating gateways and data sources closer to your Power BI cluster by placing them on virtual machines.

To further improve network latency, consider using Azure ExpressRoute, which can create faster, more reliable network connections between your clients and Azure datacenters.

 

You can learn more via the link: https://docs.microsoft.com/en-us/power-bi/power-bi-reports-performance#optimize-your-model.

 

Best Regards,

Amy

 

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

 

 
