Hi,
I have set up incremental refresh on a report containing around 65 million rows of data. The increment is one day, which should correspond to around 50,000 rows of data.
In Power BI Desktop, I'm using the Azure Synapse Analytics workspace connector to connect to a Delta table. I'm not using specified partitions or Z-ordering of the data, but I am still auto-optimizing the file size, as I was unsure how these would affect the incremental refresh.
My problem is that, even though the incremental refresh is still faster than refreshing the entire dataset, it still takes 40-45 minutes, which seems much too slow to me. Any advice on how I can improve this would be much appreciated. Thank you.
Hi @johana_123 ,
Based on your description, here is my suggestion.
A long incremental refresh time may be related to query folding and the complexity of the model. If possible, consider optimizing the model to reduce the amount of data. You can also refer to the following document, which may be helpful:
Troubleshoot incremental refresh and real-time data in Power BI - Power BI | Microsoft Learn
Alternatively, you can go to "View" -> "Performance analyzer" to identify which visuals are impacting the performance of your reports, and why.
Please refer to the document below.
What is Performance Analyzer in PowerBI? | by Susheel Aakulu | Medium
Best Regards,
Community Support Team _ xiaosun
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi,
Thank you for your answer. Unfortunately, I have already trimmed the data as much as possible - but should that really make a difference? Isn't it only the new data, consisting of around 50,000 rows, that has to be loaded? Isn't that the point of an incremental refresh?
The report is not very complex, and I have already used DAX Studio to optimize its performance. Also, on the source side (in the SQL query sent to Synapse), I can confirm that query folding is taking place; in the WHERE clause I can see:
convert(datetime2, [_].[update_date]) as [t0_0]
from [dbo].[my_table] as [_] ) as [_]
where [_].[t0_0] >= convert(datetime2, '2022-11-03 00:00:00')
  and [_].[t0_0] <= convert(datetime2, '2022-11-04 00:00:00')
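For context, the folded predicate above is just a date-window filter on `update_date`. A minimal Python sketch of the same filter (the rows and `id` values are made up for illustration; note the SQL shown is inclusive on both ends, and this sketch mirrors that):

```python
from datetime import datetime

# Hypothetical rows standing in for [dbo].[my_table]; only the
# update_date column matters for the incremental-refresh filter.
rows = [
    {"id": 1, "update_date": datetime(2022, 11, 2, 23, 59)},
    {"id": 2, "update_date": datetime(2022, 11, 3, 8, 0)},
    {"id": 3, "update_date": datetime(2022, 11, 4, 0, 0)},
    {"id": 4, "update_date": datetime(2022, 11, 4, 0, 1)},
]

def incremental_window(rows, range_start, range_end):
    """Return only the rows inside the refresh window, mirroring the
    folded predicate t0_0 >= range_start and t0_0 <= range_end."""
    return [r for r in rows if range_start <= r["update_date"] <= range_end]

new_rows = incremental_window(
    rows,
    datetime(2022, 11, 3),
    datetime(2022, 11, 4),
)
print([r["id"] for r in new_rows])  # → [2, 3]
```

Since the predicate folds, the source only scans one day's window, which is consistent with the short query times seen on the Synapse side.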
Added information:
I just did a manual refresh in the Power BI service to confirm query folding on the Synapse side. The refresh took less than 2 minutes, compared to 25 minutes this morning. On the Synapse side, the queries ran for less than 60 seconds both times. Could the slowness be caused by the Power BI service itself? Could it be related to multiple Power BI reports being refreshed at the same time in the same tenant, but in other workspaces?
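One way to quantify the scheduled-versus-manual gap is to look at the dataset's refresh history in the service (for example via the Power BI REST API's refresh-history endpoint). The sketch below only does the duration arithmetic on already-fetched history entries; the two sample records are hypothetical, chosen to match the 25-minute and 2-minute refreshes described above:

```python
from datetime import datetime

def refresh_minutes(entry):
    """Duration in minutes of one refresh-history entry.
    Timestamps follow the ISO-8601 shape ('...Z' suffix) that the
    Power BI service reports; we parse only the first 19 characters."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    start = datetime.strptime(entry["startTime"][:19], fmt)
    end = datetime.strptime(entry["endTime"][:19], fmt)
    return (end - start).total_seconds() / 60

# Hypothetical history entries: one scheduled, one on-demand refresh.
history = [
    {"refreshType": "Scheduled", "startTime": "2022-11-04T03:00:00Z",
     "endTime": "2022-11-04T03:25:00Z"},
    {"refreshType": "OnDemand", "startTime": "2022-11-04T10:00:00Z",
     "endTime": "2022-11-04T10:02:00Z"},
]

for h in history:
    print(h["refreshType"], refresh_minutes(h))  # 25.0 then 2.0
```

If the source-side queries finish in under a minute both times, a large scheduled-vs-manual gap like this points at time spent in the service (queuing, capacity load, or concurrent refreshes) rather than at the Synapse query.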