I'm struggling to set up an incremental refresh on a Pro licence. I've published a dataset with approx. 4.7 million rows (and fewer than 10 columns), connected to an Azure SQL database. A full refresh of this dataset (in the Power BI Service) takes about 20 minutes, and the CPU load on the SQL server peaks at about 50-60% during refresh. Enabling incremental refresh pushes the refresh time above two hours, causing a timeout error. I know the first refresh will be slow when enabling incremental refresh, but a factor beyond 6 was not what I expected...
I have of course verified that query folding is happening, and I've had a close look at the troubleshooting article, without getting any further. Any suggestions on why this is failing?
Query settings:
let
Source = Sql.Database("***.database.windows.net", "integrationsdata", [CommandTimeout=#duration(0, 1, 30, 0)]),
must_vAggregatedStoppointDataDayHour = Source{[Schema="must",Item="vAggregatedStoppointDataDayHour"]}[Data],
// exactly one boundary should be inclusive, otherwise rows on a partition boundary are loaded twice
#"Filtered Rows" = Table.SelectRows(must_vAggregatedStoppointDataDayHour, each [OperatingDate] >= RangeStart and [OperatingDate] < RangeEnd)
in
#"Filtered Rows"
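For context, this is roughly what the RangeStart/RangeEnd mechanism does behind the scenes (an illustrative sketch, not the service's actual implementation): the service runs one folded query per partition, substituting consecutive boundary values. Using half-open intervals (>= start, < end) guarantees each row lands in exactly one partition:

```python
from datetime import date

def monthly_partitions(start: date, end: date):
    """Yield (RangeStart, RangeEnd) pairs covering [start, end) month by month.

    Each pair is a half-open interval: a row belongs to the partition where
    RangeStart <= OperatingDate < RangeEnd, so boundary rows are never
    duplicated across adjacent partitions.
    """
    current = start
    while current < end:
        # advance to the first day of the next month
        if current.month == 12:
            nxt = date(current.year + 1, 1, 1)
        else:
            nxt = date(current.year, current.month + 1, 1)
        yield current, min(nxt, end)
        current = nxt

parts = list(monthly_partitions(date(2021, 1, 1), date(2021, 4, 1)))
# three touching, non-overlapping monthly partitions
```

During the initial refresh, every one of these partitions is populated in sequence, which is part of why the first run costs so much more than a single full load.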
Incremental refresh settings:
Can you please explain the reasoning behind doing an incremental refresh against an Azure data source? You're practically shuffling data from Azure to Azure.
Yes, I know I'm shuffling data from Azure to Azure, but the Power BI Report is in import mode, and refreshing all data takes about 20 minutes. I suppose the refresh time for incremental refresh is significantly lower once the initial refresh is done?
We could, of course, use DirectQuery and avoid moving data from Azure to Azure. However, we would like this report to remain in import mode (at least for this data source); the database is used for several different purposes and we would like to avoid frequent requests from Power BI. Besides, the report is published to a web page, so we would like to keep the response time low.
I should also mention that half of my motivation was to learn how to set up incremental refresh in Power BI.
"refreshing all data takes about 20 minutes. I suppose the refresh time for incremental refresh is significantly lower once the initial refresh is done?"
For this duration the savings will not be substantial. Don't forget that incremental refresh carries administrative overhead from partition management and restructuring. You may be lucky to save 15 minutes, or maybe just 10.
This becomes much more useful once your full refresh would push against the two hour mark.
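A rough back-of-envelope calculation shows why the savings are modest here (illustrative numbers only; the archive and window sizes below are assumptions, and it assumes refresh time scales with rows transferred):

```python
full_refresh_min = 20   # observed full refresh time from the thread
archive_days = 2 * 365  # hypothetical: two years of history kept in the model
window_days = 10        # hypothetical: incremental window refreshed each run

# In steady state, only the incremental window's share of rows is moved.
fraction = window_days / archive_days
incremental_min = full_refresh_min * fraction  # well under a minute of data movement

saving_min = full_refresh_min - incremental_min
# Per-partition management overhead in the service eats into this saving,
# which is why a ~20-minute full refresh gains relatively little in practice.
```

With a full refresh already under half an hour, the fixed partition-management cost dominates; the arithmetic only pays off clearly when the full refresh approaches the timeout limit.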
Thank you, @lbendlin. Although I still haven't understood why the initial refresh takes so much time, your answer gives a good rule of thumb for when to use incremental refresh which I haven't seen anywhere else.
I found this documentation to be pretty good in describing the process:
https://docs.microsoft.com/en-us/power-query/dataflows/incremental-refresh
(ignore the dataflows part, the process is similar for datasets)