Hi,
I’ve configured incremental refresh on a Power BI dataset.
The source is SQL views, and the underlying tables contain both Created and Modified date columns.
In Power Query, I applied the incremental range filters using the Created date (RangeStart and RangeEnd).
In the incremental refresh settings, I configured it to store 5 years and refresh the last 2 days, and I enabled “Detect data changes” based on the Modified date column.
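For reference, the relevant Power Query filter step looks roughly like this (server, database, view, and column names are placeholders for my actual source):

```m
let
    Source = Sql.Database("myserver", "mydb"),
    View = Source{[Schema = "dbo", Item = "vwOrders"]}[Data],
    // RangeStart/RangeEnd are the datetime parameters Power BI binds per partition;
    // filtering on Created decides which partition a row lands in
    Filtered = Table.SelectRows(View, each [Created] >= RangeStart and [Created] < RangeEnd)
in
    Filtered
```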
The issue:
When a record from 2021 gets its Modified date updated, the change is not picked up by the incremental refresh, and the updated data does not appear in the dataset.
Can someone explain why this is happening and how I should correctly configure incremental refresh + detect data changes so that updates to older records (e.g., from 2021/2022 etc) are captured?
Regards,
Adeel Nazir
Solved! Go to Solution.
Hi @automation,
After you publish the PBIX file, the first load is a complete refresh, meaning all partitions are processed. Please verify whether the size of the semantic model exceeds your capacity limits.
To work around the size issue, you can use the approach below:
Define a Boolean parameter named LoadAllData.
In your Power Query, gate the source on that parameter:
let
    Source = ...,
    SampleData = Table.FirstN(Source, 10),
    Check = if LoadAllData then Source else SampleData
in
    Check
Keep the default value as False, publish the model to the service, and run the first refresh.
That first refresh creates all the partitions in the service. Then set LoadAllData to True and process the partitions one by one using SSMS or a Fabric notebook.
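Processing partitions individually from SSMS can be done with a TMSL refresh script against the model's XMLA endpoint; a sketch, where the database, table, and partition names are placeholders you would read from the model:

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MySemanticModel",
        "table": "Sales",
        "partition": "2021Q1"
      }
    ]
  }
}
```

Running one such script per partition spreads the load across several small refreshes instead of one long full refresh.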
Thanks
If this response was helpful in any way, I’d gladly accept a kudo.
Please mark it as the correct solution. It helps other community members find their way faster
Hi @automation,
We would like to confirm whether our community member's answer resolves your query, or if you need further help. If you still have any questions or need more support, please feel free to let us know. We are happy to help you.
Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support
Hi All,
I need some assistance with this issue. Can someone please help me and find a resolution?
Regards,
Adeel Nazir
Hi, @automation
You have configured incremental refresh correctly: the Created datetime drives RangeStart and RangeEnd, and the Modified datetime drives change detection.
Can you confirm that the Modified field is a datetime column?
Also check your Power Query steps to make sure those rows are not being filtered out unintentionally.
Additionally, I recommend configuring the refresh by months instead of years. That way, if a row is updated, the process only refreshes the data for that particular month rather than the entire year, which isolates the refresh and improves efficiency.
Reference video: https://www.youtube.com/watch?v=JsJWBr1_ktQ
Thanks
Hi @Natarajan_M ,
Thank you for your detailed response to my question; I really appreciate it.
I’ve confirmed that the Modified date is stored as a DATETIME field.
Questions:
I currently have the policy configured to refresh the last 2 days while keeping 5 years of historical data.
If I change the historical data range to 60 months, should I also adjust the "last 2 days refresh" value accordingly?
As shown in the screenshots attached to my original post, does the "last 2 days refresh only" setting explain why 2021 records with updated Modified dates are not being picked up by Detect Data Changes?
Hi @automation,
The "Detect data changes" feature only operates within the incremental refresh window. With your current policy (store 5 years, refresh the last 2 days), only the last 2 days of partitions are polled, so a 2021 partition is never re-examined.
If you instead store 60 months of historical data and set the incremental refresh range to 60 months as well, every partition falls within the "Detect data changes" window. The service then polls the Modified column for each partition and reprocesses only the partitions in which a change has actually occurred.
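For illustration, that 60-month policy roughly corresponds to a refreshPolicy object on the table, visible through the XMLA endpoint. A sketch, with the polling expression simplified and the view name a placeholder:

```json
"refreshPolicy": {
  "policyType": "basic",
  "rollingWindowGranularity": "month",
  "rollingWindowPeriods": 60,
  "incrementalGranularity": "month",
  "incrementalPeriods": 60,
  "pollingExpression": "let MaxModified = List.Max(vwOrders[Modified]) in MaxModified"
}
```

The pollingExpression runs once per partition at refresh time; a partition is reprocessed only when its result differs from the value stored at the previous refresh.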
Thanks,
Hi @Natarajan_M ,
I've implemented a 60-month incremental refresh with "Detect data changes" on my dataset. However, on a Power BI Premium Per User (PPU) license the dataset does not refresh completely, and I'm encountering timeout errors.
Could you please help me resolve this issue?
Thanks!
Hi @automation,
This behavior is expected when using a large incremental refresh window with Detect data changes on a Power BI Premium Per User (PPU) license. With a last 60 months refresh policy enabled, the service must evaluate every partition within that window during each refresh to check for changes. Even though Detect data changes helps avoid unnecessary reloads, Power BI still issues polling queries per partition, and if many partitions are involved, this can quickly exceed the execution and resource limits of PPU, resulting in refresh timeouts.
It’s also important to note that Detect data changes operates at the partition level, not at the individual row level. If a single row changes within a large partition, the entire partition needs to be reprocessed, which further increases refresh cost and duration. This becomes especially noticeable when partitions span long periods such as months or years.
To mitigate this, you can try reducing the incremental refresh window (for example, from 60 months to a smaller range that aligns with how often historical data actually changes), and ensure that the column used for RangeStart/RangeEnd and Detect data changes supports full query folding back to the source. Proper partitioning and folding are critical to keeping refreshes efficient.
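One quick folding check in Power Query Desktop: right-click the filter step and confirm "View Native Query" is enabled. A common folding-breaker is buffering the table before the range filter; a sketch of the two patterns (view name is a placeholder):

```m
// Folds: the filter is pushed down to SQL as a WHERE clause on the view
Filtered = Table.SelectRows(View, each [Created] >= RangeStart and [Created] < RangeEnd),

// Does not fold: Table.Buffer materializes every row locally before the
// filter runs, so each partition refresh reads the entire view
Buffered = Table.Buffer(View),
FilteredSlow = Table.SelectRows(Buffered, each [Created] >= RangeStart and [Created] < RangeEnd)
```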
Thanks,
Prashanth