Charlotte_chum
Frequent Visitor

How to do incremental refresh in data pipeline?

Hi all,

I would like to perform a 24-hour incremental refresh based on the last modified data. I found some details from online sources and Microsoft documentation, but I am unable to find specific information for data pipelines. Does anyone know how to do this?

1 ACCEPTED SOLUTION
v-karpurapud
Community Support

Hi @Charlotte_chum 


Thank you for reaching out to the Microsoft Fabric Community Forum.
 

To implement a 24-hour incremental refresh in a Data Pipeline based on the last modified data, consider the following steps:

Make sure you have a source table with a LastModifiedTime column (datetime) to track changes, and a watermark table to store the last successfully processed timestamp.

You will also need a destination (e.g., a Lakehouse or SQL database) and access to Microsoft Fabric Data Factory to build and run the pipeline.

Create a pipeline parameter, e.g., LastRunTime, that dynamically retrieves the last watermark value from watermark_table. Use the watermark in your SQL query to filter records modified in the last 24 hours.

Add a Copy data activity to the pipeline. After a successful data load, update watermark_table with the current UTC timestamp to record the latest successful run.

Use a schedule trigger in Data Factory to run the pipeline every 24 hours.
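The watermark pattern in the steps above can be sketched in Python. This is a minimal illustration, not pipeline code: the in-memory source_rows and watermark_table are hypothetical stand-ins for the real Lakehouse/SQL tables, and the names LastRunTime and LastModifiedTime follow the steps.

```python
from datetime import datetime, timezone

# Hypothetical stand-ins for the source table and watermark_table;
# in a real pipeline these would be queried via Lookup / Copy activities.
source_rows = [
    {"id": 1, "LastModifiedTime": datetime(2025, 7, 1, 8, 0, tzinfo=timezone.utc)},
    {"id": 2, "LastModifiedTime": datetime(2025, 7, 2, 9, 30, tzinfo=timezone.utc)},
    {"id": 3, "LastModifiedTime": datetime(2025, 7, 3, 10, 15, tzinfo=timezone.utc)},
]
watermark_table = {"LastRunTime": datetime(2025, 7, 2, 0, 0, tzinfo=timezone.utc)}

def incremental_copy(rows, watermark):
    """Select only rows modified after the stored watermark (the filter the
    Copy activity's source query applies), then advance the watermark."""
    last_run = watermark["LastRunTime"]
    changed = [r for r in rows if r["LastModifiedTime"] > last_run]
    # Update the watermark to the current UTC time after a successful load,
    # so the next run only picks up newer changes.
    watermark["LastRunTime"] = datetime.now(timezone.utc)
    return changed

changed = incremental_copy(source_rows, watermark_table)
print([r["id"] for r in changed])  # prints [2, 3]
```

On a scheduled 24-hour trigger, each run therefore copies only the rows whose LastModifiedTime falls after the previous successful run.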

 

For more detailed information, please refer to the official Microsoft documentation:

Incrementally copy data from a source data store to a destination data store - Azure Data Factory | ...

 

If this response resolves your query, kindly mark it as Accepted Solution to help other community members. A Kudos is also appreciated if you found the response helpful.

 

Thank You!

 

View solution in original post

5 REPLIES
v-karpurapud
Community Support

Hi @Charlotte_chum 

I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.

Thank you.

v-karpurapud
Community Support

Hi @Charlotte_chum 

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.

Thank you.

 

v-karpurapud
Community Support

Hi @Charlotte_chum 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.

Thank you.


NandanHegde
Super User

Can you be more specific about what you mean by incremental refresh in data pipelines?
Do you mean incremental data sync from a source to sink via copy activity?




----------------------------------------------------------------------------------------------
Nandan Hegde (MSFT Data MVP)
LinkedIn Profile : www.linkedin.com/in/nandan-hegde-4a195a66
GitHub Profile : https://github.com/NandanHegde15
Twitter Profile : @nandan_hegde15
MSFT MVP Profile : https://mvp.microsoft.com/en-US/MVP/profile/8977819f-95fb-ed11-8f6d-000d3a560942
Topmate : https://topmate.io/nandan_hegde
Blog : https://datasharkx.wordpress.com
