Hello, we would like to use incremental load into a Warehouse in a Data Factory pipeline. Is there a feature that allows us to "upsert" new rows into existing tables, instead of append/overwrite only?
Thank you
Can you please provide details about what your source and sink are? And do you plan to do the incremental update via a Copy activity?
The question is about incremental load into a Warehouse. Specifically, we need to load only the new or updated records (for example, D-1 data) into a target table that already contains historical data, and we want the pipeline to handle this incremental logic reliably.
I am guessing you will need to have the Copy activity write to a staging table, and let a stored procedure perform the incremental load from the staging table into the final destination table.
Tbh I don't have experience with it myself.
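That said, a staging-table upsert procedure might look roughly like the sketch below. All table and column names (stg_sales, fact_sales, sale_id, amount, load_date) are hypothetical placeholders; adjust them to your schema. It uses plain UPDATE + INSERT rather than MERGE, so it only relies on standard T-SQL constructs:

```sql
-- Hypothetical sketch: upsert staged D-1 rows into the target table.
-- stg_sales / fact_sales / sale_id / amount / load_date are placeholder names.
CREATE PROCEDURE dbo.usp_upsert_sales
AS
BEGIN
    -- 1) Update rows that already exist in the target
    UPDATE t
    SET    t.amount    = s.amount,
           t.load_date = s.load_date
    FROM   dbo.fact_sales AS t
    INNER JOIN dbo.stg_sales AS s
            ON t.sale_id = s.sale_id;

    -- 2) Insert rows that are new
    INSERT INTO dbo.fact_sales (sale_id, amount, load_date)
    SELECT s.sale_id, s.amount, s.load_date
    FROM   dbo.stg_sales AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM  dbo.fact_sales AS t
                       WHERE t.sale_id = s.sale_id);

    -- 3) Clear the staging table for the next run
    TRUNCATE TABLE dbo.stg_sales;
END;
```

In the pipeline you would run this with a Stored procedure activity placed after the Copy activity that loads stg_sales.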
Consider voting for these Ideas to highlight the need:
Data pipeline: UPSERT and DELETE
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=c63b2294-7e6c-ef11-a4e6-00224850867f
Support UPSERTs and DELETEs when copying data into Lakehouse Tables from Pipeline copy activity, as opposed to Appending new rows
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=87f3d168-6022-ee11-a81c-6045bdc01ce4