Hello, we would like to use incremental load into a Warehouse in a Data Factory pipeline. Is it possible to use a feature that allows you to "upsert" new rows into existing tables, instead of append/overwrite only?
Thank you
Can you please provide details about what your source and sink are? And do you plan to use incremental update via the Copy activity?
The question about incremental load is for a Warehouse. Specifically, we need to load only new or updated records (for example, D-1 data) into a target table that already contains historical data. We would like to ensure that the pipeline handles this incremental logic effectively.
I am guessing you will need to have the Copy activity write to a staging table and let a stored procedure perform the incremental load (upsert) from the staging table into the final destination table. To be honest, I don't have hands-on experience with this myself, but see the sketch below.
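As a rough illustration, here is a minimal T-SQL sketch of what such a stored procedure could look like, assuming a hypothetical staging table stg.Sales, a target table dbo.Sales, and SaleId as the business key (all object and column names are placeholders). It uses a plain UPDATE + INSERT pattern for the upsert; the exact T-SQL surface supported by Fabric Warehouse should be checked against the current documentation.

CREATE PROCEDURE dbo.usp_Upsert_Sales
AS
BEGIN
    -- Update rows that already exist in the target (matched on the business key)
    UPDATE t
    SET    t.Amount     = s.Amount,
           t.ModifiedAt = s.ModifiedAt
    FROM   dbo.Sales AS t
    INNER JOIN stg.Sales AS s
            ON t.SaleId = s.SaleId;

    -- Insert rows from staging that do not yet exist in the target
    INSERT INTO dbo.Sales (SaleId, Amount, ModifiedAt)
    SELECT s.SaleId, s.Amount, s.ModifiedAt
    FROM   stg.Sales AS s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Sales AS t WHERE t.SaleId = s.SaleId);

    -- Clear the staging table for the next pipeline run
    TRUNCATE TABLE stg.Sales;
END;

In the pipeline you would then chain the Copy activity (loading only the D-1 slice into stg.Sales, for example by filtering on a modified-date column in the source query) with a Stored procedure activity that calls this procedure.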
Consider voting for these Ideas to highlight the need:
Data pipeline: UPSERT and DELETE
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=c63b2294-7e6c-ef11-a4e6-00224850867f
Support UPSERTs and DELETEs when copying data into Lakehouse Tables from Pipeline copy activity, as opposed to Appending new rows
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=87f3d168-6022-ee11-a81c-6045bdc01ce4