Hello, we would like to use incremental load into a Warehouse in a Data Factory pipeline. Is there a feature that allows you to "upsert" new rows into existing tables, instead of append/overwrite only?
Thank you
Can you please provide details about what your source and sink are? And do you plan to do the incremental load via the Copy activity?
The question about incremental load concerns a Warehouse. Specifically, we need to load only new or updated records (for example, D-1 data) into a target table that already contains historical data. We would like to ensure that the pipeline handles this incremental logic effectively.
I am guessing you will need to have the Copy activity write to a staging table, and then let a stored procedure perform the incremental load from the staging table into the final destination table.
To be honest, I don't have experience with it myself.
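As a rough sketch of what that stored procedure could look like: since MERGE support in Fabric Warehouse has been limited, a common alternative is an UPDATE-then-INSERT pattern keyed on the business key. All table and column names below (stg.Sales, dbo.Sales, SaleId, etc.) are hypothetical; adjust them to your schema.

```sql
-- Hypothetical schema: staging table stg.Sales, target dbo.Sales, key SaleId.
-- Step 1: update target rows that already exist, using the staged values.
UPDATE t
SET    t.Amount    = s.Amount,
       t.UpdatedAt = s.UpdatedAt
FROM   dbo.Sales AS t
INNER JOIN stg.Sales AS s
        ON t.SaleId = s.SaleId;

-- Step 2: insert staged rows whose key is not yet in the target.
INSERT INTO dbo.Sales (SaleId, Amount, UpdatedAt)
SELECT s.SaleId, s.Amount, s.UpdatedAt
FROM   stg.Sales AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Sales AS t WHERE t.SaleId = s.SaleId);
```

In the pipeline, the Copy activity would truncate and reload stg.Sales with the D-1 slice, and a Stored procedure activity would then run the logic above.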
Consider voting for these Ideas to highlight the need:
Data pipeline: UPSERT and DELETE
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=c63b2294-7e6c-ef11-a4e6-00224850867f
Support UPSERTs and DELETEs when copying data into Lakehouse Tables from Pipeline copy activity, as opposed to Appending new rows
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=87f3d168-6022-ee11-a81c-6045bdc01ce4