Hello, we would like to use incremental load into a Warehouse in a Data Factory pipeline. Is there a feature that allows you to "upsert" new or changed rows into an existing table, instead of append/overwrite only?
Thank you
Can you please provide details about what your source and sink are? And do you plan to do the incremental update via a Copy activity?
The question is about incremental load into a Warehouse. Specifically, we need to load only new or updated records (for example, D-1 data) into a target table that already contains historical data. We want to make sure the pipeline handles this incremental logic effectively.
I am guessing you will need the Copy activity to write to a staging table, and then let a stored procedure perform the incremental load from the staging table into the final destination table.
To be honest, I don't have hands-on experience with this myself.
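As a rough illustration of that staging-table pattern, the stored procedure could do an UPDATE of matching rows followed by an INSERT of new ones. This is only a sketch under assumed names (dbo.TargetTable, dbo.StagingTable, key column Id, and the example columns are all hypothetical); adapt it to your schema, and check whether MERGE is available in your Warehouse as an alternative:

```sql
-- Hypothetical objects: dbo.TargetTable, dbo.StagingTable, key column Id.
CREATE PROCEDURE dbo.UpsertFromStaging
AS
BEGIN
    -- Update rows that already exist in the target.
    UPDATE t
    SET    t.SomeValue  = s.SomeValue,
           t.ModifiedAt = s.ModifiedAt
    FROM   dbo.TargetTable  AS t
    INNER JOIN dbo.StagingTable AS s
            ON s.Id = t.Id;

    -- Insert rows that are new to the target.
    INSERT INTO dbo.TargetTable (Id, SomeValue, ModifiedAt)
    SELECT s.Id, s.SomeValue, s.ModifiedAt
    FROM   dbo.StagingTable AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM dbo.TargetTable AS t
                       WHERE t.Id = s.Id);
END;
```

The pipeline would then be: Copy activity (append into the truncated staging table), followed by a Stored procedure activity that calls this procedure.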
Consider voting for these Ideas to highlight the need:
Data pipeline: UPSERT and DELETE
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=c63b2294-7e6c-ef11-a4e6-00224850867f
Support UPSERTs and DELETEs when copying data into Lakehouse Tables from Pipeline copy activity, as opposed to Appending new rows
https://ideas.fabric.microsoft.com/ideas/idea/?ideaid=87f3d168-6022-ee11-a81c-6045bdc01ce4