Hi,
I've implemented a Dataflow Gen2 that performs some transformations on the source data and then copies it to the destination.
I've also created a data pipeline that runs this dataflow, but I've noticed that when the pipeline runs, the destination is overwritten.
Does a dataflow allow writing data to the destination in append mode rather than overwrite mode? I can't find such an option. Thanks
If you turn off the automatic settings on the data destination,
you can use the append option rather than replace in Dataflow Gen2.
@NandanHegde In the case of Append, if we are using a source like a SharePoint folder and want to bring in only the incremental changes, the data gets duplicated in the Lakehouse. Is there a way to get around this?
If you want to append only new records, you can achieve that by following the pattern in these docs https://learn.microsoft.com/en-us/fabric/data-factory/tutorial-setup-incremental-refresh-with-datafl...
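For anyone who would rather do the incremental filtering in a notebook instead of inside the dataflow, here is a minimal PySpark sketch of the same watermark idea. The table name "sales", the watermark column "ModifiedDate", and the df_source DataFrame are hypothetical placeholders for this example, not names from the linked docs.

```python
from pyspark.sql import functions as F

# Hypothetical names: "sales" destination table, "ModifiedDate" watermark
# column, df_source already loaded from the source (e.g. a SharePoint folder).

# Look up the highest watermark already present in the destination.
watermark = (
    spark.table("sales")
    .agg(F.max("ModifiedDate").alias("wm"))
    .collect()[0]["wm"]
)

# Keep only rows newer than the watermark (or everything on the first load),
# then append them so existing rows are not duplicated.
df_new = (
    df_source
    if watermark is None
    else df_source.filter(F.col("ModifiedDate") > F.lit(watermark))
)
df_new.write.mode("append").saveAsTable("sales")
```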
If you want to do an upsert, then I think you need to use a notebook. Code examples for upsert are shown in this tutorial: https://microsoftlearning.github.io/mslearn-fabric/Instructions/Labs/03b-medallion-lakehouse.html
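As a rough illustration of what that tutorial covers, here is a minimal upsert sketch using the Delta Lake merge API from a Fabric notebook. The table name "dim_customer", the key column "customer_id", and the df_updates DataFrame are assumptions for the example, not the tutorial's exact code.

```python
from delta.tables import DeltaTable

# Hypothetical names: "dim_customer" Lakehouse table, "customer_id" key,
# df_updates holding the new and changed rows.
target = DeltaTable.forName(spark, "dim_customer")

(
    target.alias("t")
    .merge(df_updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()    # update rows whose key already exists
    .whenNotMatchedInsertAll() # insert rows with new keys
    .execute()
)
```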