I have a problem: I need to extract data from an on-premise Oracle database and load it into Lakehouse. Currently, I use the "overwrite" option of the "Activity Copy" feature in the Microsoft Fabric Pipeline, but I would like to replace it with "upsert".
I'm trying to understand how to use Upsert in the Microsoft Fabric Pipeline, but I read that upsert is not supported when using Lakehouse as the destination. After reading more about the "Upsert" option, it appears there is an option to use the Data Factory's Copy Job features, but I'd like to understand if there's a simpler option and if this feature is indeed unavailable in the pipeline.
You're correct: Upsert is not currently supported in the Copy activity when the destination is a Lakehouse (Delta table) in Microsoft Fabric.
For Lakehouse targets, the Copy activity only supports Append and Overwrite; there isn't a built-in Upsert option for Delta tables in Fabric pipelines at the moment.
If you need Upsert behavior (insert new rows and update existing ones), the recommended approach is to copy the data into a staging table first and then run a Delta MERGE from a Notebook.
This is currently the standard and supported pattern in Fabric for incremental loads into a Lakehouse.
So unfortunately, there isn't a simpler "toggle" option in the pipeline itself right now; using MERGE via a Notebook or SQL is the correct way to handle it.
Hi @anabealmeidas ,
Upsert is NOT supported in Copy activity when the sink is a Lakehouse table in Microsoft Fabric.
In Microsoft Fabric Pipelines, the Copy activity supports Upsert only for certain relational sinks (like SQL databases).
When your destination is a Lakehouse (Delta table):
The Copy activity supports:
Append
Overwrite
It does NOT support:
Upsert
Merge logic
Key-based updates
Instead of using Copy Upsert, use this architecture:
Step 1 — Copy to Staging Table (Append mode)
Use Copy activity to load Oracle data into:
A staging Delta table in the Lakehouse, or a staging folder
Step 2 — Perform MERGE using Spark (Notebook)
Use a Fabric Notebook to run Delta MERGE INTO.
This is the best way to accomplish this.
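The MERGE step in the Notebook could look something like this (a sketch only; `orders`, `stg_orders`, and the `order_id` key are hypothetical names — substitute your own target table, staging table, and business key):

```sql
-- Run in a Fabric Notebook Spark SQL cell (%%sql).
-- Upserts rows from the staging Delta table into the target Delta table,
-- matching on the business key: matched rows are updated, new rows inserted.
MERGE INTO orders AS tgt
USING stg_orders AS src
    ON tgt.order_id = src.order_id
WHEN MATCHED THEN
    UPDATE SET *
WHEN NOT MATCHED THEN
    INSERT *;
```

After the MERGE succeeds, you can overwrite or truncate the staging table so the next pipeline run starts clean.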
If this post helps, please consider giving it a Kudos or accepting it as a Solution so other members can find it more quickly.
If I've misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
As of last month, MERGE has reached general availability, so yes, upsert is now possible. You have to use a Script activity for this.
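If you go the Script activity route described above, the statement would be T-SQL MERGE rather than Delta's (a sketch with hypothetical table and column names; note that T-SQL MERGE requires explicit column lists instead of `UPDATE SET *`):

```sql
-- T-SQL MERGE, run from a pipeline Script activity.
-- orders / stg_orders and their columns are placeholder names.
MERGE INTO orders AS tgt
USING stg_orders AS src
    ON tgt.order_id = src.order_id
WHEN MATCHED THEN
    UPDATE SET tgt.amount = src.amount,
               tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN
    INSERT (order_id, amount, updated_at)
    VALUES (src.order_id, src.amount, src.updated_at);
```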
Thanks for helping me!