parasol54
Regular Visitor

Issue with Upsert Table Action in Metadata-Driven Pipeline to Lakehouse Table

Hi everyone,

I'm implementing a metadata-driven data pipeline using a custom control table, with the goal of dynamically parameterizing as many properties as possible.

While trying to parameterize the 'table action' field under the 'destination' section, I encountered the following error:

"Upsert is not a supported table action for Lakehouse Table."

Here’s some context:

  • The source data comes from a SAP HANA database.
  • The destination is a Delta table located in a Lakehouse within a Microsoft Fabric workspace.
  • I’ve enabled Change Data Feed (CDF) on all destination tables.
  • As a test, I hardcoded the table_action parameter to 'Upsert' and specified the appropriate key columns.
  • When I set the parameter to 'Overwrite' instead, everything works as expected.

Despite this, the error persists, and the message isn't very descriptive. Has anyone faced this issue or found a workaround for enabling upsert-like behavior when working with Lakehouse tables in this scenario?

Any guidance would be greatly appreciated.

Thanks!

 

v-lgarikapat
Community Support

Hi @parasol54 ,

Thanks for reaching out to the Microsoft Fabric Community Forum.

@lbendlin ,

@jennratten ,

@BhaveshPatel 

Thanks for your prompt responses.

 

Since the Copy activity in Microsoft Fabric doesn't currently support the 'Upsert' table action for Lakehouse Delta tables, a practical workaround is to load your data into a staging table using 'Overwrite' mode and then use a notebook (SQL or Python) to perform a merge operation into your destination table based on the defined key columns. This approach simulates upsert behavior by updating matching records and inserting new ones, and it integrates well into a metadata-driven pipeline if you extend your control table to include merge rules.
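For illustration, here is a minimal sketch of that staging-then-merge step as it might look in a Fabric notebook. The table names (staging_customers, dim_customers) and key column (customer_id) are hypothetical placeholders; adapt them to your control-table metadata.

# Minimal sketch of the staging-then-merge workaround (hypothetical table/column names):
# the Copy activity overwrites the staging table, then this merge updates/inserts into the destination.
merge_sql = """
MERGE INTO dim_customers AS destination
USING staging_customers AS updates
ON destination.customer_id = updates.customer_id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *
"""
spark.sql(merge_sql)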

 

Lakehouse and Delta Tables - Microsoft Fabric | Microsoft Learn

Options to get data into the Lakehouse - Microsoft Fabric | Microsoft Learn

Simplifying Data Ingestion with Copy Job: Upsert to Azure SQL Database & Overwrite to Fabric Lakehou...

 

Best Regards,

Lakshmi Narayana

Hi @parasol54 ,

I wanted to follow up and confirm whether you’ve had a chance to review the information we shared. If you have any questions or need further clarification, please don’t hesitate to reach out.

If you're still encountering any challenges, feel free to let us know; we'd be glad to assist you further.

Looking forward to your response.

Best regards,
Lakshmi Narayana

 

 

Hi @parasol54 ,

 If your issue has been resolved, please consider marking the most helpful reply as the accepted solution. This helps other community members who may encounter the same issue find answers more efficiently.

If you're still facing challenges, feel free to let us know; we'll be glad to assist you further.

Looking forward to your response.

Best regards,
Lakshmi Narayana

Hi @parasol54 ,

As we haven't received a response from you, we will proceed to close this thread. If you're still experiencing the issue, please feel free to create a new thread; we'll be happy to assist you further.

Thank you for your patience and continued support.

If our response was helpful, we kindly ask you to mark it as the accepted solution. This helps other community members find answers more efficiently.


Best Regards,

Lakshmi Narayana

BhaveshPatel
Community Champion

[Image: BhaveshPatel_0-1751517181783.png]

Here you can see that the Fabric SQL database = Demo Semantic Model = Demo Warehouse = Lakehouse. All four are the same and give you the desired results.

So you are getting unified data.

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.
BhaveshPatel
Community Champion

Hi @parasol54 

 

In Power BI Dataflow Gen 2, 

Warehouse table = Lakehouse Delta table

 

[Image: BhaveshPatel_0-1751516641401.png]

 

 

 

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.
jennratten
Super User

Hello @parasol54 - 

Here is an example of how you can perform upserts on lakehouse tables within a metadata-driven pipeline.

  • Run a notebook that contains the code below.

 

Create variables which define the paths (parameters for each component are passed to the notebook from the data pipeline):

# build absolute paths to lakehouse storage locations
source__path = f"{source__protocol}://{source__storage_account}@{source__endpoint}/{source__container}/{source__directory}/"
destination__path = f"{destination__protocol}://{destination__storage_account}@{destination__endpoint}/{destination__container}/{destination__directory}/"
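For instance, with hypothetical parameter values passed in from the pipeline, the resolved path would typically be an ABFS URI under the workspace's OneLake endpoint (the exact values below are assumptions for illustration only):

# Hypothetical parameter values (illustration only); a Fabric Lakehouse path usually resolves to
# abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Tables/
source__protocol = "abfss"
source__storage_account = "MyWorkspace"                      # workspace name (placeholder)
source__endpoint = "onelake.dfs.fabric.microsoft.com"
source__container = "MyLakehouse.Lakehouse"                  # lakehouse item (placeholder)
source__directory = "Tables"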

Define a function to perform the upsert.

from delta.tables import DeltaTable

def process_table(source__lakehouse_path, source__table_name, destination__lakehouse_path, destination__table_name, unique_id):
    # load the source and destination Delta tables (forPath requires the SparkSession)
    source__table = DeltaTable.forPath(spark, source__lakehouse_path + source__table_name)
    destination__table = DeltaTable.forPath(spark, destination__lakehouse_path + destination__table_name)
    # convert the source table to a dataframe
    df = source__table.toDF()
    # define a list of column names
    column_names = df.columns
    # perform the merge (upsert) operation
    destination__table.alias('destination') \
        .merge(
            df.alias('updates'),
            'destination.{0} = updates.{0}'.format(unique_id)
        ) \
        .whenMatchedUpdate(set={col: f"updates.{col}" for col in column_names}) \
        .whenNotMatchedInsert(values={col: f"updates.{col}" for col in column_names}) \
        .execute()

 

Invoke the function for each row in the metadata/control/watermark table (json_array)

 
for i in json_array:
    source__table_name = i["source__table"]
    destination__table_name = i["destination__table"]
    unique_id = i["unique_id"]
    process_table(source__path, source__table_name, destination__path, destination__table_name, unique_id)
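For reference, a hypothetical json_array entry only needs the three keys the loop reads; the exact shape of your control table is an assumption here:

# Hypothetical control-table rows (illustration only)
json_array = [
    {"source__table": "customers_staging", "destination__table": "dim_customers", "unique_id": "customer_id"},
    {"source__table": "orders_staging",    "destination__table": "fact_orders",   "unique_id": "order_id"},
]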

 

parasol54
Regular Visitor

Hi Bhavesh,


Thanks for the explanation! Just to clarify — is the “Upsert” option also unsupported when the destination is a Warehouse table (not a Lakehouse Delta table)?

I understand that Lakehouse relies on Delta Lake versioning and doesn’t support upsert through the Copy activity — but will this limitation still apply if I use Warehouse as the destination?

Ultimately, I’m trying to implement metadata-driven upserts — would that work better in the Warehouse environment?

Regards

BhaveshPatel
Community Champion
Community Champion

There is only an overwrite mode for Delta tables, which means no separate upsert is needed. The reason is the in-memory engine (the Linux engine, a.k.a. the Windows engine).

 

You have to understand the whole concept of Apache Spark, Delta Lake, and how they work.

 

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.


Hi @parasol54 

 

Warehouse tables in Fabric Data Warehouse = Lakehouse Tables.

 

That means whatever you are doing in the Lakehouse (such as creating a Delta table) also applies to Data Warehouse tables (unified data).

 

Lakehouse Delta Tables 

[Image: BhaveshPatel_0-1751704875070.png]

 

Data Warehouse Tables

[Image: BhaveshPatel_1-1751705222058.png]

No limitation applies, whether you create Lakehouse or Warehouse tables. I would highly recommend using notebooks and Delta tables; that way you will understand the data lake and how it evolved into Delta Lake.
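As a minimal illustration of that notebook-plus-Delta-table suggestion (the DataFrame contents and the table name demo_customers are hypothetical), writing a managed Delta table from a Fabric notebook can be as simple as:

# Sketch only: write a small DataFrame as a managed Delta table in the attached Lakehouse
# (table name 'demo_customers' is a hypothetical placeholder)
df = spark.createDataFrame([(1, "Alice"), (2, "Bob")], ["id", "name"])
df.write.format("delta").mode("overwrite").saveAsTable("demo_customers")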

 

 

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.

There is no overwrite either. There's versioning, time travel, vacuuming, etc.
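For example (a hedged sketch; the table name dim_customers is a hypothetical placeholder), a Delta table's version history and time travel can be inspected from a notebook:

# Inspect the Delta table's version history, then read an earlier version (time travel)
spark.sql("DESCRIBE HISTORY dim_customers").show(truncate=False)
previous_df = spark.sql("SELECT * FROM dim_customers VERSION AS OF 0")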
