schneiw
Advocate II

Data Pipeline copy activity unexpected behavior

Hi

 

I have a use case where I want to update only 2 fields in a Lakehouse table. I am trying to use the Copy Activity in a Data Pipeline with the Upsert option on the destination. I am only retrieving the primary key and 2 static values, e.g.

select key_value,
       getdate() as deleted_date,
       1 as is_deleted
from source_table

 

I set the upsert option and provide the Key Columns column name. It runs 'successfully', and I have confirmed via the output log after it completes (and via the data preview before running) that rows are flowing through the activity.
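Conceptually, an upsert matches incoming rows to destination rows on the key column and updates the remaining mapped columns, inserting rows with no match. A minimal sketch of that semantics in Python (table contents are made up for illustration; the real Copy Activity runs this against a Delta table):

```python
from datetime import datetime

def upsert(destination, incoming, key):
    """Upsert incoming rows into destination, matching on `key`:
    matched rows are updated in place, unmatched rows are appended."""
    by_key = {row[key]: row for row in destination}
    for row in incoming:
        if row[key] in by_key:
            by_key[row[key]].update(row)   # update only the supplied columns
        else:
            destination.append(dict(row))  # insert rows with no key match
    return destination

dest = [{"key_value": 1, "name": "a", "is_deleted": 0, "deleted_date": None}]
src = [{"key_value": 1, "is_deleted": 1, "deleted_date": datetime(2025, 8, 1)}]
upsert(dest, src, "key_value")
print(dest[0])  # the matched row keeps "name" but gets the new flag values
```

This is what the behaviour I expected looks like: columns not in the source (here `name`) are untouched, while the mapped columns are overwritten for matching keys.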

 

If I check the destination table, the columns do not get updated. What am I doing wrong?

 

I have also tried with no mapping set and with the 3 columns explicitly mapped; it makes no difference.

1 ACCEPTED SOLUTION

Thank you for suggesting these options. My destination is a Lakehouse table.

 

I had a support call with MS yesterday and we found that it was because I was using dynamic values for the table schema and name. If I set a hardcoded table name, it works. This is of course an internal bug in the product, so I can only hope they fix it in a timely manner.
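For reference, a dynamic table reference in the Copy Activity destination is typically built with a pipeline expression along these lines (parameter names here are hypothetical):

```
@concat(pipeline().parameters.SchemaName, '.', pipeline().parameters.TableName)
```

Until the bug is fixed, substituting the literal schema and table name in the destination avoids the problem.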


2 REPLIES
v-echaithra
Community Support

Hi @schneiw ,

Thank you for reaching out to Microsoft Community.

Based on your use case and troubleshooting efforts, it seems that you're attempting to update only two fields (deleted_date and is_deleted) in a Lakehouse Delta table using the Copy Activity with the Upsert option in a Data Pipeline. Although the pipeline runs successfully and the data preview confirms that rows are flowing through, the changes are not reflected in the destination table. This could be due to several common and intermittent issues.

First, ensure that the destination table is a Delta Lake table, as Delta is required for Upsert operations to function correctly. If the table is not Delta-enabled, the update may silently fail without throwing an error. Also, verify that the table is not read-only or locked, and that the target columns (deleted_date, is_deleted) are nullable or have default values—otherwise, writes may be blocked or silently dropped due to schema constraints.
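One quick way to confirm a table directory is Delta-backed is to look for its `_delta_log` transaction-log folder, which every Delta table contains. A minimal sketch in Python (the mount path is hypothetical; adjust to wherever your Lakehouse Tables area is exposed):

```python
from pathlib import Path

def is_delta_table(table_dir: str) -> bool:
    """A Delta table's directory always contains a _delta_log folder."""
    return (Path(table_dir) / "_delta_log").is_dir()

# Hypothetical local mount of the Lakehouse Tables area:
print(is_delta_table("/lakehouse/default/Tables/source_table"))
```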

Next, confirm that you have correctly configured the Key Columns in the Upsert settings and that your data mapping aligns with the schema. Although you've tried both with and without column mapping, inconsistencies in column names or data types can still cause issues. Also, check the Copy Activity run output JSON, particularly for "rowsUpdated" or "rowsUpserted" counts. If those values are zero, it may indicate that the key values do not match any existing records, or that the data failed schema validation.
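Checking those counts can be scripted once you have the activity's output JSON. A minimal sketch in Python (the field names follow the reply above plus the Copy activity's usual `rowsRead`/`rowsCopied` counters; verify them against your actual run output, as they can vary):

```python
import json

def rows_written(activity_output_json: str) -> int:
    """Return the row count the Copy activity reports having written.
    Field names are illustrative; confirm against your run's output JSON."""
    output = json.loads(activity_output_json)
    for field in ("rowsUpserted", "rowsUpdated", "rowsCopied"):
        if field in output:
            return output[field]
    return 0

# Example output shape as it might appear in the pipeline run details:
sample = json.dumps({"rowsRead": 120, "rowsCopied": 120})
if rows_written(sample) == 0:
    print("Zero rows written: keys may not match, or schema validation failed.")
```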

If you're working with Dataverse or connected environments, consider turning on logging (e.g., Dataverse Plugin Trace Logs) or enabling auditing for the target table and fields. This can help you trace incoming data and identify whether the upsert operation is hitting the backend as expected.

Finally, ensure that the Copy activity is not silently failing due to data mismatches. By default, it will fail or skip incompatible rows. You can configure it to skip and log incompatible rows to help isolate issues without stopping the pipeline.
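In the Copy activity's JSON definition, that fault-tolerance behaviour is configured with settings along these lines (a sketch only; the logging connection and path are placeholders to replace with your own):

```json
"typeProperties": {
    "enableSkipIncompatibleRow": true,
    "redirectIncompatibleRowSettings": {
        "linkedServiceName": "LoggingStorage",
        "path": "copyactivity-logs"
    }
}
```

With this enabled, incompatible rows are skipped and written to the log location instead of failing the run, which makes it easier to see which rows (if any) are being rejected.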

Hope this helps.
Warm regards,
Chaithra E

