
GammaRamma
Helper I

Cannot overwrite tables with copy activity when change data feed is enabled

The problem is with lakehouse tables on which I have enabled this:

 

spark.sql(f"ALTER TABLE {rawLkhParam}.{tableSchemaParam}.{tableNameParam} SET TBLPROPERTIES ('delta.enableChangeDataFeed' = 'true')")
 
After this is enabled, if I try to overwrite the table with a copy activity in a data pipeline, I get the following error:
 
Failure happened on 'destination' side. 'Type=System.InvalidOperationException,Message=Cannot downgrade the minimum writer version,Source=Microsoft.DataTransfer.ClientLibrary,'
 
I can't find any way to downgrade the minimum writer version. I am using schema-enabled lakehouses. I don't want to delete the table before running the pipeline, because then I lose the Delta history, whereas with a copy activity overwrite I can keep it.
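For context, enabling change data feed raises the table's minimum writer version (to 4 on tables using the legacy protocol), which is presumably what the copy activity is refusing to write against. The current protocol can be checked with DESCRIBE DETAIL, e.g.:

# Inspect the table's Delta protocol versions (same parameters as above)
detail = spark.sql(f"DESCRIBE DETAIL {rawLkhParam}.{tableSchemaParam}.{tableNameParam}")
detail.select("minReaderVersion", "minWriterVersion").show()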
 
Any help is appreciated. Thanks!
 
 
 
2 REPLIES
Anonymous
Not applicable

Hi @GammaRamma ,

 

One way to do this is to use the .mode("overwrite") and .option("overwriteSchema", "true") options when writing data to the table. This method lets you replace the data and update the table schema without deleting the table, thus preserving the Delta history.

For example:

# Overwrite the existing Delta table in place; overwriteSchema allows
# the schema to change without dropping the table (placeholders as given)
dataframe.write \
    .format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .partitionBy(<your-partition-columns>) \
    .saveAsTable("<your-table>")

 

Best Regards,
Adamk Kong

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Hello @Anonymous, thank you for your response.

I have not tested this solution, but even if it does work, it doesn't really solve my problem the way I want. Using your method, I would have to create another table temporarily from my on-premises source and then use a notebook to load the temp table as a dataframe and overwrite my original table.
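Roughly, that workaround would look like this (the staging table name is hypothetical):

# Hypothetical staging pattern: the copy activity overwrites a plain
# staging table (no CDF), then a notebook overwrites the CDF-enabled
# target from it, keeping the target's Delta history intact.
staged = spark.read.table(f"{rawLkhParam}.{tableSchemaParam}.mytable_staging")
staged.write \
    .format("delta") \
    .mode("overwrite") \
    .saveAsTable(f"{rawLkhParam}.{tableSchemaParam}.{tableNameParam}")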

In a real bind this could be useful, but it's not a very efficient design solution. I think this is a problem with the copy activity that should be addressed by Microsoft.

Thanks anyway!
