The problem is with lakehouse tables on which I have enabled this:
Hi @GammaRamma ,
One way to do this is to use .mode("overwrite") together with .option("overwriteSchema", "true") when writing to the table. This overwrites the table's data and schema in place without dropping the table, so its incremental history is preserved.
For example:
# Overwrite both the data and the schema in place; the Delta table itself
# (and its version history) is kept rather than dropped and recreated.
dataframe.write \
    .format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .partitionBy(<your-partition-columns>) \
    .saveAsTable("<your-table>")
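Because the table is overwritten in place rather than dropped, earlier versions should remain reachable through Delta time travel. As a quick check after the overwrite (the table name is a placeholder, as above):

# List the table's Delta version history.
spark.sql("DESCRIBE HISTORY <your-table>").show(truncate=False)

# Read an earlier version back via time travel, e.g. version 0.
old_df = spark.read.format("delta").option("versionAsOf", 0).table("<your-table>")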
Best Regards,
Adamk Kong
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hello @Anonymous, thank you for your response.
I have not tested this solution, but even if it does work, it doesn't really solve my problem the way I want. With your method, I would have to use the copy activity to create a temporary table from my on-premises source, and then use a notebook to load that temp table as a dataframe and overwrite my original table.
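For anyone else landing here, a minimal sketch of that temp-table workaround in a notebook, assuming the copy activity first lands the on-premises data in a staging lakehouse table (the table names staging_mytable and mytable are hypothetical):

# Load the staging table that the copy activity populated.
staged_df = spark.read.table("staging_mytable")

# Overwrite the original table in place, keeping its Delta history.
staged_df.write \
    .format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .saveAsTable("mytable")

# Optionally clean up the staging table once the overwrite succeeds.
spark.sql("DROP TABLE IF EXISTS staging_mytable")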
In a real bind this could be useful, but it's not very efficient as a design solution. I think this is a problem with the copy activity that should be addressed by Microsoft.
Thanks anyway!