The problem is with lakehouse tables on which I have enabled this:
Hi @GammaRamma ,
One way to do this is to use .mode("overwrite") together with the .option("overwriteSchema", "true") setting when writing data to the table. This lets you update the table schema in place without deleting and recreating the table, so the table's incremental history is preserved.
For example:
# Overwrite the existing Delta table, replacing its schema as well as its data
dataframe.write \
    .format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .partitionBy(<your-partition-columns>) \
    .saveAsTable("<your-table>")
Best Regards,
Adamk Kong
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Hello @Anonymous, thank you for your response.
I have not tested this solution, but even if it works, it doesn't really solve my problem the way I want. With your method, I would have to land my on-premises source in a temporary table first, and then use a notebook to load that temp table as a DataFrame and overwrite my original table.
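Roughly, that notebook step would look like this (just a sketch; staging_table and original_table are placeholder names for the temp table and my real table):

# Read the temp table that the copy activity loaded from the on-prem source
df = spark.read.table("staging_table")

# Overwrite the real table, replacing its schema along with its data
df.write \
    .format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .saveAsTable("original_table")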
In a real bind this could be useful, but it's not an efficient design. I think this is a problem with the copy activity that should be addressed by Microsoft.
Thanks anyway!