Hello all,
Second time trying to post this, as the first try was marked as spam (???).
I'm trying to write a dataframe back to a Delta table and I get an error. This is the code I use to create the dataframe:
dfCustomer = spark.read.table("LakehouseOperations.factCustomerBase")
dfScope = spark.read.table("LakehouseOperations.tecDataScopeSnapshots")
dfCustomerJoined = dfCustomer.join(dfScope, dfCustomer.snapshot_date == dfScope.scopeDate, "inner").drop("scopeDate", "scopeDateBuckets")
dfCustomerJoined.write.mode("overwrite").format("delta").option("overwriteSchema", "true").save("Tables/factCustomerBase")
Then I get the error shown in the screenshots.
How can I solve this? Thanks!
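The error screenshots are not reproduced in the post, so the exact message is unknown. If it is the common complaint about overwriting a Delta table that the same job is also reading from, one workaround is to materialize the join into a staging table first and then overwrite the original from that staging copy. A minimal sketch, assuming that is the cause; the staging name factCustomerBase_staging is made up for illustration:

# Read the source tables, as in the original code.
dfCustomer = spark.read.table("LakehouseOperations.factCustomerBase")
dfScope = spark.read.table("LakehouseOperations.tecDataScopeSnapshots")
dfCustomerJoined = dfCustomer.join(dfScope, dfCustomer.snapshot_date == dfScope.scopeDate, "inner").drop("scopeDate", "scopeDateBuckets")

# Step 1: write the join result to a staging table (hypothetical name), so the
# read and the overwrite never target the same Delta location in one step.
dfCustomerJoined.write.mode("overwrite").format("delta").option("overwriteSchema", "true").save("Tables/factCustomerBase_staging")

# Step 2: read the staging table back and overwrite the original table from it.
dfStaged = spark.read.table("LakehouseOperations.factCustomerBase_staging")
dfStaged.write.mode("overwrite").format("delta").option("overwriteSchema", "true").save("Tables/factCustomerBase")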
The solution was to change the Spark session configuration via the workspace settings.
What it actually does or means, I don't know.
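The screenshot of the changed setting is not included, so the exact property is unknown; "spark.some.property" below is only a placeholder. From a notebook you can at least inspect what the session is actually running with, and override individual properties at session level where that is allowed:

# List every Spark property the current session was started with.
for key, value in sorted(spark.sparkContext.getConf().getAll()):
    print(key, "=", value)

# Read or override a single property for this session only
# ("spark.some.property" is a placeholder, not the real setting from the screenshot).
# spark.conf.get("spark.some.property")
# spark.conf.set("spark.some.property", "value")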
Hi @JayJay11,
Thanks for using Fabric Community.
As part of testing, can you try creating a new table?
dfCustomerJoined.write.mode("overwrite").format("delta").option("overwriteSchema", "true").save("Tables/factCustomerBaseJoined")
Let me know what output you get.
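If the write to the new table name succeeds, a quick sanity check before touching the original table could be to read it back and compare row counts (illustrative only, reusing the dataframes from the question):

# Read the freshly written table back and compare row counts with the join result.
dfCheck = spark.read.table("LakehouseOperations.factCustomerBaseJoined")
print("rows written:", dfCheck.count(), "rows expected:", dfCustomerJoined.count())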
I got the same error even after setting the mentioned Spark properties.
Glad to know that you were able to resolve your issue. Please continue using the Fabric Community for your further queries.