Hello all,
Second time trying to post this, as the first try was marked as spam (???).
I'm trying to write a DataFrame back to a Delta table and I get an error. This is the code I use to create the DataFrame:
# Read the source tables
dfCustomer = spark.read.table("LakehouseOperations.factCustomerBase")
dfScope = spark.read.table("LakehouseOperations.tecDataScopeSnapshots")

# Inner-join on the snapshot date and drop the scope columns
dfCustomerJoined = dfCustomer.join(dfScope, dfCustomer.snapshot_date == dfScope.scopeDate, "inner").drop("scopeDate", "scopeDateBuckets")

# Overwrite the same table the DataFrame was read from
dfCustomerJoined.write.mode("overwrite").format("delta").option("overwriteSchema", "true").save("Tables/factCustomerBase")
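One likely cause of this kind of failure is that the overwrite targets `Tables/factCustomerBase`, the same table the DataFrame is still lazily reading from, and Spark/Delta can refuse to overwrite a source of the running query. A common workaround, sketched below under the assumption of a live `spark` session in a Fabric notebook (the staging table name `factCustomerBase_staging` is made up for illustration), is to materialize the result into a staging table first and only then overwrite the target:

```python
# Sketch of a staging-table workaround (assumes a live Spark session in a
# Fabric notebook; "factCustomerBase_staging" is a hypothetical name).

# 1. Materialize the joined result into a separate staging table, so the
#    source table is no longer being read when it gets overwritten.
dfCustomerJoined.write.mode("overwrite").format("delta") \
    .option("overwriteSchema", "true") \
    .save("Tables/factCustomerBase_staging")

# 2. Re-read the fully written staging table and overwrite the target.
dfStaged = spark.read.format("delta").load("Tables/factCustomerBase_staging")
dfStaged.write.mode("overwrite").format("delta") \
    .option("overwriteSchema", "true") \
    .save("Tables/factCustomerBase")
```

This trades one extra write for breaking the read/overwrite cycle; the staging table can be dropped afterwards.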
Then I get the error (see the screenshots here):
How can I solve this? Thanks!
The solution was to change the Spark session configuration via the workspace settings:
What it actually does or means, I honestly don't know.
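For reference, here is how a Spark session property can be changed, either per notebook or workspace-wide. This is only a sketch of the mechanism: the actual property from the accepted solution was shown in a screenshot that did not survive, so `spark.some.property` below is a placeholder, not the real setting.

```python
# Placeholder illustration of changing a Spark session property from a
# notebook. "spark.some.property" stands in for the actual setting, which
# was only visible in the (missing) screenshot.
spark.conf.set("spark.some.property", "false")

# The same kind of property can also be applied workspace-wide in Fabric
# under Workspace settings > Data Engineering/Science > Spark settings,
# which is what the accepted solution did.
```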
Hi @JayJay11 ,
Thanks for using Fabric Community.
As a test, can you try writing to a new table instead?
dfCustomerJoined.write.mode("overwrite").format("delta").option("overwriteSchema", "true").save("Tables/factCustomerBaseJoined")
Let me know the output.
I got the same error even after using the mentioned spark properties.
Glad to know that you were able to resolve your issue. Please continue using Fabric Community on your further queries.