Hello all,
this is my second time trying to post this, as the first attempt was marked as spam (???).
I'm trying to write a DataFrame back to a Delta table and I get an error. This is the code I use to create the DataFrame:
dfCustomer = spark.read.table("LakehouseOperations.factCustomerBase")
dfScope = spark.read.table("LakehouseOperations.tecDataScopeSnapshots")
dfCustomerJoined = dfCustomer.join(dfScope, dfCustomer.snapshot_date == dfScope.scopeDate, "inner").drop("scopeDate", "scopeDateBuckets")
dfCustomerJoined.write.mode("overwrite").format("delta").option("overwriteSchema", "true").save("Tables/factCustomerBase")
Then I get the error; see the screenshots here:
How can I solve this? Thanks!
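(Editor's note: the error screenshots did not survive, so the exact cause isn't recorded here. That said, reading a Delta table and overwriting the same table within one job is a common trigger for write errors like this. A hedged sketch of one defensive pattern, materializing the join result to a staging table first and then overwriting the source, is below; the table and column names come from the post above, while the staging path `Tables/factCustomerBase_staging` is hypothetical.)

```python
from pyspark.sql import SparkSession

# Assumes this runs in a Fabric notebook where a Spark session is available.
spark = SparkSession.builder.getOrCreate()

dfCustomer = spark.read.table("LakehouseOperations.factCustomerBase")
dfScope = spark.read.table("LakehouseOperations.tecDataScopeSnapshots")

dfCustomerJoined = (
    dfCustomer
    .join(dfScope, dfCustomer.snapshot_date == dfScope.scopeDate, "inner")
    .drop("scopeDate", "scopeDateBuckets")
)

# Step 1: materialize the result to a staging location so the
# subsequent overwrite does not conflict with the read of
# factCustomerBase in the same query plan.
(dfCustomerJoined.write.mode("overwrite").format("delta")
    .save("Tables/factCustomerBase_staging"))

# Step 2: read the staged result back (breaking the lineage to the
# source table) and overwrite the original table.
(spark.read.format("delta").load("Tables/factCustomerBase_staging")
    .write.mode("overwrite").format("delta")
    .option("overwriteSchema", "true")
    .save("Tables/factCustomerBase"))
```

This is a sketch only; it requires a Spark environment with Delta Lake support (as in Fabric) and cannot be run outside one.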
Solved!
The solution was to change the Spark session configuration via the workspace settings:
What it actually does or means, I don't know.
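(Editor's note: the screenshot showing which properties were changed is missing, so the actual setting is not recorded in this thread. As a general note, Spark properties that are configurable at the workspace level can usually also be set per session from the notebook itself; the property name below is purely illustrative, not the one from the screenshot.)

```python
# Illustrative only: "spark.some.property" stands in for whatever
# property was changed in the workspace settings, which this thread
# does not record. Assumes a live Spark session named `spark`.
spark.conf.set("spark.some.property", "value")
```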
Hi @JayJay11 ,
Thanks for using Fabric Community.
As part of testing, could you try creating a new table?
dfCustomerJoined.write.mode("overwrite").format("delta").option("overwriteSchema", "true").save("Tables/factCustomerBaseJoined")
Let me know the output.
I got the same error even after using the mentioned spark properties.
Glad to know you were able to resolve your issue. Please continue using the Fabric Community for your further queries.