
alfBI
Helper V
Save a Spark dataframe in a Lakehouse with Schema Support enabled (Feasible?)

Hi,

 

We have created a Lakehouse with schema support enabled. Then we developed a notebook to save a PySpark dataframe as a Delta table:

 

spark_df.write.mode("append").format("delta").option("overwriteSchema", "true").partitionBy("MonthKey").saveAsTable("staffestablishmentplan")
 
but the following error occurs:
 
IllegalArgumentException: requirement failed: The provided partitioning does not match of the table. - provided: identity(MonthKey) - table:
 
 
It looks like it is not able to refer to the table. I have tried replacing the table name with dbo.<TableName>, but the error remains.
 
Is it not possible to save tables from notebooks in a Lakehouse with schema support?
 
I don't see this on the list of current limitations.
 
 
Regards,
 
 
 
1 ACCEPTED SOLUTION
Vinodh247
Resolver IV

It should be possible; the error points to a partition mismatch, not a problem with schema support itself.

  • Once a Delta table is created (for example, staffestablishmentplan), its partitioning columns are fixed unless the table is dropped and recreated. So if the table already exists without partitions, or with different partitions, then this code: spark_df.write.mode("append").format("delta").option("overwriteSchema", "true").partitionBy("MonthKey").saveAsTable("staffestablishmentplan") will throw an error.
  • Check the existing table's partitioning (see the sketch after this list): spark.sql("DESCRIBE DETAIL staffestablishmentplan").select("partitionColumns").show(truncate=False)

    If the output is [], then the table was created without partitions. In that case:

    • You cannot append with a new partitioning scheme unless you drop and recreate the table.

    • You need to either:

      • Remove .partitionBy("MonthKey") when appending, or

      • Drop and recreate the table with the desired partition.

  • If you are still early in development and can afford to lose the existing data, you can also overwrite the table with the new partitioning in a single write, as shown in the sketch below.
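
For example, a minimal sketch of the check and the overwrite option, assuming the spark session, spark_df, and table name from this thread (in a schema-enabled Lakehouse you may need the schema-qualified name, e.g. dbo.staffestablishmentplan):

# Inspect the partition columns of the existing Delta table.
detail = spark.sql("DESCRIBE DETAIL staffestablishmentplan")
partition_cols = detail.select("partitionColumns").first()[0]
print(partition_cols)  # [] means the table was created without partitions

# If you can afford to replace the existing data, recreate the table with
# the desired partitioning in one write (overwriteSchema allows the change):
spark_df.write.mode("overwrite").format("delta").option("overwriteSchema", "true").partitionBy("MonthKey").saveAsTable("staffestablishmentplan")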

Final recommendation to try:

  • Check if the table already exists.
  • If it exists without the MonthKey partition, you have two options (a combined sketch follows this list):
    • Drop it, then rerun your saveAsTable with partitionBy, or
    • Append without partitionBy if you cannot afford to drop it.
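
Putting that together, a hedged sketch of the full decision flow, assuming spark and spark_df as in the thread (adjust the table name for the dbo schema if needed):

# Branch on whether the table exists and how it is partitioned.
table_name = "staffestablishmentplan"
if spark.catalog.tableExists(table_name):
    partition_cols = spark.sql(f"DESCRIBE DETAIL {table_name}").select("partitionColumns").first()[0]
    if "MonthKey" in partition_cols:
        # Partitioning already matches; partitionBy can be omitted on append.
        spark_df.write.mode("append").format("delta").saveAsTable(table_name)
    else:
        # Mismatch: drop and recreate with the desired partitioning
        # (or append without partitionBy if the existing data must be kept).
        spark.sql(f"DROP TABLE {table_name}")
        spark_df.write.format("delta").partitionBy("MonthKey").saveAsTable(table_name)
else:
    # Table does not exist yet: create it partitioned from the start.
    spark_df.write.format("delta").partitionBy("MonthKey").saveAsTable(table_name)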

Please 'Kudos' and 'Accept as Solution' if this answered your query.

