giupegiupe
Helper I

bug #1069

Hi community

Does anyone have information on when this bug will be fixed?

[Screenshot: giupegiupe_0-1751047825663.png]

Does anyone have any idea how to work around it, while of course keeping:

    .option("delta.columnMapping.mode", "name") \
    .option("delta.minReaderVersion", "2") \
    .option("delta.minWriterVersion", "5") \

 

Should I create the tables in advance and use only append?

  .mode("append").saveAsTable(Destination_table_full_name) # Append data

 

 

1 ACCEPTED SOLUTION
v-hashadapu
Community Support

Hi @giupegiupe , Thank you for reaching out to the Microsoft Community Forum.

 

Create the Delta table in advance with a clean schema containing only supported types, then only ever append data using .mode("append"). This avoids triggering the metadata changes that break the sync. Avoid overwrite, schema evolution (mergeSchema), or any operation that alters the schema once the table is created. Try this:

df.write \
    .format("delta") \
    .option("delta.columnMapping.mode", "name") \
    .option("delta.minReaderVersion", "2") \
    .option("delta.minWriterVersion", "5") \
    .mode("append") \
    .saveAsTable(destination_table_full_name)
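Creating the table in advance could look like the following Spark SQL sketch. The lakehouse name, table name, columns, and types are placeholders for illustration, not from the original thread:

```sql
-- Hypothetical schema: replace the table and column names with your own.
CREATE TABLE IF NOT EXISTS my_lakehouse.destination_table (
    id        BIGINT,
    name      STRING,
    amount    DECIMAL(18, 2),
    load_date DATE
)
USING DELTA
TBLPROPERTIES (
    'delta.columnMapping.mode' = 'name',
    'delta.minReaderVersion'   = '2',
    'delta.minWriterVersion'   = '5'
);
```

Once the table exists with its final schema, subsequent writes should use only .mode("append") and never mergeSchema or overwrite, so the table protocol and column metadata stay stable.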

 

If this helped solve the issue, please consider marking it “Accept as Solution” so others with similar queries may find it more easily. If not, please share the details, always happy to help.
Thank you.


2 REPLIES 2

@v-hashadapu 

Thanks for the reply. The solution you describe is the one I adopted for 3 projects, but it failed for only one of them, producing the misalignment for which I opened this discussion (bug 1069? probably not).

Unlike the first two projects, where table creation and append always succeeded, only one thing changed in this new project: I created the Trial Capacity association and, on the same day, updated the data via notebook.

It seems it is necessary to "wait" (how long?) before loading data; in this project it was also a lot of data at once.
I say "seems" because I have no matching evidence from other situations, and confirmation that 1069 was probably not the problem comes from another project I encountered.

By renaming the tables and then restoring their original names, the alignment of the fields, and thus the data, magically returns to normal, though not for all tables.

So the solution you proposed, which I had already applied for practical configuration reasons, is the correct one. But I would add that you probably have to wait for the capacity "deployment" to complete before working on Fabric with data imports.

What remains unclear to me is why renaming brings the tables back into alignment between Parquet and the SQL endpoint, and when Fabric becomes fully "usable".
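For reference, the rename-and-rename-back workaround described above could be sketched in Spark SQL as follows; the lakehouse and table names are hypothetical placeholders:

```sql
-- Rename away and back, which appears to force the SQL endpoint
-- to re-sync the table metadata (hypothetical names).
ALTER TABLE my_lakehouse.sales RENAME TO my_lakehouse.sales_tmp;
ALTER TABLE my_lakehouse.sales_tmp RENAME TO my_lakehouse.sales;
```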

