Hello everyone,
I’m trying to shortcut/mirror several Delta tables from Azure Databricks to Microsoft Fabric. These tables have the checkpoint feature enabled, and Fabric fails with the following error:
Error: The Delta table uses the v2Checkpoint feature, which is not supported.
Error code: UnsupportedV2Checkpoint
Exception type: Microsoft.DeltaLogParser.Exceptions.DeltaLogParserUserException
Questions:
How can I disable the v2Checkpoint feature on existing Databricks Delta tables so I can bring them into Fabric without compromising the data?
Why are some tables created with these features enabled while others are not? (Same workspace, similar pipelines.)
Details (from Databricks catalog for the tables that fail):
delta:
checkpointPolicy: "v2"
columnMapping.maxColumnId: "16"
columnMapping.mode: "name"
enableDeletionVectors: "true"
enableRowTracking: "true"
feature.clustering: "supported"
feature.columnMapping: "supported"
feature.deletionVectors: "supported"
feature.domainMetadata: "supported"
feature.identityColumns: "supported"
feature.invariants: "supported"
feature.rowTracking: "supported"
feature.v2Checkpoint: "supported"
lastCommitTimestamp: "1760449400000"
lastUpdateVersion: "302"
minReaderVersion: "3"
minWriterVersion: "7"
rowTracking.materializedRowCommitVersionColumnName: "_row-commit-version-col-dbf6ae64-37a5-49ee-9a03-d36feb509e0a"
rowTracking.materializedRowIdColumnName: "_row-id-col-0a7bda7f-5e15-4595-a6aa-c92029bac0e6"
Any guidance (workarounds, migration steps, or best practices to avoid enabling unsupported features) would be greatly appreciated. Thanks!
You can enforce classic checkpoints in Databricks by setting:
spark.conf.set("spark.databricks.delta.checkpoint.writeFormat", "json")
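Note that a session-level setting only affects checkpoints written by new transactions; a table that has already committed v2Checkpoint keeps the feature in its protocol until it is explicitly removed. A possible follow-up for existing tables, sketched under the assumption that your Databricks runtime supports the documented `delta.checkpointPolicy` table property and `ALTER TABLE ... DROP FEATURE` command (replace `my_catalog.my_schema.my_table` with your own table name):

```sql
-- Switch future checkpoints back to the classic format
ALTER TABLE my_catalog.my_schema.my_table
  SET TBLPROPERTIES ('delta.checkpointPolicy' = 'classic');

-- Remove the v2Checkpoint feature from the table protocol
-- so readers such as Fabric no longer see it
ALTER TABLE my_catalog.my_schema.my_table DROP FEATURE v2Checkpoint;
```

Depending on the runtime version, `DROP FEATURE` may ask you to retry with a history-truncating variant after the retention period elapses, so check the Databricks table-feature documentation for your version. Your table details also show `deletionVectors` and `rowTracking` enabled; if Fabric reports further unsupported features after v2Checkpoint is gone, they may need to be dropped in the same way.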
Hello, this is clear. Thank you for your help.
Hi @MarouaneB,
Thank you for reaching out to the Microsoft Fabric Forum Community, and special thanks to @Thomaslleblanc for the prompt and helpful response.
Just following up to see if the response provided by community members was helpful in addressing the issue. If the issue still persists, feel free to reach out if you need any further clarification or assistance.
Best regards,
Prasanna Kumar