aniruddhabh
Microsoft Employee

Resilience Check Failed — Table Change State Unknown After Shortcut Sync in SQL Analytics Endpoints

I’m encountering an issue while syncing data after creating a new shortcut in a Fabric Lakehouse. When I trigger metadata sync or try to query the table from the SQL Endpoint, I receive the following error:

Error Message:

“Resilience check failed: table change state is Unknown, indicating potential data inconsistency or storage communication issues.”

Context

  • The shortcut was created on a Delta folder that is fully accessible in OneLake.
  • After creation, the table appears under the Lakehouse, but it does not appear under the SQL endpoint, and refreshing fails.
  • I attempted the standard steps (refresh SQL endpoint, delete & recreate shortcut), but the state remains Unknown.

Steps Tried

  • Deleted and recreated the shortcut
  • Triggered Refresh on SQL Endpoint
  • Verified underlying Delta files exist and are readable

Ask

Has anyone faced this “Resilience check failed / table change state Unknown” issue recently?

Looking for guidance on:

  1. Whether there is a known workaround to force Fabric to rebuild the internal metadata
  2. Any recommended diagnostic checks

Any help or insights would be greatly appreciated!

1 ACCEPTED SOLUTION

Hi, it turned out to be an internal issue on the platform side. The support team applied a fix, and that resolved the problem.


13 REPLIES
v-pgoloju
Community Support

Hi @aniruddhabh,

 

Thanks for the update.

 

Regards,

Prasanna Kumar

v-pgoloju
Community Support

Hi @aniruddhabh,

 

Great to hear that it's working as expected on your end! Could you please share the solution? It would be really helpful for others in the community who might be facing similar issues and can address them quickly. Also, I would suggest accepting your approach as the solution so that it can benefit others as well.

 

Thanks & Regards,

Prasanna Kumar

Hi, it turned out to be an internal issue on the platform side. The support team applied a fix, and that resolved the problem.
aniruddhabh
Microsoft Employee

The issue has been resolved.

Hi @aniruddhabh,

Could you please share how the issue was mitigated? Also, any important learnings from it? Thanks.

Hi, it turned out to be an internal issue on the platform side. The support team applied a fix, and that resolved the problem.
v-pgoloju
Community Support

Hi @aniruddhabh,

 

Thank you for reaching out to the Microsoft Fabric Forum Community, and special thanks to @stoic-harsh, @shekharkrdas, and @ssrithar for their prompt and helpful responses.

Just following up to see if the responses provided by community members were helpful in addressing the issue. If the issue still persists, feel free to reach out for any further clarification or assistance.

 

Best regards,
Prasanna Kumar

 

Issue has been resolved.

ssrithar
Resolver II

Hi @aniruddhabh,

 

I used the steps below to resolve the issue:

 

Use a notebook to force metadata registration. Even though the shortcut is created, the table may not be auto-registered in Spark/SQL. You can manually register it in Spark.
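A minimal sketch of the registration step, assuming placeholder names for the table and shortcut path (neither is from this thread); the generated SQL would be run with `spark.sql(...)` in a Fabric notebook:

```python
def register_shortcut_sql(table_name: str, delta_path: str) -> str:
    """Build the Spark SQL that registers an existing Delta folder as a table."""
    return (
        f"CREATE TABLE IF NOT EXISTS {table_name} "
        f"USING DELTA LOCATION '{delta_path}'"
    )

# Placeholder table name and relative shortcut path under the Lakehouse.
sql = register_shortcut_sql("sales_external", "Tables/sales_shortcut")
print(sql)
# In a Fabric notebook you would then run: spark.sql(sql)
```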

 

Refresh metadata in the Spark engine, not just the SQL endpoint. Sometimes the SQL endpoint refresh UI doesn't push updates correctly from OneLake. Run the refresh in a Spark notebook.
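A sketch of the Spark-side refresh, with a placeholder table name; either the SQL command or the catalog API clears Spark's cached metadata for the table:

```python
table = "sales_external"  # placeholder table name, not from the thread

# REFRESH TABLE invalidates Spark's cached file listings and schema for the table.
refresh_sql = f"REFRESH TABLE {table}"
print(refresh_sql)

# In a Fabric notebook, either of these would execute the refresh:
#   spark.sql(refresh_sql)
#   spark.catalog.refreshTable(table)
```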

 

Check for partial commits or Delta corruption.

Even if files exist, metadata sync may fail if:

  • There’s a partial _delta_log commit (e.g., _commit.json.tmp)

  • There’s a missing _delta_log/00000.json
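The two checks above can be automated with plain Python against an accessible copy of the table folder. The file layout follows the Delta transaction log convention (20-digit zero-padded commit versions); the demo directory at the end is made up:

```python
import pathlib
import tempfile

def check_delta_log(table_dir: str) -> list[str]:
    """Return a list of problems found in a Delta table's _delta_log folder."""
    problems = []
    log_dir = pathlib.Path(table_dir) / "_delta_log"
    if not log_dir.is_dir():
        return ["missing _delta_log directory"]
    # The version-0 commit must exist; its name is the version zero-padded to 20 digits.
    if not (log_dir / f"{0:020d}.json").exists():
        problems.append("missing version-0 commit (00000000000000000000.json)")
    # Leftover temp files indicate a partial / interrupted commit.
    for tmp in sorted(log_dir.glob("*.tmp")):
        problems.append(f"partial commit artifact: {tmp.name}")
    return problems

# Demo against a throwaway table layout where only version 3 was committed.
with tempfile.TemporaryDirectory() as d:
    log = pathlib.Path(d) / "_delta_log"
    log.mkdir()
    (log / f"{3:020d}.json").write_text("{}")
    print(check_delta_log(d))  # reports the missing version-0 commit
```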

 

If this post helps, please consider giving a Kudo or accepting it as a solution to help other members find it more quickly.
If I misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

 

 

shekharkrdas
New Member

I am also facing the same issue. My data resides inside Fabric.

stoic-harsh
Resolver I

Hi @aniruddhabh,

 

Can you confirm if your shortcut points to a Delta table created outside Fabric (for example, Databricks or Synapse Spark)? Fabric Spark can read such tables, but the SQL Analytics Endpoint requires full ownership of the Delta metadata, and may fail when it encounters metadata Fabric didn’t create.

 

If this is the case, the simplest workaround would be to materialize the data inside the Lakehouse using Dataflow Gen2 or a Copy activity, so the _delta_log is fully authored by Fabric. You can schedule refreshes if the external data is updated regularly.

 

Please share if your scenario is different, or if you find another workaround.

Hi @stoic-harsh, I'm not the originator of this thread, but since AI brought me here I thought it would be fair to share a valuable finding from my agent. Regarding your suggestion to materialize data inside Fabric: while that definitely works, there is a specific technical reason why the SQL Endpoint "rejects" these external tables. If you are from the Microsoft team, feel free to correct me; I don't want to confuse AI (or regular humans).

 

"Resilience check failed" on Databricks Shortcuts

The "Resilience check failed" error is typically a platform validation mismatch rather than data corruption. It occurs due to a difference in how Databricks and the Fabric SQL Endpoint validate Delta Lake transaction logs.

The Root Cause: Missing Version 0 

Databricks automatically cleans up the _delta_log based on retention policies, such as the default 30-day window. Since Databricks can reconstruct table state from a checkpoint, it often deletes the initial commit (Version 0) once it is no longer needed for time travel.

However, while Fabric Spark can read these tables, the Fabric SQL Analytics Endpoint enforces a "Chain-of-Custody" validation that mandates the existence of Version 0. If that specific file is missing, the endpoint flags the table as a failure.

 

Diagnostic Test (Fabric Notebook)

You can use this PySpark snippet to confirm if your shortcut is missing the required initial commit:

# Check if the mandatory Version 0 file exists in the Delta log
path = "abfss://[workspace_id]@onelake.dfs.fabric.microsoft.com/[item_id]/Tables/[table_name]/_delta_log/00000000000000000000.json"
try:
    notebookutils.fs.ls(path)
    print("Version 0 exists - table should be accessible.")
except Exception:
    # notebookutils.fs.ls raises when the path does not exist
    print("Version 0 is missing - this triggers the resilience check failure.")

 

Recommended Remediation

  • Modify Source Retention: In Databricks, increase delta.logRetentionDuration to ensure the initial log history is preserved.
  • Force Fresh Logs: Run an OPTIMIZE command or a dummy write to the source table to generate a fresh checkpoint and transaction chain.
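The two remediation steps can be sketched as Databricks-side SQL; the three-part table name and the retention interval are placeholders. Note that raising retention prevents future log cleanup but cannot restore a commit that was already deleted, which is why the second command (any write or compaction) is needed to produce a fresh checkpoint and commit chain:

```python
table = "catalog.schema.sales"  # placeholder three-part Databricks table name

remediation_sql = [
    # Keep commit JSONs long enough that the log history is not vacuumed away.
    f"ALTER TABLE {table} SET TBLPROPERTIES "
    f"('delta.logRetentionDuration' = 'interval 3650 days')",
    # A compaction (or any write) produces a new checkpoint and transaction chain.
    f"OPTIMIZE {table}",
]
for cmd in remediation_sql:
    print(cmd)  # each would be run with spark.sql(cmd) on the Databricks side
```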

Note 1: The data remains 100% safe and readable via Fabric Spark notebooks. This issue specifically impacts the SQL Analytics Endpoint and Direct Lake connectivity (only Direct Lake on SQL endpoint, not Direct Lake on OneLake).

Note 2: As of today, Fabric Spark supports V2 checkpoints on Runtime 1.3 only, but the SQL engine does not support V2 checkpoints (it's on the roadmap). For Fabric SQL, only classic V1 checkpoints are supported (no support for multi-part checkpoints either).

Hi @rabbyn,

Thanks for sharing. I will definitely give a try to what you suggested (looking for version 0 in the Delta log in Fabric vs. Databricks) and update here with the findings.
