
aniruddhabh
Microsoft Employee

Resilience Check Failed — Table Change State Unknown After Shortcut Sync in SQL Analytics Endpoints

I’m encountering an issue while syncing data after creating a new shortcut in a Fabric Lakehouse. When I trigger metadata sync or try to query the table from the SQL Endpoint, I receive the following error:

Error Message:

“Resilience check failed: table change state is Unknown, indicating potential data inconsistency or storage communication issues.”

Context

  • The shortcut was created on a Delta folder that is fully accessible in OneLake.
  • After creation, the table appears under the Lakehouse, but it does not appear under the SQL endpoint, and refreshing fails.
  • I attempted the standard steps (refresh SQL endpoint, delete & recreate shortcut), but the state remains Unknown.

Steps Tried

  • Deleted and recreated the shortcut
  • Triggered Refresh on SQL Endpoint
  • Verified underlying Delta files exist and are readable

Ask

Has anyone faced this “Resilience check failed / table change state Unknown” issue recently?

Looking for guidance on:

  1. Whether there is a known workaround to force Fabric to rebuild the internal metadata
  2. Any recommended diagnostic checks

Any help or insights would be greatly appreciated!

10 REPLIES
v-pgoloju
Community Support

Hi @aniruddhabh,

 

Great to hear that it's working as expected on your end! Could you please share the solution? It would be really helpful for others in the community who might be facing similar issues, so they can resolve them quickly. Also, I would suggest accepting your approach as the solution so that it can benefit others as well.

 

Thanks & Regards,

Prasanna Kumar

aniruddhabh
Microsoft Employee

The issue has been resolved.

Hi @aniruddhabh,

Could you please share how the issue was mitigated? Also, any important learnings out of it? Thanks.

Hi, it turned out to be an internal issue on the platform side. The support team applied a fix, and that resolved the problem.
v-pgoloju
Community Support

Hi @aniruddhabh ,

 

Thank you for reaching out to the Microsoft Fabric Forum Community, and special thanks to @stoic-harsh, @shekharkrdas, and @ssrithar for their prompt and helpful responses.

Just following up to see if the responses provided by community members were helpful in addressing the issue. If the issue still persists, feel free to reach out for any further clarification or assistance.

 

Best regards,
Prasanna Kumar

 

Issue has been resolved.

ssrithar
Resolver II

Hi @aniruddhabh ,

 

I have used the steps below to resolve the issue:

 

-- Use a Notebook to Force Metadata Registration. Even though the shortcut is created, the table may not be auto-registered in Spark/SQL. You can manually register it in Spark, for example as sketched below.
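A minimal sketch of that registration, assuming a Fabric notebook attached to the Lakehouse; the table name, workspace, lakehouse, and shortcut folder are placeholders to replace with your own:

    # Register the shortcut's Delta folder as a table in the Spark catalog.
    # <workspace>, <lakehouse>, <shortcut_folder>, and my_shortcut_table are placeholders.
    shortcut_path = (
        "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/"
        "<lakehouse>.Lakehouse/Tables/<shortcut_folder>"
    )

    spark.sql(f"""
        CREATE TABLE IF NOT EXISTS my_shortcut_table
        USING DELTA
        LOCATION '{shortcut_path}'
    """)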

 

-- Refresh Metadata in the Spark Engine, Not Just the SQL Endpoint. Sometimes the SQL Endpoint refresh UI doesn't push updates correctly from OneLake. Run something like the following in a Spark notebook.
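For example (a sketch; my_shortcut_table is the placeholder table name used above):

    # Ask Spark to drop any cached metadata and file listings for the table
    # so the next read picks up the current state in OneLake.
    spark.sql("REFRESH TABLE my_shortcut_table")
    # Equivalent call through the catalog API:
    spark.catalog.refreshTable("my_shortcut_table")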

 

-- Check for Partial Commits or Delta Corruption. 

Even if the data files exist, metadata sync may fail if (a quick way to inspect the log is sketched after this list):

  • There’s a partial _delta_log commit (e.g., _commit.json.tmp)

  • There’s a missing initial commit file in _delta_log (00000000000000000000.json)
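One quick way to inspect the log from a Fabric notebook (a sketch; the OneLake path is a placeholder, and mssparkutils is the file-system utility preinstalled in Fabric/Synapse notebooks):

    # List the _delta_log folder and look for *.tmp leftovers or a missing
    # first commit file (00000000000000000000.json).
    log_path = (
        "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/"
        "<lakehouse>.Lakehouse/Tables/<shortcut_folder>/_delta_log"
    )
    for f in mssparkutils.fs.ls(log_path):
        print(f.name, f.size)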

 

If this post helps, please consider giving a Kudos or accepting it as a Solution so that other members can find it more quickly.
If I misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

 

 

shekharkrdas
New Member

I am also facing the same issue. My data resides inside Fabric.

stoic-harsh
Frequent Visitor

Hi @aniruddhabh,

 

Can you confirm if your shortcut points to a Delta table created outside Fabric (for example, Databricks or Synapse Spark)? Fabric Spark can read such tables, but the SQL Analytics Endpoint requires full ownership of the Delta metadata, and may fail when it encounters metadata Fabric didn’t create.

 

If this is the case, the simplest workaround would be to materialize the data inside the Lakehouse using Dataflow Gen2 or a Copy Activity, so the _delta_log is fully authored by Fabric (a Spark-notebook alternative is sketched below). You can schedule refreshes if the external data is updated regularly.
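A hedged Spark sketch of the same idea, if a notebook is easier than the UI tools (the shortcut folder and target table name are placeholders, and a default Lakehouse is assumed to be attached to the notebook):

    # Read through the shortcut, then write a copy whose _delta_log is authored by Fabric.
    df = spark.read.format("delta").load("Tables/<shortcut_folder>")
    df.write.format("delta").mode("overwrite").saveAsTable("materialized_copy")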

 

Please share if your scenario is different, or if you find another workaround.
