I’m encountering an issue while syncing data after creating a new shortcut in a Fabric Lakehouse. When I trigger metadata sync or try to query the table from the SQL Endpoint, I receive the following error:
Error Message:
“Resilience check failed: table change state is Unknown, indicating potential data inconsistency or storage communication issues.”
Has anyone faced this “Resilience check failed / table change state Unknown” issue recently?
Looking for guidance on what causes this error and how to resolve it. Any help or insights would be greatly appreciated!
Hi @aniruddhabh,
Great to hear that it's working as expected on your end! Could you please share the solution? It would be really helpful for others in the community who are facing similar issues, so they can resolve them quickly. Also, I would suggest accepting your approach as the solution so that it can benefit others as well.
Thanks & Regards,
Prasanna Kumar
The issue has been resolved.
Hi @aniruddhabh,
Could you please share how the issue was mitigated? Also, any important learnings from it? Thanks.
Hi @aniruddhabh,
Thank you for reaching out to the Microsoft Fabric Forum Community, and special thanks to @stoic-harsh, @shekharkrdas, and @ssrithar for their prompt and helpful responses.
Just following up to see if the responses provided by the community members were helpful in addressing the issue. If the issue still persists, feel free to reach out for any further clarification or assistance.
Best regards,
Prasanna Kumar
Issue has been resolved.
Hi @aniruddhabh,
I used the steps below to resolve the issue:
1. Use a notebook to force metadata registration. Even though the shortcut is created, the table may not be auto-registered in Spark/SQL. You can manually register it in Spark, as shown in the sketch below.
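A minimal sketch of the registration step, assuming the shortcut surfaces as a Delta folder under the Lakehouse Tables area; the table name and OneLake path below are placeholders, not my actual objects:

```python
# Run in a Fabric Spark notebook attached to the Lakehouse ("spark" is the
# session the notebook provides). Table name and path are hypothetical.
table_name = "my_shortcut_table"
delta_path = "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Tables/my_shortcut_table"

# Register the existing Delta folder as a table so the Spark catalog (and,
# after the next sync, the SQL Analytics Endpoint) can see it.
spark.sql(f"""
    CREATE TABLE IF NOT EXISTS {table_name}
    USING DELTA
    LOCATION '{delta_path}'
""")
```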
2. Refresh metadata in the Spark engine, not just the SQL Endpoint. Sometimes the SQL Endpoint refresh UI doesn't push updates correctly from OneLake. Run a refresh in a Spark notebook, as in the sketch below.
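Roughly what the refresh looked like (same hypothetical table name as above):

```python
# Clear Spark's cached metadata for the table so a fresh read picks up the
# current _delta_log state before triggering the SQL Endpoint sync again.
spark.sql("REFRESH TABLE my_shortcut_table")

# Optional sanity check: confirm the Delta files themselves are readable.
row_count = spark.table("my_shortcut_table").count()
print(f"Rows visible to Spark: {row_count}")
```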
3. Check for partial commits or Delta corruption. Even if the files exist, metadata sync may fail if:
- there is a partial _delta_log commit (e.g., a leftover _commit.json.tmp file), or
- the initial commit file in _delta_log (00000000000000000000.json) is missing.
A quick way to check is to list the _delta_log folder, as in the sketch after this list.
If this post helps, please consider giving it Kudos or accepting it as the solution so other members can find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
I am also facing the same issue. My data resides inside Fabric.
Hi @aniruddhabh,
Can you confirm if your shortcut points to a Delta table created outside Fabric (for example, Databricks or Synapse Spark)? Fabric Spark can read such tables, but the SQL Analytics Endpoint requires full ownership of the Delta metadata, and may fail when it encounters metadata Fabric didn’t create.
If this is the case, the simplest workaround would be to materialize the data inside the Lakehouse using Dataflow Gen2 or a Copy activity, so the _delta_log is fully authored by Fabric. You can schedule refreshes if the external data is updated regularly.
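If a notebook fits your pipeline better than Dataflow Gen2 or a Copy activity, a rough sketch of an equivalent materialization in Spark (table names and paths are hypothetical) would be:

```python
# Read the externally created Delta table via the shortcut, then rewrite it as a
# table owned by the Fabric Lakehouse so the _delta_log is authored by Fabric.
# Source path and target table name are hypothetical.
source_path = "Tables/external_shortcut"
target_table = "external_copy"

df = spark.read.format("delta").load(source_path)

# Overwrite keeps a scheduled run of this notebook in sync with the external source.
df.write.format("delta").mode("overwrite").saveAsTable(target_table)
```

Scheduling that notebook would play the same role as a scheduled Dataflow refresh.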
Please share if your scenario is different, or if you find another workaround.