r_Gao
Advocate I

Failed to complete the command because the underlying location does not exist (Lakehouse SQL Endpoint)

Our data hasn't been updated in a week, and this query, which had been running successfully for the past week, failed today with the error below:

 

SELECT * FROM table
WHERE 1=1
  AND id = <id>
  AND date >= '[3 years from now]'
ORDER BY updated_date

"internalMessage" : "('42000', \"[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Failed to complete the command because the underlying location does not exist. Underlying data description: table <table> (24596) (SQLExecDirectW)\")"

Has anyone encountered this, and does anyone know how to fix it?


8 REPLIES
v-prasare
Community Support

@r_Gao As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided for your issue worked. Please let us know if you need any further assistance.

 

If the issue still persists, I suggest you raise a support ticket so that the support team can assist you with the issue you are facing. Please follow the link below on how to raise a support ticket:

How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn

 

 

Thanks,

Prashanth Are

MS Fabric community support

 

If this post helps, please consider accepting it as the solution to help other members find it more quickly, and give Kudos if it helped you resolve your query.


v-prasare
Community Support

@r_Gao, As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided for your issue worked. Please let us know if you need any further assistance.

 

@nilendraFabric, Thanks for your prompt response.

 

 

 

Thanks,

Prashanth Are

MS Fabric community support

 

If this post helps, please consider accepting it as the solution to help other members find it more quickly, and give Kudos if it helped you resolve your query.

r_Gao
Advocate I

Running the same query in a %%sql cell in a notebook shows me the data, just not through the SQL endpoint.

`%%sql` bypasses the SQL Endpoint and runs Spark SQL directly against the Lakehouse's Delta tables.

Spark reads the Delta transaction logs directly from `/tables`, avoiding SQL Endpoint sync delays.
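A minimal sketch of that approach in Python, assuming a Fabric notebook with the Lakehouse attached as the default (so `spark` is the notebook's built-in SparkSession); `<table>`, `<id>`, and the date literal are the placeholders from the original query, not real values:

```python
# Python equivalent of a %%sql cell: Spark SQL reads the Delta transaction
# log itself, so it is not affected by SQL Endpoint metadata sync lag.
# `spark` is the SparkSession the Fabric notebook provides; <table>, <id>,
# and the date literal are placeholders carried over from the original post.
df = spark.sql("""
    SELECT *
    FROM <table>
    WHERE 1 = 1
      AND id = <id>
      AND date >= '[3 years from now]'
    ORDER BY updated_date
""")
df.show()
```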

 

For now, if you want to use SQL, go with this approach.

If this issue persists, raise a support ticket.

 

nilendraFabric
Super User

Hello @r_Gao 

 

The SQL Endpoint's metadata (table/file references) becomes temporarily out of sync with the actual Delta tables in the Lakehouse. This causes queries to reference outdated file paths (e.g., GUID-based Parquet files that no longer exist). The error code `24596` explicitly indicates this mismatch.

 

This issue has been discussed many times on the forum, although it is never mentioned among the known issues in the Fabric docs.

 

Try manually triggering a metadata refresh on the Lakehouse via Fabric's UI to force the SQL Endpoint to sync:

 

Right-click Lakehouse → Refresh

 

Modify your pipeline to include a retry loop with a 30–60 second wait between attempts.
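A hedged sketch of such a retry loop in Python with pyodbc; the connection string, attempt count, wait time, and function name are illustrative assumptions, not details from this thread:

```python
import time

import pyodbc

# Placeholder connection string for a Fabric Lakehouse SQL Endpoint.
CONN_STR = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<your-sql-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your-lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
)

# Stand-in for the failing query; <table> is the placeholder from the post.
QUERY = "SELECT * FROM <table> WHERE 1=1 AND id = ? ORDER BY updated_date"


def query_with_retry(params, attempts=3, wait_seconds=45):
    """Run QUERY, retrying only on the SQL Endpoint metadata-sync error."""
    for attempt in range(1, attempts + 1):
        conn = pyodbc.connect(CONN_STR)
        try:
            return conn.cursor().execute(QUERY, params).fetchall()
        except pyodbc.Error as exc:
            # Re-raise anything else, or give up once the attempts are spent.
            if "underlying location does not exist" not in str(exc) or attempt == attempts:
                raise
            time.sleep(wait_seconds)  # give the SQL Endpoint time to resync
        finally:
            conn.close()


# Example usage (id value elided as in the original post):
# rows = query_with_retry(params=[<id>])
```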

Workarounds like forced refreshes and retries often resolve it. For critical systems, consider using Delta Lake directly.
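For the direct Delta route, a minimal sketch, assuming a Fabric notebook with a default Lakehouse attached; the `Tables/<table>` path follows the Lakehouse's managed-table layout, and the names and filter values are placeholders:

```python
# Read the Delta table straight from the Lakehouse's managed storage,
# bypassing the SQL Endpoint entirely. `spark` is the notebook's built-in
# SparkSession; the path and filter values are placeholders.
df = spark.read.format("delta").load("Tables/<table>")
df.filter("id = <id>").orderBy("updated_date").show()
```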

 

If this is helpful, please accept the answer.

I've tried hitting the following buttons and it still gives me the same error:

[Screenshots: the Lakehouse refresh buttons in the Fabric UI]

@r_Gao did you manage to solve this issue?

I also tried the refresh buttons, but it did not fix it.

@nilendraFabric any ideas?
