
todd-wilson
Frequent Visitor

Refresh SQL Endpoint API

Items - Refresh Sql Endpoint Metadata - REST API (SQLEndpoint) | Microsoft Learn

I'm confused. We're calling this API before running some validations on the SQL endpoint, and the data is lagging, so I suspect it only updates the metadata. Please let me know...

Does this API call only refresh metadata such as schema and object definitions, or does it also pick up data changes? If it only refreshes the metadata, what process makes new data visible on the endpoint (is it auto-magical)?

We're querying the Spark Delta Lake directly as we commit data, so this isn't an issue for us, but what should Power BI users expect? Thank you for your help.

1 ACCEPTED SOLUTION
tayloramy
Super User

Hi @todd-wilson

 

@v-echaithra is incorrect. 

 

Refreshing the SQL Endpoint does update the data that is visible in the endpoint. 

https://blog.fabric.microsoft.com/en-us/blog/refresh-sql-analytics-endpoint-metadata-rest-api-now-in...

 

 

That was the entire point of the metadata refresh API. 
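For reference, a minimal sketch of calling the refresh API with Python's standard library. The workspace ID, SQL endpoint ID, and bearer token are placeholders you'd supply yourself; check the Microsoft Learn page linked in the original post for the exact path and response shape.

```python
# Minimal sketch of calling the Refresh SQL Endpoint Metadata API.
# workspace_id, sql_endpoint_id, and the bearer token are placeholders;
# acquire a real Microsoft Entra token however your app authenticates.
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_refresh_url(workspace_id: str, sql_endpoint_id: str) -> str:
    # POSTing to this URL triggers the metadata refresh for the endpoint.
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/sqlEndpoints/{sql_endpoint_id}/refreshMetadata")

def refresh_sql_endpoint(workspace_id: str, sql_endpoint_id: str,
                         token: str) -> int:
    req = urllib.request.Request(
        build_refresh_url(workspace_id, sql_endpoint_id),
        data=json.dumps({}).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    # Network call; returns the HTTP status on success.
    with urllib.request.urlopen(req) as resp:
        return resp.status
```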

 





If you found this helpful, consider giving some Kudos.
If I answered your question or solved your problem, mark this post as the solution!

Join the Fabric Discord!

Proud to be a Super User!






5 REPLIES
v-echaithra
Community Support

Hi @todd-wilson ,

May I ask if you have resolved this issue? Please let us know if you have any further issues, we are happy to help.

Thank you.

v-echaithra
Community Support

Hi @todd-wilson ,

We’d like to follow up regarding the recent concern. Kindly confirm whether the issue has been resolved, or if further assistance is still required. We are available to support you and are committed to helping you reach a resolution.

Thank you.

v-echaithra
Community Support

Hi @tayloramy , @todd-wilson ,

Thanks for the clarification here.

To refine my earlier response, the Refresh SQL Endpoint operation not only syncs metadata but also makes newly committed data visible in the SQL endpoint. However, this happens through an eventually consistent layer, so a short delay can still occur before the latest data is reflected.
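Because the sync is eventually consistent, validation code that queries the endpoint right after a commit may want a small retry loop instead of a single check. A generic sketch (the timeout and interval values here are illustrative assumptions, not documented limits):

```python
import time

def wait_until(check, timeout_s: float = 120, interval_s: float = 5) -> bool:
    # Poll a zero-argument validation callable until it returns True
    # or the timeout expires; tolerates the endpoint's short sync delay.
    deadline = time.monotonic() + timeout_s
    while True:
        if check():
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval_s)
```

Usage would look like `wait_until(lambda: row_count_matches_expected(), timeout_s=60)`, where the lambda wraps whatever query you run against the SQL endpoint.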

 

Hope this clarifies.


v-echaithra
Community Support

Hi @todd-wilson ,

Thanks for reaching out, this behavior is expected based on how the SQL analytics endpoint operates in Microsoft Fabric.

The Refresh SQL Endpoint Metadata API only synchronizes the metadata layer (tables, schema, object definitions) of the SQL endpoint with the underlying Lakehouse/Delta tables; it does not refresh or reload the actual data. As a result, even after data is committed to Delta via Spark, the SQL endpoint may still return stale results due to its decoupled and eventually consistent query layer, which can involve caching and asynchronous synchronization. This is why you’re observing a lag when querying the endpoint immediately after ingestion, even though querying Delta directly reflects the latest data.

To address this, there are two recommended approaches depending on your requirement.

Option 1 (preferred for deterministic scenarios): perform validations directly against the Lakehouse/Delta tables using Spark or Direct Lake, which guarantees immediate consistency after commit and avoids the SQL endpoint latency entirely.
Option 2 (if the SQL endpoint must be used): introduce a controlled orchestration pattern: after writing data, call the metadata refresh API (if schema changes are involved), allow a short delay for the endpoint to synchronize, and for downstream consumers like Power BI, trigger a dataset refresh to ensure the latest data is picked up. This gives more predictable behavior, although it still operates under eventual consistency rather than strict real-time guarantees.
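The Option 2 orchestration above can be sketched roughly as follows. The helper and parameter names are ours (not an official SDK), the settle window is an arbitrary assumption, and the token is a placeholder; verify the exact REST paths on Microsoft Learn before relying on them.

```python
# Hypothetical orchestration sketch: refresh SQL endpoint metadata,
# wait briefly for the eventually consistent sync, then trigger a
# Power BI dataset refresh for downstream consumers.
import json
import time
import urllib.request

def fabric_refresh_url(workspace_id: str, sql_endpoint_id: str) -> str:
    return (f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
            f"/sqlEndpoints/{sql_endpoint_id}/refreshMetadata")

def pbi_refresh_url(group_id: str, dataset_id: str) -> str:
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")

def _post(url: str, token: str) -> int:
    # Bare POST with a bearer token; returns the HTTP status.
    req = urllib.request.Request(
        url,
        data=json.dumps({}).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # network call, not run here
        return resp.status

def orchestrate(workspace_id, sql_endpoint_id, group_id, dataset_id,
                token, settle_s: float = 30) -> None:
    # 1. Sync the SQL endpoint's metadata with the underlying Delta tables.
    _post(fabric_refresh_url(workspace_id, sql_endpoint_id), token)
    # 2. Give the eventually consistent layer a short settle window.
    time.sleep(settle_s)
    # 3. Trigger the downstream Power BI dataset refresh.
    _post(pbi_refresh_url(group_id, dataset_id), token)
```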

Hope this helps.
Chaithra E.
