Hi all,
I have a quick question. I know Microsoft is already tracking the current issue where the Lakehouse SQL analytics endpoint takes some time to sync metadata with the Lakehouse. Is there an ETA for when it will be fixed? The sync delay makes the SQL endpoint hard to rely on, since it can sometimes take a long time to reflect recent changes.
Thanks
Hi,
I was about to create a topic on what I believe is a similar issue.
In my pipeline, I have an audit step that uses the SQL endpoint for validation checks. However, these checks often produce false failures because the queries appear to run against outdated metadata: they do not reflect updates that have just occurred in the pipeline. For example, if Table X initially has 100 records and the pipeline inserts 50 more, the table count still shows 100 instead of 150.
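One generic way to make an audit step tolerant of the sync lag is to poll the count query until it reaches the expected value or a timeout expires. The sketch below is not from this thread; `wait_for_count` and the timeout values are made up, and `get_count` stands in for whatever callable runs your `SELECT COUNT(*)` against the SQL endpoint.

```python
import time

def wait_for_count(get_count, expected, timeout_s=600, poll_s=15):
    """Poll a count query until it reaches `expected` or times out.

    get_count: zero-argument callable that runs the audit query (e.g. a
    SELECT COUNT(*) against the SQL endpoint) and returns an int.
    Returns the last observed count, or raises TimeoutError if the
    endpoint never catches up within timeout_s seconds.
    """
    deadline = time.monotonic() + timeout_s
    while True:
        observed = get_count()
        if observed >= expected:
            return observed
        if time.monotonic() >= deadline:
            raise TimeoutError(
                f"endpoint still stale: saw {observed}, expected {expected}")
        time.sleep(poll_s)
```

This doesn't fix the underlying sync delay, but it turns a hard false failure into a bounded wait, which may be acceptable for audit steps.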
I believe this is related to the issue being discussed. Is the upcoming API-based refresh trigger that was referred to the only planned fix?
//Alexander
Hi @p_da,
@AlexanderPowBI shared a potential workaround. If possible, please review the response and give it a try to see if it resolves the issue.
Thank You.
Hi @AlexanderPowBI ,
Thanks for bringing this up. It looks like the issue you're facing is related to the ongoing delay in SQL Endpoint metadata sync in Lakehouse. As mentioned earlier, SQL Endpoints may take a bit of time to reflect recent changes, which can lead to outdated query results.
Here are a couple of alternative approaches that might help.
If you're using Power BI, switching to Direct Lake mode can bypass SQL Endpoints and provide you with near real-time data access.
Instead of relying only on SQL Endpoints, consider querying your Lakehouse data using Notebooks or the Lakehouse REST API to get the most up-to-date results.
Hope this helps! If you need any further clarification, feel free to ask.
If my response solved your query, please mark it as the Accepted solution to help others find it easily!
And if my answer was helpful, I'd really appreciate a 'Kudos'.
As my issue is ETL / pipeline related, neither of those helps.
What I did as a solution, until MS releases the API that @frithjof_v mentions, is to use the "unsupported script" for updating SQL endpoints, written by someone at MSFT, found here:
What I did, if anyone is interested:
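The linked script's details aren't reproduced in this thread, so here is only a rough illustration of the general pattern it follows: an authenticated POST that asks the service to resync the endpoint's metadata. The URL path, function name, and payload below are hypothetical placeholders, not the real (unsupported) API route, which may change at any time.

```python
import json
import urllib.request

# Hypothetical route for illustration only -- the actual unsupported
# script linked above uses its own internal API path.
REFRESH_URL = ("https://api.fabric.microsoft.com/v1/workspaces/"
               "{ws}/sqlEndpoints/{ep}/refresh")

def build_refresh_request(workspace_id, endpoint_id, token):
    """Build an authenticated POST asking the service to resync metadata.

    token is an AAD bearer token for the Fabric/Power BI scope; how you
    obtain it depends on your auth setup (interactive, service principal...).
    """
    url = REFRESH_URL.format(ws=workspace_id, ep=endpoint_id)
    return urllib.request.Request(
        url,
        data=json.dumps({}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```

You would then send the request with `urllib.request.urlopen(...)` (or `requests`) as a pipeline step after your writes complete, before any SQL-endpoint reads.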
Best regards,
Alexander
Hi @AlexanderPowBI ,
Thank you for providing your insights and workaround. For further assistance, please continue the discussion in the Fabric Community.
Regards,
Yugandhar.
Hi @p_da ,
We wanted to check in, as we haven't heard back from you. Did our solution work for you? If you need any more help, please don't hesitate to ask. Your feedback is very important to us, and we hope to hear from you soon.
Thank You.
Hi @p_da ,
As we haven’t heard back from you, we wanted to kindly follow up and check whether the solution we provided resolved your issue. Please let us know if you need any further assistance.
Your feedback is important to us. Looking forward to your response.
Thank You.
Hi @p_da ,
Thank you for reaching out to the Microsoft Fabric community.
We are aware that the SQL endpoint sync issue is impacting your work. As an alternative, we recommend using Direct Lake.
Direct Lake handles large amounts of data efficiently. It quickly loads Delta tables, stored as Parquet files in OneLake (the single store for analytics data), directly into memory, and queries against the semantic model then run at high performance. This eliminates slow, expensive data imports, enables fast loading with frequent and quick refresh operations, and makes efficient use of capacity resources.
For more details, please refer to the attached official Microsoft documentation on Direct Lake. It provides comprehensive information to address your queries.
Link: Direct Lake overview - Microsoft Fabric | Microsoft Learn.
If my answer addressed your query, kindly mark it as the Accepted Solution to assist others.
I'd also be grateful for a 'Kudos' if you found my response useful!
Hi @p_da ,
We noticed we haven't received a response from you yet, so we wanted to follow up and ensure the solution we provided addressed your issue. If you require any further assistance or have additional questions, please let us know.
Your feedback is valuable to us, and we look forward to hearing from you soon.
Thank You.
This is the only planned improvement I know about: https://learn.microsoft.com/en-us/fabric/release-plan/data-warehouse#refresh-sql-analytics-endpoint-...
REST API for refreshing SQL Analytics Endpoint.
That feature is scheduled for Q1 2025.