I have a number of tables loaded from a large set of incremental JSON files. Once my load completed, I noticed that some of the tables I can see in the Lakehouse are not showing up in the SQL Endpoint. I've tried refreshing and restarting my browser, but still no luck.
I also noted that SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES does not include the missing Delta tables, yet I can clearly see that they exist in the Lakehouse, and checking their properties there indicates that all the missing tables are both "Managed" and "Delta" tables.
I've been running initial loads repeatedly, and my code drops the Delta table for the initial load. It seems the ones missing are the ones that took the longest to load.
I have a Data Pipeline that uses a Lookup to return metadata to a ForEach (all iterations run in parallel).
In the ForEach I call a "controller" notebook, passing the metadata to it.
The controller notebook calls a "metadata parser" notebook, and the result of the parser is passed to a call of my "data notebook". I was experiencing timeouts on longer-running jobs, so I set the timeout parameter on the msu.notebook.run call for the "data notebook" to 2400 seconds and the issue was resolved.
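For reference, a minimal sketch of that controller pattern (assuming msu is an alias for mssparkutils; the notebook names and parameters below are placeholders, not my actual code):

```python
from notebookutils import mssparkutils as msu  # aliasing mssparkutils as "msu" is an assumption

def run_for_table(metadata: dict):
    # Parse the pipeline metadata in a helper notebook (placeholder name)
    parsed = msu.notebook.run("metadata_parser", 600, {"metadata": str(metadata)})
    # Call the data notebook with an explicit 2400-second timeout so long initial loads
    # are not cut off before they finish
    return msu.notebook.run("data_notebook", 2400, {"config": parsed})
```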
I am running this inside a trial Fabric capacity.
Any help appreciated.
Thanks @v-nikhilan-msft
I note that after the weekend the missing tables have now shown up in the SQL Endpoint. However, I'm not sure the business is prepared to wait several days for data to be available after ingestion.
How are you loading these tables into the lakehouse?
I'm creating tables using:
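(The actual snippet wasn't captured in this thread; a common Spark pattern for creating a managed Delta table from a Lakehouse notebook, shown here only as an assumed illustration, looks like this:)

```python
# Assumed illustration only, not the poster's actual code.
# Read the incremental JSON files and write them as a managed Delta table.
df = spark.read.json("Files/raw/my_source/*.json")                   # path is a placeholder
df.write.format("delta").mode("overwrite").saveAsTable("my_table")   # table name is a placeholder
```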
The load to "dbo" schema preview feature in the Lakehouse is causing a failure to load tables in the SQL Endpoint. Make sure not to enable this preview feature; it is buggy.
Hi @demothy
Thanks for providing the details. Since you are able to see the tables now, this might have been a temporary glitch. Please let me know if you face the issue again.
Thanks
I'm having the same issue: I can see the newly exported Delta table in the Lakehouse, but not in the SQL endpoint. I have retried refreshing the table and reconnecting, with no luck.
Hi @Jeanxyz,
I found a blog post that triggers the refresh of a lakehouse's SQL endpoint with a Python script. It works like a charm for me. I had the problem that the lakehouse had a lag of 30+ minutes; this Python script (which I used in a notebook) reduced it to 6 minutes, and it waits until the SQL endpoint is refreshed, so you can be sure that activities executed after the notebook completes will see the refreshed SQL endpoint.
https://www.obvience.com/blog/fix-sql-analytics-endpoint-sync-issues-in-microsoft-fabric-data-not-sh...
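I won't reproduce the script here, but the same idea (block until the endpoint actually shows the new table) can be sketched by polling INFORMATION_SCHEMA.TABLES over the SQL endpoint connection. The server, database, authentication method, and table name below are placeholders, not the blog's actual code:

```python
import time
import pyodbc

# Placeholder connection string: copy the real one from the Lakehouse's SQL endpoint settings
CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your-lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
)

def wait_for_table(table_name: str, timeout_s: int = 1800, poll_s: int = 30) -> bool:
    """Poll INFORMATION_SCHEMA.TABLES until the table is visible on the SQL endpoint."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        with pyodbc.connect(CONN_STR) as conn:
            found = conn.execute(
                "SELECT 1 FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ?", table_name
            ).fetchone()
        if found:
            return True
        time.sleep(poll_s)
    return False
```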
Hi @demothy
Thanks for using Fabric Community.
As per your description, the reasons for tables not showing in the SQL endpoint might be:
1) The visibility of tables in the SQL endpoint is related to how the tables were created. Only managed Delta tables are exposed there, so tables created with Spark as external or non-Delta tables won't be visible. (This is not valid in your case, as you have clearly specified that the table properties are managed and Delta.)
2) The issue can be due to a schema mismatch. You might need to update the schema of your Delta tables or fix any schema mismatches to ensure that all tables are included in INFORMATION_SCHEMA.TABLES.
3) If the initial loads for some tables took longer due to resource limitations, the Delta tables might not have been created successfully. This could explain the missing tables, even though they appear in the Lakehouse (but are not fully formed). A quick way to check this is shown in the sketch after this list.
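To rule out point 3, you can inspect the Delta table's metadata and commit history from a notebook (the table name is a placeholder):

```python
# Confirm the Delta table was fully written: a successful initial load should show
# a non-zero file count and a completed WRITE/CREATE operation in the history.
detail = spark.sql("DESCRIBE DETAIL my_table").collect()[0]
print("format:", detail["format"], "files:", detail["numFiles"], "bytes:", detail["sizeInBytes"])

spark.sql("DESCRIBE HISTORY my_table LIMIT 5").show(truncate=False)
```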
Limitations of the SQL endpoint:
Limitations - Microsoft Fabric | Microsoft Learn
Try checking the data types of the columns: Data types - Microsoft Fabric | Microsoft Learn
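For example, you can list each column's Spark data type from a notebook to spot types the SQL endpoint does not support (the table name is a placeholder):

```python
# Print column names and data types for review against the supported-types list above
for field in spark.table("my_table").schema.fields:
    print(field.name, field.dataType.simpleString())
```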
How are you loading these tables into the lakehouse? How large are the tables?
Please let me know if you have any further questions.