Not sure if anyone else has experienced this, but I have a table in my Lakehouse that I frequently use for address attributes. When I try to view this table in a notebook by right-clicking it and choosing Load data > Spark, it says the table doesn't exist, which is not true. When I query the table via the Lakehouse SQL endpoint, I can access it. Does anyone know why this is happening and how I can fix it?
Solved! Go to Solution.
May I know your workspace name? If the workspace name contains special characters, some Lakehouse features may not work as expected. Please have a look at the snapshot below.
Also, Lakehouse schemas are still in preview. Please check the limitations documented in Lakehouse schemas (Preview) - Microsoft Fabric | Microsoft Learn; it might help you.
Thank you!!!
Hi @burakkaragoz,
As mentioned, this is a known issue. I'd encourage you to submit your detailed feedback and ideas through Microsoft's official feedback channels, such as Microsoft Fabric Ideas.
Feedback submitted there is often reviewed by the product teams and can lead to meaningful improvements.
Thanks,
Prashanth
Hi @AnthonySottile,
Yes, this is a known issue that can occur in Fabric Lakehouse environments when there's a mismatch between the metadata layers used by Spark and the SQL endpoint.
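Before trying the fixes, it can help to confirm what the Spark catalog actually sees from the notebook. This is just a quick diagnostic sketch (no specific table names assumed):

# List the tables registered in the Spark catalog for the attached Lakehouse
spark.sql("SHOW TABLES").show(truncate=False)

# The catalog API gives the same view programmatically
for t in spark.catalog.listTables():
    print(t.name, t.tableType, t.isTemporary)

If the table is missing from this list but its folder is visible under Tables/ in the Lakehouse explorer, that points to a catalog/metadata sync problem rather than missing data.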
Here are a few things you can try:

1. Refresh the Lakehouse metadata
Try restarting the Lakehouse session or reloading the workspace; metadata caching sometimes causes this issue.

2. Check _delta_log integrity
Navigate to the Lakehouse file explorer and make sure the _delta_log folder exists and is intact for that table.

3. Re-register the table
If the table was manually created or altered, try dropping and recreating it using Spark SQL so it is properly registered in the Spark catalog (see the sketch after this list).

4. Use spark.read.format("delta") manually
In your notebook, try loading the table directly from its storage path:
df = spark.read.format("delta").load("Tables/YourTableName")
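For step 3, a minimal re-registration sketch might look like the following. The table name and path are placeholders (reusing "YourTableName" from above), and the relative Tables/ path assumes the Lakehouse is attached as the notebook's default. Also note that dropping a managed table can delete its underlying files, so verify how the table is registered (or back up the Delta folder) before running this against real data:

# Drop the stale catalog entry, then re-create the table over the existing Delta folder.
# Caution: DROP TABLE on a managed table also removes its files -- check first.
spark.sql("DROP TABLE IF EXISTS YourTableName")
spark.sql("""
    CREATE TABLE YourTableName
    USING DELTA
    LOCATION 'Tables/YourTableName'
""")

# Confirm Spark can now resolve the table through the catalog
spark.sql("SELECT COUNT(*) AS row_count FROM YourTableName").show()

If the path-based load in step 4 works but the catalog still can't find the table, that also narrows the problem down to registration rather than the data itself.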
Hi @AnthonySottile,
As we haven't heard back from you, we wanted to follow up and check whether the solution provided resolved your issue. Please let us know if you need any further assistance.
Thanks,
Prashanth Are
MS Fabric community support
If this post helps, please consider accepting it as the solution so other members can find it more quickly, and give Kudos if it helped resolve your query.