When I made changes to the tables and resaved them to the lakehouse in overwrite mode, I got a 'Failed to load tables' error in the lakehouse SQL analytics endpoint view, with error code 'BlobNotFound'. The detailed error message is shown in the screenshot. The two tables are being saved to an unidentified folder in the lakehouse. I don't get it; it was working fine before I changed the tables. As a temporary fix, I made a copy of the notebook, relinked the default lakehouse, and used different table names this time, and it worked. The write involved is sketched below. Could you please explain why I am getting this error?
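For reference, the write is roughly like this (table names are placeholders, not the real ones):

```python
# PySpark in a Fabric notebook with the default lakehouse attached.
# "staging_table" and "my_table" are placeholder names.
df = spark.read.table("staging_table")            # source data
df_changed = df.withColumnRenamed("old_col", "new_col")  # the kind of change I made (placeholder)

(df_changed.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("my_table"))                     # resave in overwrite mode
```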
Hi @Rutuja1997,
Overwriting tables in Spark can sometimes cause temporary mismatches between the Lakehouse's Delta Lake metadata and the SQL analytics endpoint's schema cache. The `BlobNotFound` error often indicates that the SQL endpoint is referencing outdated file paths.
This tutorial covers a similar issue:
https://m.youtube.com/watch?v=toTKGYwr278
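If the table schema changed between runs, explicitly allowing the schema to be overwritten sometimes helps keep the Delta log consistent. A minimal sketch, assuming PySpark in a Fabric notebook ("my_table" and "staging_table" are placeholder names):

```python
# The modified data to write back; placeholder source table.
df_changed = spark.read.table("staging_table")

(df_changed.write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")   # replace the schema along with the data
    .saveAsTable("my_table"))

# Inspect the Delta transaction log to confirm the overwrite committed cleanly;
# the SQL analytics endpoint syncs its metadata from this log.
spark.sql("DESCRIBE HISTORY my_table").show(truncate=False)
```

Note that the SQL analytics endpoint syncs metadata from the Delta log asynchronously, so it may take a short while after the write before the endpoint picks up the new files.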
Hi @Rutuja1997,
Thanks for reaching out to the Microsoft Fabric community forum.
It looks like you are facing an issue while saving your tables to the lakehouse after making changes to them. As @nilendraFabric has already responded to your query, please go through that response and check whether it solves your issue.
I would also like to take a moment to thank @nilendraFabric for actively participating in the community forum and for the solutions you've been sharing. Your contributions make a real difference.
If I have misunderstood your needs or you still have problems, please feel free to let us know.
Best Regards,
Hammad.
Community Support Team
If this post helps, please mark it as a solution so that other members can find it more quickly.
Thank you.
Hi @Rutuja1997,
As we haven't heard back from you, we're just following up on our previous message. I'd like to confirm whether you've successfully resolved this issue or need further help.
If so, you are welcome to share your workaround and mark it as a solution so that other users can benefit as well. If you found a reply particularly helpful, you can also mark it as the solution.
If you still have any questions or need more support, please feel free to let us know. We are more than happy to continue to help you.
Thank you for your patience; we look forward to hearing from you.
Hi @Rutuja1997,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution so that other community members can find it easily.
Thank you.
Hi @Rutuja1997,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.
Thank you.