I am using PySpark in my notebook to write to a Delta table in my lakehouse; however, the table is NOT showing up in the SQL endpoint. I did NOT check the option for Lakehouse Schemas when the lakehouse was created. Is there anything I can do?
The problem turned out to be that I was specifying a path when writing the table instead of letting it write to the default location (wherever that is). When I removed the path, the tables appeared in the SQL endpoint as expected. This happens regardless of whether schemas are enabled.
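In case it helps anyone else, here is a minimal sketch of the difference (`spark` is the notebook's built-in session; the table name and OneLake path are placeholders, not my actual values):

```python
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# Path-based write: the Delta files land at the explicit location I gave,
# but tables written this way were not picked up by the SQL endpoint for me.
df.write.format("delta").mode("overwrite").save(
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Tables/my_table"
)

# Managed write: no path, just a name. Spark writes to the lakehouse's
# default Tables location, and the table shows up in the SQL endpoint.
df.write.format("delta").mode("overwrite").saveAsTable("my_table")
```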
Hi @spartan27244,
Thank you for engaging with the Microsoft Fabric Community. Since the current Lakehouse was set up without Lakehouse Schemas enabled, any Delta tables created using PySpark aren't being registered in the SQL analytics endpoint, which is why they’re not showing up.
We recommend creating a new Lakehouse with Lakehouse Schemas turned on. This setting ensures that tables written with PySpark are automatically registered and become visible in the SQL endpoint.
Once the new Lakehouse is ready, re-run your code to write the Delta table, and it should appear in the SQL analytics endpoint as expected.
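For instance, once the schema-enabled Lakehouse is attached as the notebook's default, a managed write along these lines (the schema and table names below are placeholders) should be registered automatically:

```python
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# With Lakehouse Schemas enabled, a schema-qualified managed write (no
# explicit path) is registered and visible in the SQL analytics endpoint.
df.write.format("delta").mode("overwrite").saveAsTable("dbo.my_table")
```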
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.