Hey, I'm on the 60-day trial of Fabric, and I created a table using a Spark notebook. Here is the code I used:
df.write.mode("overwrite").format("delta").saveAsTable("Salesforce_lead_full")
The table shows up in the Lakehouse, but it refuses to show up in the SQL analytics endpoint or in the model, so I can't use this dataset in Power BI to build anything from it.
Another odd thing: when I click on the table's properties, the Format field is blank, yet I can see the table's files in the OneLake file explorer. I've waited 2 days now and it's been killing me. We are doing a POC of Fabric, and so far it's been super buggy.
Please help.
If you created your Lakehouse with schemas (preview) enabled, then the SQL analytics endpoint won't work. Please refer to these docs: Lakehouse schemas (Preview) - Microsoft Fabric | Microsoft Learn
Please try creating your Lakehouse without schemas (keep that option unchecked).
What if your lakehouse is already created and you can't create a new lakehouse? Is there a property that can be reset?
Thanks a ton, that worked. I have been banging my head for 2 days now.
Hi @sun-sboyanapall ,
Thanks to @AndyDDC for the prompt reply; you can try his suggestion first.
I have some suggestions for your problem as well. The issue you are experiencing could have several causes. The SQL analytics endpoint cache may not be updated in a timely manner, so try refreshing or reconnecting to see the newly added tables. Data synchronization may also take some time, so wait a few minutes and check again.
If the above methods still don't solve the problem, you can consider clearing the cache and re-adding the new table, then refreshing to see whether it appears in the SQL analytics endpoint.
Best Regards,
Ada Wang
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.
I had actually tried all of this from other posts, but in my case the issue was that checkbox.
Hi @sun-sboyanapall, can you try writing using:
df.write.mode("overwrite").format("delta").save("Tables/Salesforce_lead_full")
This was my issue. I have a custom function that writes dataframes to Delta tables, and it appeared to be working: the table showed up in my Lakehouse, and its files were visible there too. What it wasn't doing was actually creating a table in my SQL endpoint "warehouse". Writing it this way made the table visible in my Lakehouse SQL endpoint schema.
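To make the two write styles in this thread easy to compare, here is a minimal sketch. It assumes a Fabric notebook where `spark` and `df` already exist; the writes themselves are commented out since they need a live Spark session, and the small path helper is a hypothetical convenience, not part of any Fabric API.

```python
def tables_folder_path(table_name: str) -> str:
    """Build the relative OneLake path under the Lakehouse Tables/ folder.

    Delta folders written here are auto-discovered by the Lakehouse and
    surfaced as tables (hypothetical helper for illustration).
    """
    return f"Tables/{table_name}"

# Style 1: managed table -- saveAsTable registers the table by name.
# df.write.mode("overwrite").format("delta").saveAsTable("Salesforce_lead_full")

# Style 2: path-based write into the Tables/ folder, as suggested above.
# df.write.mode("overwrite").format("delta").save(tables_folder_path("Salesforce_lead_full"))
```

Both produce a Delta table in the Lakehouse; the path-based style was what made the table appear in the SQL endpoint for the poster above.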