Hi All,
I am looking for advice on consuming Lakehouse tables from Databricks.
Thanks in advance.
AFAIK Databricks doesn't understand Fabric's Lakehouse table metadata, only the storage locations. For example, what I have done is create external tables in Databricks that point to a table's location in Fabric.
The Delta files at that location in Fabric already define the table; Databricks just won't register it unless you create an external table over it.
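The approach above can be sketched as follows. This is a minimal sketch, assuming the standard OneLake ABFS path layout; the workspace, lakehouse, and table names are placeholders, not anything from this thread:

```python
# Sketch: registering a Fabric Lakehouse Delta table as an external table
# in Databricks. Workspace/lakehouse/table names below are placeholders.

def onelake_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Build the OneLake ABFS URI for a Lakehouse table (a Delta folder)."""
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse/Tables/{table}")

def external_table_ddl(db_table: str, location: str) -> str:
    """DDL that registers an external Delta table over an existing location."""
    return (f"CREATE TABLE IF NOT EXISTS {db_table} "
            f"USING DELTA LOCATION '{location}'")

location = onelake_table_path("my_workspace", "my_lakehouse", "sales")
ddl = external_table_ddl("bronze.sales", location)
# On a Databricks cluster configured for OneLake access you would then run:
#   spark.sql(ddl)
print(ddl)
```

No data is copied: the external table is just a catalog entry pointing at the Delta folder that Fabric already maintains.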
Hi @av_9999, here are two articles to get you going:
https://learn.microsoft.com/en-us/fabric/onelake/onelake-azure-databricks
https://learn.microsoft.com/en-us/fabric/onelake/onelake-unity-catalog
Hi @AndyDDC, thanks for sharing, it's really helpful. The first reference link gives an example of reading files from the Lakehouse. Will it be the same way to read the Lakehouse tables?
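For what it's worth, the pattern is essentially the same: a Lakehouse table is just a Delta folder under `Tables/` instead of `Files/`, so only the path segment changes. A small sketch, with placeholder names:

```python
# Sketch: the only difference between reading Lakehouse files and tables
# from Databricks is the OneLake path segment: Files/ vs Tables/.

def onelake_path(workspace: str, lakehouse: str, kind: str, name: str) -> str:
    """Build an OneLake ABFS URI; kind is 'Files' or 'Tables'."""
    assert kind in ("Files", "Tables")
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse/{kind}/{name}")

files_uri = onelake_path("my_ws", "my_lh", "Files", "raw/orders.csv")
table_uri = onelake_path("my_ws", "my_lh", "Tables", "orders")
# On a Databricks cluster you would then read the table as plain Delta:
#   df = spark.read.format("delta").load(table_uri)
print(table_uri)
```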
Any thoughts?
Thanks @AndyDDC. I implemented the same thing previously when I was working with Databricks.