Hi All,
I am looking for advice on consuming Fabric Lakehouse tables from Databricks.
Thanks in advance.
Hi @av_9999 here are 2 articles to get you going:
https://learn.microsoft.com/en-us/fabric/onelake/onelake-azure-databricks
https://learn.microsoft.com/en-us/fabric/onelake/onelake-unity-catalog
Hi @AndyDDC, thanks for sharing, that's really helpful. The first reference link gives an example of reading files from the lakehouse. Is it the same way to read the lakehouse tables?
Any thoughts?
AFAIK Databricks doesn't understand Fabric's Lakehouse table structure, only the storage locations. For example, what I have done is create external tables in Databricks using a table's location in Fabric.
The Delta location in Fabric defines the table anyway; Databricks just doesn't register the table unless you create an external table over it.
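The external-table approach described above can be sketched roughly as follows. This is a sketch, not a tested implementation: the catalog, schema, workspace, lakehouse, and table names are all placeholders you would replace with your own, and the exact OneLake URL format is covered in the OneLake documentation linked earlier in the thread.

```sql
-- Hypothetical names throughout: replace <workspace>, <lakehouse>, <table>,
-- and the catalog/schema with your own.
-- Registers the Delta folder of a Fabric Lakehouse table as an external
-- table in Databricks; the data itself stays in OneLake.
CREATE TABLE my_catalog.my_schema.my_table
USING DELTA
LOCATION 'abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Tables/<table>';
```

Once registered, the table can be queried like any other Databricks table (e.g. `SELECT * FROM my_catalog.my_schema.my_table`), since the Delta log at that location already defines the schema.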
Thanks @AndyDDC. I implemented the same approach previously when I was working with Databricks.