royclack
Frequent Visitor

Ingest data from Databricks (on Azure)

Hi,

 

We have an external area that uses Databricks (on Azure) with a SQL warehouse (serverless compute), and we would like to integrate data from it, but we are just at the initial stages.

 

As I understand it, data access security in the Databricks lakehouse is object-level security via Unity Catalog. I have spun up a trial environment alongside a colleague who is looking at the preview feature to mirror the Unity Catalog.

 

Outside of preview features I will be looking at natively supported capabilities, and I can see the potential with Dataflows and possibilities with Data Factory. Is there a way to connect to Databricks via OneLake shortcuts through configuration/setup to directly query Databricks, and/or by using Notebooks or other options?

 

Thanks

1 ACCEPTED SOLUTION
v-veshwara-msft
Community Support

Hi @royclack ,

Thanks for using Microsoft Fabric Community and sharing the details.

In addition to @jennratten's suggestions:

It’s important to ensure the proper permissions are in place for the Unity Catalog in Databricks. In addition to SELECT, the EXTERNAL USE SCHEMA permission must be explicitly granted for access to the schema in Unity Catalog. Without this, even if the catalog is mirrored, you might not be able to access the data.
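For illustration, the grants on the Databricks side might look roughly like the sketch below, run in a Databricks notebook by the catalog owner or an admin. The catalog ("main"), schema ("default") and principal are placeholder names, not names from this thread.

# Minimal sketch (Databricks notebook, PySpark). Catalog "main", schema "default"
# and the principal e-mail are placeholders.
principal = "`fabric_reader@contoso.com`"  # back-ticks are required for e-mail principals

for stmt in [
    f"GRANT USE CATALOG ON CATALOG main TO {principal}",
    f"GRANT USE SCHEMA ON SCHEMA main.default TO {principal}",
    # EXTERNAL USE SCHEMA lets external engines such as Fabric read the schema's data.
    f"GRANT EXTERNAL USE SCHEMA ON SCHEMA main.default TO {principal}",
    f"GRANT SELECT ON SCHEMA main.default TO {principal}",
]:
    spark.sql(stmt)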


As mentioned by @smpa01 , it’s possible that mirroring is being blocked by your Databricks admin. If that’s the case, a Databricks Personal Access Token (PAT) paired with a service principal will be necessary to programmatically access the data. This allows you to bypass mirroring restrictions and directly consume the data in Fabric.

Once the Databricks PAT is configured, make sure the Spark-JDBC driver is placed in the Lakehouse. This will allow you to connect to Databricks and retrieve data programmatically.
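As a rough sketch of that programmatic path, a Fabric notebook could read a Unity Catalog table over JDBC as shown below. It assumes the Databricks JDBC driver JAR has been made available to the Spark session (for example from the lakehouse, as described above); the workspace host, warehouse HTTP path, PAT and table name are all placeholders. Note that older Simba "Spark" JDBC drivers use a different driver class and URL prefix.

# Minimal sketch (Fabric notebook, PySpark), assuming the Databricks JDBC driver is
# available to the Spark session. Host, httpPath, PAT and table names are placeholders.
jdbc_url = (
    "jdbc:databricks://adb-1234567890123456.7.azuredatabricks.net:443/default;"
    "transportMode=http;ssl=1;AuthMech=3;"              # AuthMech=3 = user/password (token) auth
    "httpPath=/sql/1.0/warehouses/abcdef1234567890;"     # HTTP path of the SQL warehouse
    "UID=token;PWD=<databricks-pat>"                     # PAT issued for the service principal
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("driver", "com.databricks.client.jdbc.Driver")
    .option("dbtable", "main.default.my_table")          # catalog.schema.table in Unity Catalog
    .load()
)

# Land the result as a Delta table in the lakehouse attached to the notebook.
df.write.mode("overwrite").saveAsTable("my_table_copy")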


It’s also worth noting that Databricks mirroring only supports managed and external tables. If the table you're trying to access is temporary or not part of Unity Catalog, it won't be available for direct querying via Fabric.
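If you want to confirm which objects qualify before mirroring, a quick check on the Databricks side could look like this ("main" is a placeholder catalog name):

# Minimal sketch (Databricks notebook, PySpark): list table types in the catalog
# to be mirrored. Per the note above, only MANAGED and EXTERNAL tables are mirrored.
spark.sql("""
    SELECT table_schema, table_name, table_type
    FROM main.information_schema.tables
    ORDER BY table_schema, table_name
""").show(truncate=False)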

 

Some useful resources:

Microsoft Fabric Mirrored Catalog From Azure Databricks (Preview) - Microsoft Fabric | Microsoft Lea...

Microsoft Fabric Mirrored Databases From Azure Databricks (Preview) Tutorial - Microsoft Fabric | Mi...

Troubleshoot Fabric Mirrored Databases - Microsoft Fabric | Microsoft Learn

After verifying the permissions, admin settings, and ensuring the proper table types are being mirrored, let us know if the issue persists.

Hope this helps. Please reach out for further assistance.
If this post helps, please consider accepting it as the solution to help other members find it more quickly. Kudos would also be appreciated.

Thank you.


8 REPLIES
v-veshwara-msft
Community Support

Hi @royclack ,

We’re following up once more regarding your query. If it has been resolved, please mark the helpful reply as the Accepted Solution to assist others facing similar challenges.

If you still need assistance, please let us know.
Thank you.

v-veshwara-msft
Community Support

Hi @royclack ,

Following up to see if your query has been resolved. If any of the responses helped, please consider marking the helpful reply as the 'Accepted Solution' to assist others with similar questions.

If you still need assistance, feel free to reach out.

Thank you.

v-veshwara-msft
Community Support

Hi @royclack ,

Just checking in to see if your query is resolved and if any of the responses were helpful. If so, kindly consider marking the helpful reply as the 'Accepted Solution' to help others with similar queries.

Otherwise, feel free to reach out for further assistance.

Thank you.

smpa01
Super User

In cases where mirroring is blocked by the Databricks admin, you need a Databricks PAT with a service principal to programmatically consume the data from Databricks into the Fabric lakehouse, plus the JDBC driver placed as a file in the lakehouse.

spark-jdbc 

Did I answer your question? Mark my post as a solution!
Proud to be a Super User!
jennratten
Super User

Hello @royclack - mirroring the Unity Catalog is intended to provide the most efficient, seamless, near real-time access to the data, with the added benefit of avoiding duplication. After the mirror is created, you can create a lakehouse and then add shortcuts to the data referenced in the mirrored Unity Catalog; these shortcuts can be used as if they were actual tables. Once this part is done, begin orchestrations/explorations with notebooks, data pipelines, etc. Data pipelines and notebooks make the most efficient use of capacity/compute resources, so go that route if you can. Please let me know if you have any other questions, or if there is a specific reason you may be looking to do something outside of the mirror.
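For illustration only, once those shortcuts exist, a Fabric notebook can query them like ordinary tables. This is a minimal sketch that assumes the lakehouse containing the shortcuts is set as the notebook's default lakehouse; "customers" is a placeholder table name.

# Minimal sketch (Fabric notebook, PySpark). Assumes the lakehouse that contains the
# shortcuts from the mirrored Unity Catalog is attached as the default lakehouse.
# "customers" is a placeholder shortcut/table name.
df = spark.sql("SELECT * FROM customers LIMIT 10")
df.show()

# The same shortcut can also be read with the DataFrame API.
customers = spark.read.table("customers")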

If this post helps to answer your questions, please consider marking it as a solution so others can find it more quickly when faced with a similar challenge.

Proud to be a Microsoft Fabric Super User

Hi @jennratten 

Have been looking at the mirroring and changed the tenant setting to allow the feature outside of trial capacity usage. Working with my colleague, we've set up a Databricks 14-day trial, added some data to create a table (managed type) under the default schema, and mirrored the catalog. It appears we are experiencing an issue: we can see the replicated catalog in Fabric but cannot view the data.

 

We have granted the EXTERNAL USE SCHEMA and SELECT permissions on the Databricks side. In Fabric we get the message:

 

Couldn't access data. Check your permissions for this catalog or try again later.

Have you checked to confirm that users have the required USE CATALOG, USE SCHEMA, and SELECT permissions on the Databricks side?

The cluster might also not be configured to support fine-grained access control (FGAC) or identity passthrough, or there might be a network policy or firewall rule blocking access.

You can check your permissions in Databricks by querying the information schema with SQL.

-- Run these in the context of the catalog you are checking (for example after USE CATALOG <your_catalog>),
-- and replace <your_username> with the principal being checked.
SELECT * FROM information_schema.catalog_privileges
WHERE grantee = '<your_username>';

SELECT * FROM information_schema.schema_privileges
WHERE grantee = '<your_username>';

SELECT * FROM information_schema.table_privileges
WHERE grantee = '<your_username>';

You can also check via the Databricks UI: navigate to the Catalog section of the UI and review the permissions for catalogs, schemas, and tables.

 

If this post helps to answer your questions, please consider marking it as a solution so others can find it more quickly when faced with a similar challenge.

Proud to be a Microsoft Fabric Super User
