Connecting Databricks to Microsoft Fabric OneLake Using Service Principal and Unity Catalog

This article is co-authored by @sdhandapani and @sakshigupta_01, Architects at Microsoft.

Introduction

Microsoft Fabric's OneLake serves as an integrated SaaS data lake for enterprise analytics. Integrating OneLake with Azure Databricks through a Service Principal enables secure and scalable data access for advanced engineering and analytics operations. The following guide details the process for establishing this connection, including Service Principal setup, federating a Fabric Lakehouse SQL endpoint, and configuring OAuth authentication.

Prerequisites

  • Active Azure subscription with access to Microsoft Fabric and Databricks
  • Premium-tier Azure Databricks workspace
  • Administrator access to Microsoft Entra ID (previously Azure AD)
  • Contributor rights to the Fabric workspace
  • Unity Catalog enabled in Databricks

Section 1: Federating Fabric Lakehouse SQL Endpoint via Unity Catalog

Step 1: Register a Service Principal in Microsoft Entra ID

  1. Access the Azure Portal.
  2. Navigate to Microsoft Entra ID > App registrations and select New registration.
  3. Provide a name (e.g., FabricDatabricksSP) and choose Single tenant for account types.

Complete registration, then go to Certificates & secrets and create a new client secret (e.g., FabricDatabricksSecret), selecting an appropriate expiry period.
Record the generated client secret value, along with the Application (client) ID and Directory (tenant) ID from the app's Overview page.
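
Before configuring anything in Databricks, it can be worth confirming that the Service Principal can actually obtain a token. The following is a minimal sketch in Python, assuming the placeholders are replaced with the values recorded above; it requests a token for the Azure SQL scope used later in this guide:

import requests

# Placeholders: substitute the values recorded from the app registration.
tenant_id = "<tenant-id>"
client_id = "<client-id>"
client_secret = "<client-secret>"

# Client-credentials flow against the Microsoft Entra v2.0 token endpoint.
resp = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://database.windows.net/.default",
    },
)
resp.raise_for_status()
print("Token acquired; expires in", resp.json()["expires_in"], "seconds")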


Step 2: Assign Permissions to the Service Principal

  1. In the Azure Portal, open Microsoft Entra ID > App registrations > API permissions.
  2. Add a permission via APIs my organization uses, searching for Azure SQL.

3. Select Delegated permissions and enable user_impersonation.

4. Grant admin consent for tenant access, if necessary.

5. Set the redirect URI: open the app registration's Overview page and click Add a redirect URI.

6. Specify the Databricks workspace URL: https://<workspace-url>/login/oauth/azure.html

Step 3: Grant Service Principal Access to Fabric Workspace

  1. In the Microsoft Fabric Admin Portal, enable the following tenant settings:
    • Service principals can call Fabric public APIs
    • Users can access OneLake data with external apps

  2. Within the Fabric workspace, select Manage Access and add the Service Principal as a Contributor or Viewer (a programmatic alternative is sketched below).
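
Workspace access can also be granted programmatically. The sketch below is one possible approach using the Fabric REST API role-assignment endpoint; the workspace ID, the Service Principal's object ID, and the bearer token are placeholders you would supply (the token must be issued for the https://api.fabric.microsoft.com/.default scope by an identity allowed to manage the workspace):

import requests

# Placeholders: supply your own workspace ID, Service Principal object ID,
# and a bearer token for https://api.fabric.microsoft.com/.default.
workspace_id = "<fabric-workspace-id>"
sp_object_id = "<service-principal-object-id>"  # the object ID, not the client ID
token = "<bearer-token>"

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/roleAssignments",
    headers={"Authorization": f"Bearer {token}"},
    json={"principal": {"id": sp_object_id, "type": "ServicePrincipal"}, "role": "Contributor"},
)
resp.raise_for_status()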

 

Step 4: Federate the Fabric SQL Endpoint Using Unity Catalog

  1. Create an external SQL connection:

CREATE CONNECTION fabric_sql_connection
TYPE sqlserver
OPTIONS (
  host '<sql-endpoint>.database.windows.net',
  port '1433',
  user '<service-principal-client-id>',
  password '<service-principal-secret>'
);

2. Create a foreign catalog referencing the connection:

CREATE FOREIGN CATALOG IF NOT EXISTS fabric_sql_catalog
USING CONNECTION fabric_sql_connection
OPTIONS (database 'lakehouse_database_name');
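
Once the foreign catalog exists, you can sanity-check the federation from a notebook. A minimal sketch; the schema and table names ("dbo", "publicholidays") are placeholders for objects in your own lakehouse:

# List the schemas exposed through the foreign catalog.
spark.sql("SHOW SCHEMAS IN fabric_sql_catalog").show()

# Query a lakehouse table through the federated connection.
df = spark.sql("SELECT * FROM fabric_sql_catalog.dbo.publicholidays LIMIT 10")
display(df)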

 

Alternatively, using the UI:

1. Obtain the SQL endpoint from the Fabric Lakehouse settings.

2. In Databricks Unity Catalog, go to the Connections section, create a new connection, and enter the following:

    • Host: SQL endpoint from the Fabric workspace
    • Client ID
    • Client Secret
    • OAuth scope: https://database.windows.net/.default
    • Authorization endpoint: https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/authorize
    • Database: name of your lakehouse

3. Finalize the configuration and review access permissions as needed. Attach compute resources to the Databricks workspace; data from Fabric can now be read and queried in Databricks.


Section 2: Connecting Databricks to OneLake via OAuth

  1. Create a Service Principal as outlined in Section 1, Step 1.
  2. Assign permissions to the Fabric workspace as described in Section 1, Step 3.
  3. Connect to Fabric from a Databricks notebook. Open a notebook and set the following parameters, either in the session or in the cluster's Spark configuration:

 

client_id = "<client-id>"          # Application (client) ID of the Service Principal
tenant_id = "<tenant-id>"          # Directory (tenant) ID
client_secret = "<client-secret>"  # client secret value recorded earlier

# Authenticate to OneLake with the Service Principal via OAuth client credentials.
spark.conf.set("fs.azure.account.auth.type.onelake.dfs.fabric.microsoft.com", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.onelake.dfs.fabric.microsoft.com", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.onelake.dfs.fabric.microsoft.com", client_id)
spark.conf.set("fs.azure.account.oauth2.client.secret.onelake.dfs.fabric.microsoft.com", client_secret)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.onelake.dfs.fabric.microsoft.com", f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

 

workspace_name = "fabrictest"
lakehouse_name = "testlakehouse01"

# ABFS path to the lakehouse's managed Delta tables in OneLake
abfs_path = f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}.Lakehouse/Tables"

df = spark.read.format("delta").load(f"{abfs_path}/publicholidays")
display(df.limit(10))
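
For anything beyond a quick test, avoid hard-coding the secret in the notebook. A minimal hardening sketch, assuming you have created a Databricks secret scope; the scope and key names ("fabric-sp", "client-id", ...) are hypothetical:

# Read the Service Principal credentials from a secret scope instead of
# pasting them into the notebook (scope and key names are placeholders).
client_id = dbutils.secrets.get(scope="fabric-sp", key="client-id")
client_secret = dbutils.secrets.get(scope="fabric-sp", key="client-secret")
tenant_id = dbutils.secrets.get(scope="fabric-sp", key="tenant-id")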

Summary

Establishing a connection between Azure Databricks and Microsoft Fabric OneLake using a Service Principal enables secure, scalable data access and allows the Fabric SQL endpoint to be federated through Unity Catalog. This workflow supports centralized governance and seamless integration with Fabric Lakehouse resources.

Public documentation for reference: Run federated queries on Microsoft SQL Server - Azure Databricks | Microsoft Learn

Comments

Informative @sakshigupta_01,
Thanks for sharing 

Thanks for sharing that, really nice. Do you have any additional insights on doing the opposite, reading data from Databricks Unity Catalog into Fabric? Mirrored Unity Catalog didn’t work for my case because my Databricks is behind a private endpoint.