Microsoft Fabric's OneLake serves as an integrated SaaS data lake for enterprise analytics. Integrating OneLake with Azure Databricks through a Service Principal enables secure and scalable data access for advanced engineering and analytics operations. The following guide details the process for establishing this connection, including Service Principal setup, federating a Fabric Lakehouse SQL endpoint, and configuring OAuth authentication.
1. In Microsoft Entra ID, create a new app registration and complete the registration.
2. Go to Certificates & secrets and create a new client secret (e.g., FabricDatabricksSecret), selecting an appropriate expiry period. Record the generated client secret value, along with the Application (client) ID and Directory (tenant) ID from the Overview page.
3. Under API permissions, select Delegated permissions and enable user_impersonation.
4. Grant admin consent for tenant access, if necessary.
5. Set the redirect URI: open the app registration, go to Overview, and select Add a redirect URI.
6. Specify the Databricks workspace URL: https://<workspace-url>/login/oauth/azure.html
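Before wiring the registration into Databricks, it can help to confirm the Service Principal can obtain a token. The sketch below is a hypothetical helper (not part of any official tooling) that assembles the client-credentials token request the later OAuth configuration performs; the scope shown assumes OneLake accepts tokens issued for the Azure Storage resource.

```python
from urllib.parse import urlencode

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Assemble a Microsoft Entra ID client-credentials token request.

    Returns the token endpoint URL and the form-encoded body. POSTing the
    body to the URL (e.g., with urllib.request) should return JSON containing
    an access_token if the Service Principal is configured correctly.
    """
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Assumption: OneLake accepts tokens scoped to Azure Storage.
        "scope": "https://storage.azure.com/.default",
    })
    return url, body
```

The tenant_id, client_id, and client_secret arguments are the values recorded in step 2 above.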
1. In the Fabric admin portal, enable the tenant setting that allows users to access OneLake data with external apps.
2. Within the Fabric workspace, select Manage access and add the Service Principal as a Contributor or Viewer.
1. Create a connection to the Fabric SQL endpoint:
CREATE CONNECTION fabric_sql_connection
TYPE sqlserver
OPTIONS (
  host '<sql-endpoint>.database.windows.net',
  port '1433',
  user '<service-principal-client-id>',
  password '<service-principal-secret>'
);
2. Create a foreign catalog referencing the connection:
CREATE FOREIGN CATALOG IF NOT EXISTS fabric_sql_catalog
USING CONNECTION fabric_sql_connection
OPTIONS (database 'lakehouse_database_name');
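The two statements above can also be rendered programmatically, for example from a notebook that provisions several lakehouses. A minimal sketch, using a hypothetical helper whose options mirror the CREATE CONNECTION example; the resulting strings would typically be executed with spark.sql(...):

```python
def federation_ddl(connection: str, catalog: str, host: str, database: str,
                   client_id: str, secret: str):
    """Render the Unity Catalog connection and foreign catalog DDL as strings,
    following the SQL shown above."""
    create_connection = (
        f"CREATE CONNECTION {connection}\n"
        f"TYPE sqlserver\n"
        f"OPTIONS (\n"
        f"  host '{host}',\n"
        f"  port '1433',\n"
        f"  user '{client_id}',\n"
        f"  password '{secret}'\n"
        f");"
    )
    create_catalog = (
        f"CREATE FOREIGN CATALOG IF NOT EXISTS {catalog}\n"
        f"USING CONNECTION {connection}\n"
        f"OPTIONS (database '{database}');"
    )
    return create_connection, create_catalog
```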
Alternatively, using the UI:
1. Obtain the SQL connection string (SQL analytics endpoint) from the Fabric Lakehouse settings.
2. In Databricks Unity Catalog, go to the Connections section, create a new connection, and input the endpoint details from step 1 together with the Service Principal credentials.
To read OneLake data directly from a Databricks notebook instead, set the Service Principal credentials and configure Spark for OAuth:
client_id = "<application-client-id>"
tenant_id = "<directory-tenant-id>"
client_secret = "<client-secret-value>"
spark.conf.set("fs.azure.account.auth.type.onelake.dfs.fabric.microsoft.com", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.onelake.dfs.fabric.microsoft.com", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.onelake.dfs.fabric.microsoft.com", client_id)
spark.conf.set("fs.azure.account.oauth2.client.secret.onelake.dfs.fabric.microsoft.com", client_secret)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.onelake.dfs.fabric.microsoft.com", f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")
workspace_name = "fabrictest"
lakehouse_name = "testlakehouse01"
abfs_path = f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}.Lakehouse/Tables"
df = spark.read.format("delta").load(f"{abfs_path}/publicholidays")
display(df.limit(10))
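The ABFS path convention used above can be captured in a small helper (hypothetical, for illustration), which makes it easy to point the same notebook at other tables:

```python
def onelake_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Build the OneLake ABFS URI for a Delta table in a Fabric Lakehouse,
    following the pattern:
    abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Tables/<table>
    """
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse/Tables/{table}")

# Example: the publicholidays table read above
path = onelake_table_path("fabrictest", "testlakehouse01", "publicholidays")
```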
Establishing a connection between Azure Databricks and Microsoft Fabric OneLake using a Service Principal allows secure, scalable data access, enabling federated SQL endpoints via Unity Catalog. This workflow supports centralized governance and seamless integration with Fabric Lakehouse resources.
Public documentation for reference: Run federated queries on Microsoft SQL Server - Azure Databricks | Microsoft Learn