anawast
Microsoft Employee

SPN + Certificate for authentication to ADLS in Microsoft Fabric Notebooks.

I can read data from my ADLS Gen2 lake using SPN + SPN key (client secret) by providing the following Spark configurations:

spark.conf.set("dfs.adls.oauth2.access.token.provider.type", "ClientCredential")
spark.conf.set("dfs.adls.oauth2.client.id", "<ADLSId>")
spark.conf.set("dfs.adls.oauth2.credential", "<ADLSCredential>")
spark.conf.set("dfs.adls.oauth2.refresh.url", adlsLoginUrl)

However, I now want to do the authentication through SPN + certificate. Can you help with the mechanism for this?
1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @anawast,

Perhaps you can try using the azure-identity and azure-storage-file-datalake libraries to configure the connection with an SPN and certificate:

%pip install azure-identity azure-storage-file-datalake

from azure.identity import ClientCertificateCredential
from azure.storage.filedatalake import DataLakeServiceClient

tenant_id = "<your-tenant-id>"
client_id = "<your-client-id>"
certificate_path = "<path-to-your-certificate>"  # PEM or PFX file containing the private key

# Authenticate with the service principal's certificate instead of a client secret
credential = ClientCertificateCredential(tenant_id, client_id, certificate_path)
service_client = DataLakeServiceClient(
    account_url="https://<your-account-name>.dfs.core.windows.net",
    credential=credential,
)

ClientCertificateCredential Constructor (Azure.Identity) - Azure for .NET Developers | Microsoft Learn

Regards,

Xiaoxin Sheng
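For completeness, the accepted snippet can be wrapped into a small helper that also reads a file back. This is a sketch under the same assumptions (placeholder tenant/client IDs and certificate path); the helper name is illustrative, not from the thread, and the imports are kept inside the function so the cell can be pasted anywhere:

```python
def read_adls_file(account_name, container, path,
                   tenant_id, client_id, certificate_path):
    """Download one file from ADLS Gen2 using SPN + certificate auth.

    Returns the raw bytes of the file.
    """
    # Imports are local so the helper can be pasted into any notebook cell.
    from azure.identity import ClientCertificateCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    # Authenticate with the service principal's certificate
    credential = ClientCertificateCredential(tenant_id, client_id, certificate_path)
    service_client = DataLakeServiceClient(
        account_url=f"https://{account_name}.dfs.core.windows.net",
        credential=credential,
    )
    # Navigate container -> file, then download its contents
    file_client = (
        service_client
        .get_file_system_client(container)
        .get_file_client(path)
    )
    return file_client.download_file().readall()
```

Usage would look like `read_adls_file("<your-account-name>", "<your-container-name>", "folder/data.csv", tenant_id, client_id, certificate_path)`.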


3 REPLIES
anawast
Microsoft Employee

This uses a client secret for authentication. I want to use an SPN certificate for authentication.


Anonymous
Not applicable

Hi @anawast,

Perhaps you can take a look at the following script to load data from a storage account, in case it helps with your scenario:

from notebookutils import mssparkutils

# service principal
tenant_id = "<your-tenant-id>"
client_id = "<your-client-id>"
client_secret = "<your-client-secret>"

# Azure storage detail
storage_account_name = "<your-storage-account-name>"
container_name = "<your-container-name>"
file_path = "<path_to_file>"

# Spark configuration
spark.conf.set("fs.azure.account.auth.type", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id", client_id)
spark.conf.set("fs.azure.account.oauth2.client.secret", client_secret)
spark.conf.set("fs.azure.account.oauth2.client.endpoint", f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

# Full file path
data_path = f"abfss://{container_name}@{storage_account_name}.dfs.core.windows.net/{file_path}"

# Read the data into a Spark DataFrame
df = spark.read.format("csv").option("header", "true").load(data_path)

# show the result
df.show()

Regards,

Xiaoxin Sheng
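As far as I can tell, the built-in `ClientCredsTokenProvider` used in the Spark configuration above only accepts a client id and secret, which is why the certificate route goes through the SDK instead. One hedged sketch that bridges the two (the helper names are mine, not from the thread; it assumes azure-identity, azure-storage-file-datalake, and pandas are available in the notebook) downloads the CSV with the certificate credential and hands it to Spark via pandas:

```python
import io


def adls_url(account, container, path):
    # Illustrative helper: the same abfss path Spark would use for this file
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"


def csv_to_spark_df(spark, account, container, path,
                    tenant_id, client_id, certificate_path):
    """Read a CSV from ADLS Gen2 with SPN + certificate, return a Spark DataFrame.

    Hypothetical bridge: the SDK handles the certificate auth, pandas parses
    the bytes, and Spark takes over from there. Imports are local so the
    sketch stays portable.
    """
    import pandas as pd
    from azure.identity import ClientCertificateCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    credential = ClientCertificateCredential(tenant_id, client_id, certificate_path)
    service_client = DataLakeServiceClient(
        account_url=f"https://{account}.dfs.core.windows.net",
        credential=credential,
    )
    # Download the raw file bytes through the SDK (certificate-authenticated)
    data = (
        service_client
        .get_file_system_client(container)
        .get_file_client(path)
        .download_file()
        .readall()
    )
    # Parse locally, then convert to a distributed Spark DataFrame
    pdf = pd.read_csv(io.BytesIO(data))
    return spark.createDataFrame(pdf)
```

This trades Spark-native distributed reads for certificate-based auth, so it fits small-to-medium files; for large data, a secret-based Spark config (or a custom Hadoop token provider) may still be the pragmatic choice.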
