
meiwah
Resolver I

Spark Config for PySpark notebook using SAS

I would like to connect to a parquet file in ADLS Gen2 from a Fabric PySpark notebook. How do I set the Spark config?

The code below results in an error:

# CONFIG_PREFIX is typically "&lt;storage-account&gt;.dfs.core.windows.net"

# Set the authentication type to SAS
spark.conf.set(
    f"fs.azure.account.auth.type.{CONFIG_PREFIX}",
    "SAS"
)

# Specify the fixed SAS token provider class
spark.conf.set(
    f"fs.azure.sas.token.provider.type.{CONFIG_PREFIX}",
    "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider"
)

# Set the actual SAS token
spark.conf.set(
    f"fs.azure.sas.fixed.token.{CONFIG_PREFIX}",
    SAS_TOKEN
)

but there is an error:

"Unable to load SAS token provider class: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider not found"
v-achippa
Community Support

Hi @meiwah,

 

Thank you for reaching out to Microsoft Fabric Community.

 

Thank you @BhaveshPatel for the prompt response.

 

Since we haven't heard back from you, we wanted to follow up and check whether the solution provided above worked for you. Please let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

Hi Anjan, 

 

It has not been resolved. I would like to use a SAS token to access a parquet file in ADLS Gen2 from a Fabric PySpark notebook, but the class org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider needed to set the config is not available in the Fabric environment, as shown in the error message below. Is there another way to use SAS to access ADLS Gen2? Thanks!

"Unable to load SAS token provider class: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider not foundjava.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider not found"

Hi @meiwah,

 

Based on your requirement, you can use Lakehouse shortcuts in Microsoft Fabric to connect to ADLS Gen2 using a SAS token. Please follow the steps below:

  • Navigate to the Lakehouse view in Fabric, then add a shortcut by providing the full DFS path of your ADLS Gen2 container.
  • When prompted for the authentication method, select SAS token and paste the token directly into the provided field.
  • Once the shortcut is created, it will mount the ADLS Gen2 container as a folder within your Lakehouse. You can then access this data from a PySpark notebook by adding the Lakehouse to your notebook.
  • After that, navigate to the desired file and use the auto-generated Spark code to read the data.
    For example: df = spark.read.format("parquet").load("Tables/&lt;shortcut-folder&gt;/&lt;file-path&gt;")

If possible, consider switching to Service Principal authentication for better security and manageability, especially in cross-tenant scenarios.
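If you go the service principal route, the ABFS OAuth settings look roughly like the sketch below. This is an illustration, not an official snippet: the account name, tenant ID, client ID, and client secret are placeholders you would supply from your own Azure AD app registration.

```python
# Sketch: ABFS OAuth (service principal) Spark settings for ADLS Gen2.
# All argument values are placeholders/assumptions to replace with your own.

def service_principal_configs(account: str, tenant_id: str,
                              client_id: str, client_secret: str) -> dict:
    """Return the Hadoop ABFS settings for OAuth client-credentials auth."""
    prefix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{prefix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{prefix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{prefix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{prefix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{prefix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a Fabric notebook you would then apply each setting:
# for key, value in service_principal_configs(...).items():
#     spark.conf.set(key, value)
```

Unlike the SAS provider approach, these OAuth settings do not depend on a provider class that may be missing from the runtime's classpath.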

 

Also, have a look at this similar post, which might help with your query:

Solved: read Azure Data Lake from notebook fabric - Microsoft Fabric Community

 

Thanks and regards,

Anjan Kumar Chippa

 

Hi @meiwah,

 

We wanted to follow up and check whether the solution provided above worked for you. Please let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

BhaveshPatel
Community Champion

 

 

You first need to connect via Azure Storage Explorer (multi-tenant).

(screenshot: BhaveshPatel_3-1760000866065.png)

Also, you need a GroupID and WorkspaceID to connect Microsoft Fabric to Azure Storage Explorer.

 

Apache Spark doesn't have ACID transactions, whereas Delta Lake does.

 

https://app.fabric.microsoft.com/groups/08946547d-e0b7-6578-b5ff-ff4a96567753056/synapsenotebooks/5a

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.
BhaveshPatel
Community Champion

Hi @meiwah 

 

Is it Apache Spark (PySpark) parquet or Delta Lake? (Both are parquet files, except Delta Lake adds a transaction log.)

 

Also, please explain why you need to connect to ADLS Gen2 (storage). You first need to connect via Azure Storage Explorer.

(screenshot: BhaveshPatel_1-1760000462262.png)

Also, you need a GroupID and WorkspaceID to connect Microsoft Fabric to Azure Storage Explorer.

 

https://app.fabric.microsoft.com/groups/08946547d-e0b7-6578-b5ff-ff4a96567753056/synapsenotebooks/5a...

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.

Hi Bhavesh, I'm running the code in a PySpark notebook in the Fabric environment, not in Azure. It is just a parquet file, not Delta. The reason for connecting to ADLS is to receive the data there. And I'm using a SAS token because it is cross-tenant, meaning Fabric and ADLS are in different tenants.
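For a cross-tenant SAS scenario like this, one workaround that avoids the missing Hadoop provider class entirely is to read the file with the Python storage stack instead of Spark's ABFS driver: pandas can read from ADLS Gen2 through the adlfs fsspec driver, which accepts a SAS token via storage_options. A sketch, assuming the adlfs package is available in the runtime; the account, container, path, and token values are placeholders:

```python
# Sketch: read a parquet file from ADLS Gen2 with a SAS token, bypassing
# Spark's SASTokenProvider requirement. Assumes the 'adlfs' package is
# installed; all argument values below are placeholders/assumptions.

def abfs_url(container: str, path: str) -> str:
    """Build the fsspec-style abfs:// URL for a file inside a container."""
    return f"abfs://{container}/{path.lstrip('/')}"

def read_parquet_with_sas(account: str, container: str, path: str, sas_token: str):
    # Imported here so abfs_url stays usable without pandas/adlfs installed.
    import pandas as pd
    return pd.read_parquet(
        abfs_url(container, path),
        storage_options={"account_name": account, "sas_token": sas_token},
    )

# Example (requires real credentials):
# df = read_parquet_with_sas("myaccount", "mycontainer", "folder/data.parquet", "<sas>")
# sdf = spark.createDataFrame(df)  # hand off to Spark if needed
```

This pulls the file through the driver node rather than in parallel across executors, so it fits small-to-medium files; for large data, the Lakehouse shortcut approach suggested above scales better.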
