I would like to connect to a Parquet file in ADLS Gen 2 from a Fabric PySpark notebook. How do I set the Spark config?
The following results in an error.
Hi @meiwah,
Thank you for reaching out to Microsoft Fabric Community.
Thank you @BhaveshPatel for the prompt response.
As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided above worked for you. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hi Anjan,
It has not been resolved. I would like to use a SAS token to access a Parquet file in ADLS Gen 2 from a Fabric PySpark notebook, but the class org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider used to set the config is not available in the Fabric environment, as shown in the error message below. Is there another way to use SAS to access ADLS Gen 2? Thanks!
"Unable to load SAS token provider class: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider not foundjava.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider not found"
Hi @meiwah,
Based on your requirement, you can use Lakehouse shortcuts in Microsoft Fabric to connect to ADLS Gen2 using a SAS token. Please follow the steps below:
If possible, consider switching to Service Principal authentication for better security and manageability, especially in cross-tenant scenarios.
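If you go that route, here is a minimal sketch of the service principal (OAuth client credentials) Spark configuration for ABFS; all values are placeholders, and the service principal is assumed to have a role such as Storage Blob Data Reader on the storage account or container:

```python
# Sketch: service principal access to ADLS Gen2 via ABFS OAuth config.
# "spark" is the SparkSession provided by the Fabric notebook; all values are placeholders.
storage_account = "<storage-account>"   # placeholder
client_id = "<app-client-id>"           # placeholder
client_secret = "<app-client-secret>"   # placeholder
tenant_id = "<adls-tenant-id>"          # placeholder (the tenant that owns the storage account)

base = f"{storage_account}.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{base}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{base}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{base}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{base}", client_secret)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{base}",
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
)

df = spark.read.parquet(f"abfss://<container>@{base}/path/to/file.parquet")
```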
Also, go through this similar post which might help you with your query.
Solved: read Azure Data Lake from notebook fabric - Microsoft Fabric Community
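Once a shortcut to the ADLS Gen2 location has been created on the Lakehouse, the file can be read through the relative Files path. A minimal sketch, assuming a hypothetical shortcut named adls_data and the Lakehouse attached as the notebook's default:

```python
# Sketch: reading a Parquet file through a Lakehouse shortcut.
# "adls_data" is a hypothetical shortcut name under the Lakehouse Files section.
df = spark.read.parquet("Files/adls_data/path/to/file.parquet")
df.show(5)
```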
Thanks and regards,
Anjan Kumar Chippa
Hi @meiwah,
We wanted to kindly follow up to check whether the solution provided above worked for you. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
You first need to connect to Azure Storage Explorer (multi-tenant).
You also need a Group ID and Workspace ID to connect Microsoft Fabric to Azure Storage Explorer.
Apache Spark doesn't have ACID transactions, whereas Delta Lake does.
https://app.fabric.microsoft.com/groups/08946547d-e0b7-6578-b5ff-ff4a96567753056/synapsenotebooks/5a
Hi @meiwah
Is it Azure Spark (PySpark) or Delta Lake? (Both are Parquet files, except Delta Lake has a transaction log alongside the Parquet.)
Also, please include why you need to connect to ADLS Gen 2 (storage). You first need to connect to Azure Storage Explorer.
You also need a Group ID and Workspace ID to connect Microsoft Fabric to Azure Storage Explorer.
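To illustrate the Parquet vs Delta distinction mentioned above, a minimal sketch of how each is read in PySpark (both paths are placeholders):

```python
# Sketch: plain Parquet vs Delta Lake reads in a Fabric PySpark notebook (paths are placeholders).
plain_df = spark.read.parquet("Files/some_folder/file.parquet")        # plain Parquet, no transaction log
delta_df = spark.read.format("delta").load("Tables/some_delta_table")  # Delta: Parquet files plus _delta_log
```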
Hi Bhavesh, I'm running the code in a PySpark notebook in the Fabric environment, not in Azure. It is just a Parquet file, not Delta. The reason for connecting to ADLS is that the data is received there. And I'm using a SAS token because it is cross-tenant, meaning Fabric and ADLS are in different tenants.