SukiB
Regular Visitor

Trouble querying data in ADLS Gen2 from Fabric using a shortcut

Hi. Hoping someone knows how to tackle this one; I've been stuck on it for a couple of days now 😞


When I run the following code:

 

df = spark.read.format("parquet").load("Files/landing-zone/yellow_tripdata_2024-01.parquet")
display(df)

 

It works as expected.

 

However, when I run the following code:

 

df = spark.read.format("parquet").load("Files/landing-zone/*")

 

I receive the following error:

Spark_System_ABFS_OperationFailed - An operation with ADLS Gen2 has failed, which is typically due to a permissions issue.
Ensure that the user running the Spark job has the Storage Blob Data Contributor role assigned to all referenced ADLS Gen2 resources.


Check the Spark logs for the storage account name experiencing the issue.

 

I’ve confirmed that the Storage Blob Data Contributor role has been granted, but the logs indicate a SAS-related issue, which I find puzzling.

 

It's not a firewall issue. I even went as far as creating a brand-new storage account, just in case.

 

Hoping someone has encountered this before. If you have any insights into what might be going on, please share.

1 ACCEPTED SOLUTION
SukiB
Regular Visitor

Quick update on this:

I raised a support ticket with Microsoft Fabric Support, and they’ve conducted an in-depth investigation. They were able to replicate the issue when uploading a single Parquet file via the shortcut method above. For this test, I used NYC data from yellow_tripdata_2024-01.parquet.

Interestingly, the issue disappears when multiple files are present in storage or when using other file formats. This confirms that the problem is specific to single Parquet files accessed via shortcuts.

I've now shared this thread with the team, and they have escalated it as a potential bug. Once a solution is found, they will provide an update here.

In the meantime, as a workaround for Parquet files, I'm using:


df = spark.read.parquet("abfss://container@storage_account.dfs.core.windows.net/*")

display(df)

Hope this helps anyone facing a similar issue!

SukiB


17 REPLIES

Anonymous
Not applicable

Hi Sukhminder,

 

Here’s a workaround we can offer for this issue, as it is a known bug.

 

In this case, the file is placed at the container level, which makes it inaccessible due to a known bug in the Fabric product. Below is the link to the bug item and a supporting document listing it as a known issue.

 

Known Issues Tracker - [External ADLS] Shortcuts Load to Tables does not work

 

Bug Item created on this Bug 1548911 Head call to OneLake for External ADLS is failing at container level.

 

However, to ease your productivity, we have prioritized this and identified workarounds for testing:

 

  1. Place the files at the folder level within the container and create a shortcut to that specific folder where the Parquet files are located.


 

  2. Alternatively, use the internal (OneLake) storage as the shortcut target.
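Since the bug only bites when the shortcut target is the container root, a quick way to check an existing shortcut is to inspect its ABFSS URL. This small helper is purely illustrative (it is not part of Fabric or the Azure SDK); it just tests whether the URL has any path beyond the container:

```python
from urllib.parse import urlparse

def points_at_container_root(abfss_url: str) -> bool:
    """Return True when an ABFSS URL targets the container root
    (the case hit by the known bug), False when it targets a folder."""
    # abfss://<container>@<account>.dfs.core.windows.net/<optional path>
    return urlparse(abfss_url).path.strip("/") == ""

# A container-root shortcut is affected; a folder-level one is not.
print(points_at_container_root("abfss://container@account.dfs.core.windows.net/"))             # True
print(points_at_container_root("abfss://container@account.dfs.core.windows.net/landing-zone")) # False
```

If the helper returns True for your shortcut URL, re-create the shortcut against a subfolder as described in step 1.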

 

The ETA for the bug fix is the end of March.

 

v-saisrao-msft
Community Support

Hi @SukiB,

Thank you for reaching out to the Microsoft Forum Community.

 

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.

Thanks for following up. Unfortunately, the issue remains unresolved. I have raised a support ticket with Microsoft's Fabric Support Team, and they are currently investigating further. It doesn’t appear to be a straightforward issue to resolve, but as soon as I have a definitive solution, I’ll get back to you and share how we resolved it.

Hi @SukiB,

Thanks for your valuable feedback.

SukiB
Regular Visitor

Just trying to narrow this issue down - surprisingly this works...

df = spark.read.parquet("abfss://container@storage_account.dfs.core.windows.net/*")
display(df)

So the issue, I guess, has something to do with how my shortcuts in Fabric connect to the underlying storage account. Is there a config in Fabric that I haven't factored in, by chance?
 

Not sure if we have to do something special while creating shortcuts; it's a straightforward process in the UI.

Anyway, you can mark your last message as the solution, as it might help someone struggling with this kind of issue.

spaceman127
Frequent Visitor

Hello @SukiB ,

 

 

Have you checked the ACLs for the respective folder?
I have just tested what you have configured, and it works perfectly.

In my example, I created folders manually and uploaded Parquet files to each folder. Then I tested it and it works.

How did you load the data? There may be a problem with loading the data and the subsequent authorizations.

 

Best regards

Thanks spaceman127. I've created a brand-new storage account from scratch and given permission to the user via IAM. My Fabric user is assigned the following roles:

Owner, Contributor, Storage Blob Data Contributor and Storage Blob Data Reader.


Regarding the ACLs, I've set the security principals as follows...

Owner: $superuser - Read, Write, Execute
Owning group: $superuser - Read, Write, Execute
Other - Read, Write, Execute


Still get the same error.

Hello @SukiB 


Did you enable the hierarchical namespace?

 

 

Verify execute permissions on parent folders and read permissions on files in ADLS Gen2’s hierarchical namespace
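For the ACL check above, the Azure CLI can set POSIX permissions on a folder directly. This is only a sketch: the account name, container, and path below are placeholders, and it assumes you are signed in with an identity that can manage the storage account:

```shell
# Sketch: grant rwx to the owner, r-x to the group, and execute-only to others
# on the folder holding the Parquet files. All names here are placeholders.
az storage fs access set \
    --account-name mystorageacct \
    --file-system mycontainer \
    --path landing-zone \
    --acl "user::rwx,group::r-x,other::--x" \
    --auth-mode login
```

Remember that in a hierarchical namespace, the reading identity needs execute (`x`) on every parent folder as well as read (`r`) on the files themselves.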

 

 

 

https://community.fabric.microsoft.com/t5/Fabric-platform/Not-able-to-connect-to-ADLS-with-shortcut-...

 

Thanks Nilendra.  

The hierarchical namespace is enabled. Under Manage ACL, I've provisioned everything with Read, Write, and Execute, just in case. Still no luck, I'm afraid.

FabianSchut
Super User

I found a similar post here with some errors while reading multiple CSV files: https://community.fabric.microsoft.com/t5/Fabric-platform/Read-multiple-files-in-Fabric-Notebook/td-.... Can you try to only include the folder which contains the files, like this:

df = spark.read.format("parquet").load("Files/landing-zone")

Thanks FabianSchut.

Tried both the following...


df = spark.read.format("parquet").load("Files/landing-zone")
as well as
df = spark.read.format("parquet").option("header","true").load("Files/landing-zone")
again - same error...

Spark_System_ABFS_OperationFailed
An operation with ADLS Gen2 has failed. This is typically due to a permissions issue. 1. Please ensure that for all ADLS Gen2 resources referenced in the Spark job, that the user running the code has RBAC roles "Storage Blob Data Contributor" on storage accounts the job is expected to read and write from. 2. Check the logs for this Spark application. Inspect the logs for the ADLS Gen2 storage account name that is experiencing this issue.

Furthermore, do you know how the shortcut authorization is set up? Did you create the shortcut to ADLS Gen2 yourself, or did you use an existing shortcut? The shortcut is probably set up with a SAS token, given the error message. Make sure the SAS token has sufficient permissions, or set up a new shortcut with your user account as the authorization and try again.
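Another way to take the shortcut out of the equation is to read ADLS Gen2 directly over ABFSS and supply the credential through Spark configuration. This is a sketch only: the account, container, and key values are placeholders, it needs a live Spark session in Fabric, and in practice a Key Vault or workspace identity is preferable to pasting a raw key:

```python
# Sketch: bypass the Fabric shortcut and read ADLS Gen2 directly.
# "mystorageacct", "mycontainer", and the key value are placeholders.
spark.conf.set(
    "fs.azure.account.key.mystorageacct.dfs.core.windows.net",
    "<storage-account-key>",  # placeholder; prefer Key Vault in practice
)

df = spark.read.parquet(
    "abfss://mycontainer@mystorageacct.dfs.core.windows.net/landing-zone/*"
)
display(df)
```

If the direct read succeeds while the shortcut path fails, that points at the shortcut's stored credential rather than the RBAC/ACL setup.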

I created the shortcut myself, pointing to Azure Data Lake Storage Gen2 and using an account key for authorisation.

What's weird is that whether I use account key or SAS authorisation, I get the same error...


Spark_System_ABFS_OperationFailed
An operation with ADLS Gen2 has failed. This is typically due to a permissions issue. 1. Please ensure that for all ADLS Gen2 resources referenced in the Spark job, that the user running the code has RBAC roles "Storage Blob Data Contributor" on storage accounts the job is expected to read and write from. 2. Check the logs for this Spark application. Inspect the logs for the ADLS Gen2 storage account name that is experiencing this issue.

But loading from OneDrive in Fabric works, so I know the code is good. In summary...

df = spark.read.parquet("Files/landing-zone-sas/*")        # doesn't work
df = spark.read.parquet("Files/landing-zone-accountkey/*") # doesn't work
df = spark.read.parquet("Files/landing-zone-onedrive/*")   # works (files uploaded via OneDrive)

 
Any further thoughts - much appreciated.

nilendraFabric
Community Champion

Hello @SukiB 

Give this a try 
df = spark.read.parquet("Files/landing-zone/*.parquet")

df = spark.read.parquet("mylakehouse/Files/landing-zone/*.parquet")


Thanks

Thanks Nilendra.  Still no luck...

 

Spark_System_ABFS_OperationFailed
An operation with ADLS Gen2 has failed. This is typically due to a permissions issue. 1. Please ensure that for all ADLS Gen2 resources referenced in the Spark job, that the user running the code has RBAC roles "Storage Blob Data Contributor" on storage accounts the job is expected to read and write from. 2. Check the logs for this Spark application. Inspect the logs for the ADLS Gen2 storage account name that is experiencing this issue.
