Hi, we've had Synapse pipelines (running PySpark notebooks) reading from and writing to our Fabric OneLake for quite some time. Last night we started seeing authentication failures on both spark.read.parquet calls and parquet writes to the same lakehouse.
Our authentication setup adds the Synapse System Assigned Managed Identity (SAMI) to our Lakehouse as Contributor. This setup has been in place for a few months, so it was working.
The error message looks like this:
Py4JJavaError: An error occurred while calling o4256.parquet.
: java.nio.file.AccessDeniedException: Operation failed: "Forbidden", 403, HEAD, https://[OneLakeId].dfs.fabric.microsoft.com/[FilesystemId]/[LakehouseId]/Files/[Path]/FileContents?upn=false&action=getStatus&timeout=90
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.checkException(AzureBlobFileSystem.java:1443)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:652)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:640)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1760)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.exists(AzureBlobFileSystem.java:1236)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:120)
The URL appears to be the REST API call documented here: https://learn.microsoft.com/en-us/rest/api/storageservices/datalakestoragegen2/path/get-properties?v...
We can reproduce this behavior in interactive notebooks when configuring the session to run under SAMI, so there appears to be something wrong with the SAMI authenticating to our Lakehouse.
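For anyone trying to reproduce this, the reads and writes above were ordinary PySpark calls against an abfss:// path into OneLake. This is a minimal sketch; the GUIDs and file path are placeholders, not our real IDs, and the helper function name is my own:

```python
# Hypothetical IDs -- substitute your own workspace and lakehouse GUIDs.
def onelake_abfss_path(workspace_id: str, lakehouse_id: str, relative_path: str) -> str:
    """Build an abfss:// URI for a file under a Fabric lakehouse's Files area."""
    return (f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse_id}/Files/{relative_path}")

path = onelake_abfss_path("11111111-1111-1111-1111-111111111111",
                          "22222222-2222-2222-2222-222222222222",
                          "sales/2024.parquet")

# In a Synapse notebook whose session is configured to run under the SAMI:
# df = spark.read.parquet(path)          # fails with the 403 above
# df.write.mode("overwrite").parquet(path)  # likewise fails on the getStatus HEAD
```

The same path works when the session runs under a user identity, which is what points at the SAMI token rather than the path or the data.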
This ended up being a multi-day failure on Fabric's end; after a few days things started working again without any changes on our side. My guess is an auth issue between Synapse and OneLake.
Technically the issue is resolved, but without a root cause or any communication, we've already started migrating our solution elsewhere and are no longer using Microsoft Fabric.
Yes, I meant Contributor on the workspace. I had not configured OneLake data access roles before; they weren't needed for things to work. While my original post referenced spark.read.parquet, we were also writing to OneLake, and both worked fine without touching OneLake data access. I have now explicitly granted the SAMI full permissions over the entire OneLake; it doesn't seem to help.
Regarding the SAMI configuration settings, what else should I be looking for? As far as I know a system-assigned managed identity can't be changed, so I don't see what would have drifted there.
I don't think it's a network issue: the notebooks run fine under my identity, but the call fails as soon as I set the notebook to run under the managed identity.
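To isolate the token from Spark entirely, one thing I tried was replaying the exact HEAD request from the stack trace directly with a managed-identity token. A sketch of that, assuming the azure-identity and requests packages; the function name is mine, and the URL shape just mirrors the 403 in the trace:

```python
def get_status_url(account: str, workspace_id: str, lakehouse_id: str, path: str) -> str:
    """Rebuild the HEAD ...?action=getStatus URL from the 403 in the stack trace."""
    return (f"https://{account}.dfs.fabric.microsoft.com/"
            f"{workspace_id}/{lakehouse_id}/Files/{path}"
            "?upn=false&action=getStatus&timeout=90")

# On a host with the managed identity available (e.g. a Synapse notebook):
# from azure.identity import ManagedIdentityCredential
# import requests
# token = ManagedIdentityCredential().get_token("https://storage.azure.com/.default")
# resp = requests.head(
#     get_status_url("onelake", "<workspace-guid>", "<lakehouse-guid>", "some/file.parquet"),
#     headers={"Authorization": f"Bearer {token.token}"})
# resp.status_code  # 200 means the SAMI can read; 403 reproduces the Spark failure
```

If this bare request also returns 403 while the same URL succeeds with a user token, that rules out Spark and Hadoop configuration and leaves the identity-to-OneLake authorization path.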
Hi @kchung_msft
The error message you mentioned, java.nio.file.AccessDeniedException: Operation failed: "Forbidden", 403, indicates a permissions issue. Here are a few suggestions to troubleshoot the issue:
Check Permissions: Ensure that the SAMI has the necessary permissions on the Fabric Lakehouse. You said it's added "to your Lakehouse as Contributor." Do you mean Contributor of workspace? Did you configure OneLake data access roles for the lakehouse?
Review Configuration: Verify that the configuration settings for the SAMI in your Synapse workspace are correct and haven’t been altered recently.
Network Access: Ensure that no network restrictions or firewall rules are blocking access to the Fabric Lakehouse, and check whether anything in that area changed recently.
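For the first point, one way to confirm the SAMI actually holds a workspace role is the Fabric REST API's List Workspace Role Assignments endpoint. A rough sketch, assuming you already have an AAD token for the Fabric API; the helper function is illustrative, not part of any SDK:

```python
def find_principal_role(assignments: list, principal_id: str):
    """Return the role held by principal_id in the assignment list, or None."""
    for a in assignments:
        if a.get("principal", {}).get("id") == principal_id:
            return a.get("role")
    return None

# import requests
# token = "..."  # AAD token scoped to https://api.fabric.microsoft.com/.default
# resp = requests.get(
#     f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/roleAssignments",
#     headers={"Authorization": f"Bearer {token}"})
# role = find_principal_role(resp.json()["value"], sami_object_id)
# Expect "Contributor" (or higher) for the SAMI's object ID; None means no assignment.
```

If the assignment is present and reads/writes still fail, the problem is more likely in token issuance or OneLake's authorization layer than in the role configuration.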
Best Regards,
Jing
Community Support Team