Hi,
I have a scenario where I need to write a PySpark DataFrame directly into a storage account container. Fabric has the Storage Blob Contributor permission on the container; however, I am still getting a 403 error saying there is no permission to write.
Any hint as to what I am missing here?
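For concreteness, here is a minimal sketch of the kind of write involved (the account, container, and folder names are placeholders):

```python
# Minimal repro from a Fabric notebook (uses the notebook's built-in
# `spark` session); account, container, and folder names are placeholders.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])

# Write straight to the ADLS Gen2 container via its abfss URI:
df.write.mode("overwrite").parquet(
    "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/exports/demo"
)
# -> fails with a 403 (authorization) error despite the RBAC role assignment
```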
Hi @Subha45,
Even with RBAC roles like Storage Blob Contributor, ADLS Gen2 also requires filesystem-level ACLs.
Continue using the Lakehouse shortcut, which handles permissions internally in Microsoft Fabric for secure and seamless access.
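For illustration, a minimal sketch of writing through a shortcut rather than the raw storage URI (the workspace, lakehouse, and shortcut names are placeholders, and the shortcut is assumed to point at your target container folder):

```python
# Sketch: write through a OneLake shortcut instead of the raw ADLS URI.
# "MyWorkspace", "MyLakehouse", and "MyShortcut" are placeholders;
# `df` is the DataFrame you are saving from your notebook.
shortcut_path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyLakehouse.Lakehouse/Files/MyShortcut/exports/demo"
)
df.write.mode("overwrite").parquet(shortcut_path)
```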
Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support
We would like to confirm whether our community member's answer resolves your query, or if you need further help. If you still have any questions or need more support, please feel free to let us know; we are happy to help.
Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support
@BhaveshPatel, thanks for your prompt response.
Hi,
Thanks! However, my query remains unresolved.
Thanks for the reply. Due to client security restrictions it's difficult for me to share screenshots; however, I can elaborate on the situation.
I am trying to save a DataFrame from a notebook into a storage container folder as a file.
The save works if I have a shortcut to the location and use the shortcut's abfss path.
But when I address the storage container folder directly via its abfss path, without any shortcut, it throws a 403 error saying the request does not have permission.
So I was wondering: can a notebook only write to an ADLS Gen2 location through a Lakehouse shortcut reference, or is there a way to write to the location directly?
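To make the two cases concrete, here is a minimal sketch of both writes (all workspace, lakehouse, shortcut, account, and container names are placeholders; `df` is the DataFrame being saved):

```python
# 1) Via a Lakehouse shortcut -- this write succeeds:
df.write.mode("overwrite").parquet(
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyLakehouse.Lakehouse/Files/MyShortcut/exports/demo"
)

# 2) Addressing the container directly -- this returns the 403:
df.write.mode("overwrite").parquet(
    "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/exports/demo"
)
```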
Hi @Subha45
Could you please elaborate on the above issue and share some examples with screenshots?