I am trying to output a CSV file from my lakehouse to S3 using a Copy job activity (I have also tried a pipeline) but get the same error. The error suggests that the AWS user (based on key and secret) does not have permission to create a bucket. I am not trying to create the bucket, only to put files into it. This S3 bucket belongs to my customer and I cannot get CreateBucket permissions. The user has ListBuckets, ListBucket, Get, and Put permissions.
Hi @ljois,
Thank you for the response and confirmation. Since the regional endpoint is now correct, the issue is likely caused by missing multipart upload permissions. Even for small files, Fabric/ADF may initiate multipart upload APIs in the background. Please ask the bucket owner to add the necessary object-level multipart upload permissions, if not already present.
Additionally, confirm that your IAM user has s3:HeadBucket at the bucket level. This is the API Fabric uses to validate the bucket before writing, and without it the service may misreport the failure as a missing CreateBucket permission.
If it still fails after adding this, please check the bucket policy (not just the IAM policy). The bucket policy may explicitly deny cross-account access unless certain conditions are met.
Also check the ownership/ACL settings; for example, if the bucket is in a different AWS account, confirm whether Object Ownership = Bucket owner enforced is enabled.
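The multipart-related permission list did not survive in the post above. As a reference only, a minimal identity policy covering the object- and bucket-level actions an S3 copy sink typically needs might look like the following; the bucket name is a placeholder and the exact action list is my assumption, not something confirmed in this thread:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "FabricCopyObjectAccess",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:AbortMultipartUpload",
        "s3:ListMultipartUploadParts"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    },
    {
      "Sid": "FabricCopyBucketAccess",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation",
        "s3:ListBucketMultipartUploads"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name"
    }
  ]
}
```

Note that the HeadBucket API call is authorized through s3:ListBucket, so granting that action at the bucket level also covers the pre-write validation check.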
Thanks and regards,
Anjan Kumar Chippa
I have tried the regional endpoint and it still fails with the same error. Apart from the multipart permissions, all of them are there. The files are very small, so I don't know if that could be the problem. I will ask for the multipart permissions to be added and see.
Hi @ljois ,
As far as I understand, this is only about reading. If you only need explicit permissions to read the bucket, the following are sufficient:
- s3:ListAllMyBuckets
- s3:GetBucketLocation
- s3:ListBucket
- s3:GetObject
We use these permissions for our S3 buckets, and they work.
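Expressed as an identity policy, the read-only permission set above might look like the following sketch; the bucket name and statement IDs are placeholders of mine:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListAllBuckets",
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    },
    {
      "Sid": "ReadBucket",
      "Effect": "Allow",
      "Action": ["s3:GetBucketLocation", "s3:ListBucket"],
      "Resource": "arn:aws:s3:::your-bucket-name"
    },
    {
      "Sid": "ReadObjects",
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```

s3:ListAllMyBuckets only accepts `*` as its resource, which is why it sits in its own statement, while bucket-level and object-level actions are scoped to their respective ARN forms.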
Best regards
Hi @ljois,
We wanted to kindly follow up to check whether the solution provided resolved the issue. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hi @ljois,
As we haven't heard back from you, we wanted to kindly follow up to check whether the solution provided resolved the issue. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hi @ljois,
Thank you for reaching out to Microsoft Fabric Community.
The error here does not mean Fabric is actually trying to create a new bucket; it occurs when the S3 connection points at the wrong endpoint/region. Please configure the connection with the bucket's regional endpoint.
With the correct regional endpoint and these permissions, the copy job should succeed without needing CreateBucket rights.
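For illustration, assuming the bucket lives in us-east-1 (the region here is a placeholder; substitute your bucket's actual region), the change is from the global endpoint to the region-specific one:

```
# Global endpoint (can trigger region validation fallbacks):
https://s3.amazonaws.com

# Regional endpoint for a bucket in us-east-1:
https://s3.us-east-1.amazonaws.com
```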
Thanks and regards,
Anjan Kumar Chippa
Hi Anjan - Similar issue here, and we've followed the advice provided with no success. See the error and steps taken below:
The following permissions have been granted to the role for the 1172-input-forma-ai bucket:
ErrorCode=UserErrorWriteFailedFileOperation,
'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file operation is failed, upload file failed at path: '1172-input-forma-ai/silver_dates.txt'.,Source=Microsoft.DataTransfer.ClientLibrary,'
'Type=Amazon.S3.AmazonS3Exception,Message=User: arn:aws:iam::900632042242:user/1172_access is not authorized to perform: s3:CreateBucket on resource: "arn:aws:s3:::1172-input-forma-ai" because no identity-based policy allows the s3:CreateBucket action,Source=AWSSDK.Core,'
'Type=Amazon.Runtime.Internal.HttpErrorResponseException,Message=The remote server returned an error: (403) Forbidden.,Source=AWSSDK.Core,'
'Type=System.Net.WebException,Message=The remote server returned an error: (403) Forbidden.,Source=System,'
Hi @markwclancy,
Thank you for reaching out to Microsoft Fabric Community.
If the issue still persists, please consider raising a support ticket for further assistance. To raise a support ticket, kindly follow the steps outlined in the following guide:
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn
Thanks and regards,
Anjan Kumar Chippa
Thanks for the response. Those suggestions did not work, but we did get a successful transfer when we added createBucket to the policy. I suspect there's a bug in Fabric where it's trying to check for createBucket permissions even for copy jobs that don't need to create a bucket. For others with the same issue, our steps were:
- we used the default URL: https://s3.amazonaws.com
- our bucket policy was as follows:
  {
    "Sid": "Statement1",
    "Effect": "Allow",
    "Action": [
      "s3:GetBucketLocation",
      "s3:GetBucketVersioning",
      "s3:ListBucketMultipartUploads",
      "s3:GetObject*",
      "s3:ListBucket",
      "s3:PutObject*",
      "s3:AbortMultipartUpload",
      "s3:DeleteObject*",
      "s3:CreateBucket"
    ],
    "Resource": [
      "*"
    ]
  }
Hi @markwclancy ,
Here is one idea you could try.
Create a lakehouse if you don't already have one, and create a connection with a shortcut to your S3 bucket.
Is access then possible as desired?
Based on your permissions, this should work.
Best regards