ljois
New Member

Copy from lakehouse to S3 fails with the AWS error "Do not have permissions to create bucket"

I am trying to output a CSV file from my lakehouse to S3 using a Copy job activity (I have also tried a pipeline) but get the same error. The error suggests that the AWS user (based on the key and secret) does not have permission to create a bucket. I am not trying to create the bucket; I am trying to put files into it. The S3 bucket belongs to my customer and I cannot get CreateBucket permissions. The user has ListBuckets, ListBucket, Get, and Put permissions.

1 ACCEPTED SOLUTION

Hi @ljois,

 

Thank you for the response and confirmation. Since the regional endpoint is now correct, the issue is likely caused by missing multipart upload permissions. Even for small files, Fabric/ADF may initiate multipart upload APIs in the background. Please ask the bucket owner to add the following object-level permissions (if not already present):

  • s3:PutObject
  • s3:AbortMultipartUpload
  • s3:ListMultipartUploadParts

Also confirm that your IAM user has s3:HeadBucket at the bucket level. This is the API Fabric uses to validate the bucket before writing, and without it the service may misreport the failure as a create-bucket permission issue.


If it still fails after adding these, please check the bucket policy (not just the IAM policy); the bucket policy may explicitly deny cross-account access unless certain conditions are met.
Also check the ownership/ACL settings. For example, if the bucket is in a different AWS account, confirm whether Object Ownership = Bucket owner enforced is enabled.
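The object-level additions above could be sketched as a single IAM policy statement like the following (the bucket name is a placeholder, and this is an illustration of the suggestion rather than a verified minimal policy):

```json
{
  "Sid": "FabricObjectWrites",
  "Effect": "Allow",
  "Action": [
    "s3:PutObject",
    "s3:AbortMultipartUpload",
    "s3:ListMultipartUploadParts"
  ],
  "Resource": "arn:aws:s3:::your-customer-bucket/*"
}
```

Note that these object-level actions are granted on the object ARN (the bucket name followed by /*), while bucket-level checks such as s3:ListBucket must be granted on the bucket ARN itself.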

 

Thanks and regards,

Anjan Kumar Chippa

View solution in original post

11 REPLIES
ljois
New Member

I have tried the regional endpoint and it still fails for the same reason. Apart from the multipart permissions, all of them are there. The files are very small, so I don't know if that could be the problem. I will ask for the multipart permissions to be added and see.

Hi @ljois,

If you only need permission to read the bucket, the following are sufficient. As far as I understand, this is only about reading:

 

- s3:ListAllMyBuckets

- s3:GetBucketLocation
- s3:ListBucket

- s3:GetObject

We use these permissions for our S3 buckets, and they work.
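For reference, the read-only set above could be sketched as policy statements like these (the bucket name is a placeholder; note that s3:ListAllMyBuckets can only be granted on all resources, while the other actions are scoped to the bucket and its objects):

```json
[
  {
    "Sid": "ListAllBuckets",
    "Effect": "Allow",
    "Action": "s3:ListAllMyBuckets",
    "Resource": "*"
  },
  {
    "Sid": "ReadBucketMetadata",
    "Effect": "Allow",
    "Action": ["s3:GetBucketLocation", "s3:ListBucket"],
    "Resource": "arn:aws:s3:::example-bucket"
  },
  {
    "Sid": "ReadObjects",
    "Effect": "Allow",
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::example-bucket/*"
  }
]
```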

 

Best regards


Hi @ljois,

 

We wanted to kindly follow up and check whether the solution provided resolved your issue. Please let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

Hi @ljois,

 

As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided resolved your issue. Please let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

v-achippa
Community Support

Hi @ljois,

 

Thank you for reaching out to Microsoft Fabric Community.

 

The error here does not mean Fabric is actually trying to create a new bucket; it usually happens when the S3 connection points at the wrong endpoint/region. Please follow the steps below:

  • In Fabric, make sure the connection type is Amazon S3 (not S3 Compatible).
  • Edit your Amazon S3 connection and set Url to the regional endpoint for the customer’s bucket.
  • Ask the bucket owner to confirm the correct region, and use that region for the connection’s endpoint.
  • Also, large files are uploaded via multipart upload, so additional permissions are required by Fabric/ADF during both the connection test and copy execution. Please ask the bucket owner to extend the IAM policy to include at least the following actions:
    • s3:ListBucket
    • s3:GetBucketLocation
    • s3:HeadBucket
    • s3:ListBucketMultipartUploads
  • If it is still failing, also add these for object-level writes:
    s3:PutObject, s3:AbortMultipartUpload, s3:ListMultipartUploadParts

With the correct regional endpoint and these permissions, the copy job will succeed without needing CreateBucket rights.
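The bucket-level actions listed in the steps above could be sketched as a statement like this (illustrative only, with a placeholder bucket name; s3:HeadBucket appears here as suggested in this thread, although the HeadBucket API call is normally authorized by s3:ListBucket):

```json
{
  "Sid": "FabricBucketChecks",
  "Effect": "Allow",
  "Action": [
    "s3:ListBucket",
    "s3:GetBucketLocation",
    "s3:HeadBucket",
    "s3:ListBucketMultipartUploads"
  ],
  "Resource": "arn:aws:s3:::your-customer-bucket"
}
```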

 

 

Thanks and regards,

Anjan Kumar Chippa

Hi Anjan - We have a similar issue here and we've followed the advice provided with no success. See the error and steps taken below:

 
  • We have successfully connected to S3 using both https://s3.amazonaws.com and https://s3.us-east-1.amazonaws.com
  • We can browse the S3 buckets from within Fabric and are able to select the target S3 bucket, 1172-input-forma-ai
  • It's strange that Fabric appears to be trying to create a bucket since it clearly knows that the bucket already exists.
  • The following permissions have been granted to the role for the 1172-input-forma-ai bucket:

    • "s3:ListAllMyBuckets"
    • "s3:GetBucketLocation"
    • "s3:GetBucketVersioning"
    • "s3:ListBucketMultipartUploads"
    • "s3:GetObject*"
    • "s3:ListBucket"
    • "s3:HeadBucket"
    • "s3:PutObject*"
    • "s3:AbortMultipartUpload"
    • "s3:DeleteObject*"
  • We have tried explicitly stating the "Directory" and "File Name" and have also tried without either.
  • We have examined the logs in AWS and it also shows the attempt to create a bucket.
 
Specific error message listed below.
 
Thanks!
 
 
More Details:
 

ErrorCode=UserErrorWriteFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file operation is failed, upload file failed at path: '1172-input-forma-ai/silver_dates.txt'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Amazon.S3.AmazonS3Exception,Message=User: arn:aws:iam::900632042242:user/1172_access is not authorized to perform: s3:CreateBucket on resource: "arn:aws:s3:::1172-input-forma-ai" because no identity-based policy allows the s3:CreateBucket action,Source=AWSSDK.Core,''Type=Amazon.Runtime.Internal.HttpErrorResponseException,Message=The remote server returned an error: (403) Forbidden.,Source=AWSSDK.Core,''Type=System.Net.WebException,Message=The remote server returned an error: (403) Forbidden.,Source=System,'

 
 

 

 

Hi @markwclancy,

 

Thank you for reaching out to Microsoft Fabric Community.

 

  • Use the correct connection type and endpoint: the connection type must be Amazon S3, not S3 Compatible. In the Fabric connection, set Url to the regional endpoint of the bucket. In the copy activity, set the bucket name to exactly your bucket name (for example, 1172-input-forma-ai).
  • Use Directory and File name only for prefixes and objects; do not put the bucket name there.
  • Confirm the required permissions and check the bucket policy and ownership.


If the issue still persists, please consider raising a support ticket for further assistance. To raise a support ticket, kindly follow the steps outlined in the following guide:

How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn

 

Thanks and regards,

Anjan Kumar Chippa

Thanks for the response. Those suggestions did not work, but we did get a successful transfer when we added s3:CreateBucket to the policy. I suspect there's a bug in Fabric where it checks for CreateBucket permission even for copy jobs that don't need to create a bucket. For others with the same issue, our steps were:
- we used the default URL: https://s3.amazonaws.com
- our bucket policy was as follows:
{
  "Sid": "Statement1",
  "Effect": "Allow",
  "Action": [
    "s3:GetBucketLocation",
    "s3:GetBucketVersioning",
    "s3:ListBucketMultipartUploads",
    "s3:GetObject*",
    "s3:ListBucket",
    "s3:PutObject*",
    "s3:AbortMultipartUpload",
    "s3:DeleteObject*",
    "s3:CreateBucket"
  ],
  "Resource": [
    "*"
  ]
}
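If granting Resource: "*" is too broad for your environment, a tighter variant of the same statement scopes the resources to the specific bucket and its objects (bucket name taken from this thread; whether the workaround still succeeds with this scoping has not been verified here):

```json
"Resource": [
  "arn:aws:s3:::1172-input-forma-ai",
  "arn:aws:s3:::1172-input-forma-ai/*"
]
```

Bucket-level actions (ListBucket, GetBucketLocation, CreateBucket, and so on) match the bucket ARN, while object-level actions (GetObject*, PutObject*, and so on) match the /* ARN, so both entries are needed in one statement.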

Hi @markwclancy ,

 

One idea you could try: create a lakehouse if you don't already have one, and create a connection with a shortcut to your S3 bucket.
Can you then access the data as desired?
Based on your permissions, this should work.

 

Best regards

Hi @ljois,

 

As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided resolved your issue. Please let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa
