MelisandeRicou
New Member

Shortcut to AWS S3 - AWS Data Transfer Costs

Hi,

I am investigating the option of creating a Fabric shortcut to an AWS S3 Standard bucket.
The bucket contains about 200 GB of data in Parquet files that are refreshed daily.
In Fabric, this data would need to be accessed and refreshed daily to update semantic models.

I understand that AWS egress fees (Data Transfer costs) apply when data is transferred out of AWS, and that these could be significant.

I am trying to determine what the AWS Data Transfer costs would be to read 200 GB of data from that S3 bucket, via a shortcut, on a daily basis.
Does the shortcut technology allow for a significant reduction of these costs? If so, how?
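
For context, here is the back-of-the-envelope baseline I am working from. The $0.09 per GB rate is an assumption based on AWS's published "Data Transfer OUT to the internet" tier for the first 10 TB per month, and should be checked against current S3 pricing for the bucket's region:

# Rough baseline if the full bucket were read out of AWS once per day.
# The egress rate below is an assumption (first 10 TB/month internet tier);
# verify against current AWS pricing for the bucket's region.
DAILY_READ_GB = 200
EGRESS_RATE_USD_PER_GB = 0.09
DAYS_PER_MONTH = 30

daily_cost = DAILY_READ_GB * EGRESS_RATE_USD_PER_GB
monthly_cost = daily_cost * DAYS_PER_MONTH

print(f"Daily egress:   ~${daily_cost:,.2f}")    # ~$18.00
print(f"Monthly egress: ~${monthly_cost:,.2f}")  # ~$540.00

So with no caching or incremental reads, a full daily read would be on the order of $540 per month, which is why I am asking whether the shortcut technology reduces this.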

4 REPLIES
v-prasare
Community Support

We would like to confirm whether our community member's answer resolves your query or if you need further help. If you still have any questions or need more support, please feel free to let us know. We are happy to help you.

Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support

v-prasare
Community Support

We would like to confirm whether our community member's answer resolves your query or if you need further help. If you still have any questions or need more support, please feel free to let us know. We are happy to help you.

@Rufyda, thanks for your prompt response.

Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support

v-prasare
Community Support

Hi @MelisandeRicou,

 

  • A shortcut does not copy data; it streams it from S3.

  • On-demand reads trigger AWS S3 egress (billed as “DataTransfer-Out-Bytes”); see the sketch below for one way to check what you are actually billed.

  • Caching in Fabric can reduce repeated egress.
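
If you want to see the egress charges you are actually incurring, one option is to query AWS Cost Explorer for S3 usage types that contain "DataTransfer-Out-Bytes". The sketch below is illustrative only: it uses boto3, assumes Cost Explorer is enabled and the caller has ce:GetCostAndUsage permission, and the date range is a placeholder.

import boto3

# Illustrative sketch: daily S3 cost/usage from AWS Cost Explorer,
# grouped by usage type so the data-transfer-out lines are visible.
# Cost Explorer is served from the us-east-1 endpoint.
ce = boto3.client("ce", region_name="us-east-1")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2025-09-01", "End": "2025-09-08"},  # placeholder range
    Granularity="DAILY",
    Metrics=["UnblendedCost", "UsageQuantity"],
    Filter={"Dimensions": {"Key": "SERVICE",
                           "Values": ["Amazon Simple Storage Service"]}},
    GroupBy=[{"Type": "DIMENSION", "Key": "USAGE_TYPE"}],
)

for day in response["ResultsByTime"]:
    for group in day["Groups"]:
        usage_type = group["Keys"][0]
        # Internet egress usage types end in "DataTransfer-Out-Bytes",
        # prefixed with a region code (e.g. "EUC1-DataTransfer-Out-Bytes").
        if "DataTransfer-Out-Bytes" in usage_type:
            cost = group["Metrics"]["UnblendedCost"]
            usage = group["Metrics"]["UsageQuantity"]
            print(day["TimePeriod"]["Start"], usage_type,
                  usage["Amount"], usage["Unit"],
                  cost["Amount"], cost["Unit"])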

 

 

 

Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support

Rufyda
Kudo Kingpin

Hi @MelisandeRicou,

Copy the data once to OneLake / Azure.

After that, use incremental loads to transfer only the daily changes instead of the full 200 GB (see the sketch after this list).

Partition your data so Fabric reads only the required portion.

For permanent, heavy workloads, move the dataset entirely to Azure to avoid ongoing egress costs.
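
As a rough illustration of the incremental pattern in a Fabric notebook (PySpark), assuming the shortcut appears under the Lakehouse Files area and the Parquet files are partitioned by a load_date folder. The path, partition column, and table name below are hypothetical placeholders, not your actual layout:

from datetime import date, timedelta

# Hypothetical names: "s3_shortcut/sales" is the shortcut folder in the
# Lakehouse Files area; "sales_bronze" is a Delta table in the same Lakehouse.
# "spark" is the SparkSession that Fabric notebooks provide automatically.
yesterday = (date.today() - timedelta(days=1)).isoformat()
source_path = f"Files/s3_shortcut/sales/load_date={yesterday}"

# Read only yesterday's partition through the shortcut, so only those bytes
# leave S3, then append it to a Delta table stored in OneLake.
daily_df = spark.read.parquet(source_path)

(daily_df.write
    .format("delta")
    .mode("append")
    .saveAsTable("sales_bronze"))

After the initial full copy, the semantic model can point at the OneLake table, so the daily refresh no longer reads the whole 200 GB from S3.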

If this helps, consider giving some Kudos. If I answered your question, please mark this as the solution.


