rahul_rajesh
New Member

Design Question

What are the best practices for handling large file transfers in Azure Data Factory, and how can I optimize performance for big data workflows?

2 ACCEPTED SOLUTIONS
frithjof_v
Community Champion

If you are referring to Fabric Data Factory: try using the Data Pipeline Copy Activity, or Dataflow Gen2 Fast Copy.

(If you are referring to Azure Data Factory: that's another forum.)

You could potentially look into Notebooks as well.

What kind of data source are you going to read from?
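The Copy Activity and Fast Copy handle this for you, but if you go the Notebook route, the core idea for large files is streaming in fixed-size chunks so memory use stays flat regardless of file size. A minimal local Python sketch of that idea (the paths, chunk size, and demo data here are invented for illustration, not Fabric APIs):

```python
import os
import tempfile

CHUNK_SIZE = 1024 * 1024  # 1 MiB per read: stream instead of loading the whole file


def chunked_copy(src: str, dst: str, chunk_size: int = CHUNK_SIZE) -> None:
    """Copy src to dst in fixed-size chunks, keeping memory use flat."""
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            chunk = fin.read(chunk_size)
            if not chunk:
                break
            fout.write(chunk)


# Tiny self-contained demo with a temporary file.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "big.bin")
    dst = os.path.join(d, "copy.bin")
    with open(src, "wb") as f:
        f.write(os.urandom(3 * CHUNK_SIZE + 123))  # not a multiple of CHUNK_SIZE
    chunked_copy(src, dst)
    print(os.path.getsize(dst) == os.path.getsize(src))  # → True
```

The same streaming pattern applies whether the destination is a local disk, a Lakehouse file path, or blob storage.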


Anonymous
Not applicable

Hi @rahul_rajesh ,

If it's Fabric Data Factory, you can try the approaches frithjof_v suggested.

For your second question, optimizing performance for big data workflows in Azure Data Factory, check out this official document:

Mapping data flow performance and tuning guide - Azure Data Factory & Azure Synapse | Microsoft Learn
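For the copy side specifically, Azure Data Factory exposes a few tuning knobs directly on the Copy Activity: dataIntegrationUnits, parallelCopies, and staged copy. A sketch of how they appear in a copy activity's typeProperties (the values shown are illustrative placeholders, not recommendations; source and sink are elided):

```json
{
  "name": "CopyLargeFiles",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "BinarySource" },
    "sink": { "type": "BinarySink" },
    "dataIntegrationUnits": 32,
    "parallelCopies": 8,
    "enableStaging": false
  }
}
```

Higher DIU and parallel-copy values generally raise throughput (and cost), so the tuning guide's advice is to measure with your own data volumes rather than set them blindly.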

 

 

Best Regards

Yilong Zhou

If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.


