What are the best practices for handling large file transfers in Azure Data Factory, and how can I optimize performance for big data workflows?
If you are referring to Fabric Data Factory:
Try using the Data Pipeline Copy activity or Dataflow Gen2 Fast Copy.
(If you are referring to Azure Data Factory, that belongs in a different forum.)
You could also look into Notebooks; a minimal sketch follows below.
What kind of data source will you be reading from?
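As a rough illustration of the Notebook route, here is a minimal PySpark sketch, assuming a CSV source sitting in the Lakehouse Files area. The path, partition count, and table name are placeholders you would adjust for your own workload.

```python
# Minimal sketch of a Fabric notebook approach to a large file transfer.
# Paths, partition count, and table name below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read a large CSV set from the Lakehouse Files area (path is hypothetical).
source_path = "Files/raw/large_transfer/*.csv"
df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "false")  # skip schema inference on very large files
    .csv(source_path)
)

# Repartition before writing so the work is spread across executors,
# then land the data as a Delta table for downstream use.
(
    df.repartition(64)               # tune to your cluster size / data volume
    .write
    .mode("overwrite")
    .format("delta")
    .saveAsTable("staging_large_transfer")  # placeholder table name
)
```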
Hi @rahul_rajesh ,
If it's Fabric Data Factory, you can test what frithjof_v suggested.
For your second question, about optimizing performance for big data workflows in Azure Data Factory, I think you can check out the official document below:
Best Regards
Yilong Zhou
If this post helps, then please consider Accepting it as the solution to help other members find it more quickly.
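For a concrete sense of the knobs that the copy performance guidance covers, here is a hedged sketch of a copy activity definition, written as a Python dict mirroring the pipeline JSON. The activity name, source, and sink are placeholders, and the values shown are illustrative starting points rather than recommendations.

```python
# Sketch of the copy activity performance settings discussed in the ADF
# copy performance and scalability guidance. The surrounding pipeline
# structure is abbreviated; source and sink details are placeholders.
import json

copy_activity = {
    "name": "CopyLargeFiles",                # placeholder activity name
    "type": "Copy",
    "typeProperties": {
        "source": {"type": "BinarySource"},  # placeholder source
        "sink": {"type": "BinarySink"},      # placeholder sink
        # Scale up the compute behind a single copy run.
        "dataIntegrationUnits": 32,
        # Read/write multiple files or partitions in parallel.
        "parallelCopies": 16,
        # Stage through interim storage when source and sink
        # cannot stream efficiently to each other.
        "enableStaging": False,
    },
}

print(json.dumps(copy_activity, indent=2))
```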