
Gaurav-J
Regular Visitor

Fabric Data Factory pipeline Copy activity failed: MySQL tables upload operation unsuccessful.

Hi,
Whenever we try to load tables with a high row count from our on-prem MySQL databases using the Copy Data activity in Fabric Data Factory, we can see the data in the Data Preview option, but the actual load fails with an error. We have checked the MySQL connections and other configurations but have not been able to find a solution.

Could anyone please advise how to resolve this issue?


Failure happened on 'Source' side. ErrorCode=UserErrorWriteFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file operation is failed, upload file failed at path: 'fb0a7fd8-623f-447f-8a0f-0cba6ece7e01/1e8d11a2-e343-4c3f-b75e-d5cbbc880711/Staging/eae1ef39-946d-4531-a18f-25bb2736082d/MSSQLImportCommand/`resource_master`.parquet'.,Source=Microsoft.DataTransfer.Common,''Type=System.Net.Sockets.SocketException,Message=An existing connection was forcibly closed by the remote host,Source=MySqlConnector,' 


8 REPLIES
v-dineshya
Community Support

Hi @Gaurav-J ,

Thank you for reaching out to the Microsoft Community Forum.

 

The error you are facing in Fabric Data Factory when copying large MySQL tables to Fabric Data Warehouse indicates that the MySQL server is terminating the connection.

 

Please try the below steps to fix the issue.

 

1. You mentioned that 9,000-row batches work. Use SQL queries with LIMIT and OFFSET in the source configuration and implement pagination logic in the pipeline to loop through the batches.

 

2. In the MySQL server settings, check and increase the following:

 

net_read_timeout
net_write_timeout
max_allowed_packet (set to at least 64 MB)
wait_timeout and interactive_timeout

 

Note: These settings help prevent premature connection drops during large transfers.
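
For reference, here is a sketch of how those variables can be inspected and raised on the MySQL server. The values are only illustrative, SET GLOBAL requires the SUPER or SYSTEM_VARIABLES_ADMIN privilege, and changes to wait_timeout and interactive_timeout only affect connections opened afterwards.

-- Inspect the current values of the variables listed above
SHOW VARIABLES WHERE Variable_name IN
  ('net_read_timeout', 'net_write_timeout', 'max_allowed_packet',
   'wait_timeout', 'interactive_timeout');

-- Raise them server-wide (illustrative values, tune for your environment)
SET GLOBAL net_read_timeout    = 600;                -- seconds
SET GLOBAL net_write_timeout   = 600;                -- seconds
SET GLOBAL max_allowed_packet  = 64 * 1024 * 1024;   -- 64 MB
SET GLOBAL wait_timeout        = 28800;              -- seconds
SET GLOBAL interactive_timeout = 28800;              -- seconds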

 

3. Instead of selecting the table directly, use a custom SQL query like

 

SELECT * FROM table_name LIMIT 9000 OFFSET 0

 

Note: This gives more control and keeps each request from pulling the whole table at once.
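
One caveat with OFFSET-based paging: MySQL still reads and discards all of the skipped rows, so later batches of a large table get progressively slower. If the table has an indexed, monotonically increasing key, a keyset variant avoids that. This is only a sketch, and the id column here is a hypothetical key name:

-- Keyset pagination sketch: `id` stands in for an indexed key column.
-- Each batch resumes after the highest id returned by the previous batch,
-- so MySQL never has to scan and throw away skipped rows the way OFFSET does.
SELECT *
FROM   table_name
WHERE  id > 0          -- replace 0 with the last id from the previous batch
ORDER  BY id
LIMIT  9000;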

 

4. In the Copy activity settings, enable compression to reduce payload size, and use parallel copy with multiple threads if supported.

 

5. Use a self-hosted integration runtime (the on-premises data gateway in Fabric) if you are accessing on-prem MySQL.

 

6. Check that the staging storage account has sufficient permissions and performance. Try switching to another staging location temporarily to isolate the issue.

 

7. Break the pipeline into multiple smaller Copy activities, and use a ForEach loop with dynamic ranges, as sketched below.
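
As a sketch of how the dynamic ranges could be driven, assuming the 9,000-row batch size that already works for you: run a count query once (for example from a Lookup activity), use its result to set the number of ForEach iterations, and have each iteration issue the LIMIT/OFFSET query from step 3 with OFFSET = iteration index * 9000. The batch_count alias below is just an illustrative name.

-- One-off query to work out how many 9,000-row batches the loop needs
SELECT CEIL(COUNT(*) / 9000) AS batch_count
FROM   table_name;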

 

Please refer to the below link.

Configure MySQL in a copy activity - Microsoft Fabric | Microsoft Learn

 

I hope this information helps. Please do let us know if you have any further queries.

 

Regards,

Dinesh

Hi @Gaurav-J ,

We haven't heard from you since our last response and are just checking back to see if you have a resolution yet. If you have any further queries, do let us know.

 

Regards,

Dinesh


spaceman127
Resolver III

Hi @Gaurav-J ,

 

Does the copying stop immediately, or does it take a while?

And what is your destination? (Lakehouse, Warehouse, etc.)

I would do the following: the Copy activity offers a number of options, such as staging, which you could test out.
You could then reduce the number of rows that are copied with an SQL statement to check whether a smaller load works.

 

Here you can find documentation about the Copy activity.

 

https://learn.microsoft.com/en-us/fabric/data-factory/copy-activity-performance-and-scalability-guid...

 

And here is another documentation page.

 

https://learn.microsoft.com/en-us/fabric/data-factory/decision-guide-data-movement

 

Best regards

@spaceman127 

My target platform is Fabric Data Warehouse. The process usually fails within 20-30 seconds. One of the tables I am migrating contains only about 20,000 records, yet the process fails with an error. The Enable Staging option is already turned on. I have observed that the migration works if I load the data in batches of 9,000 records.

Is there a way to migrate the entire table in a single run without encountering this error? I would prefer not to use Dataflow Gen2 for this solution.

Any guidance or suggestions to resolve this issue would be greatly appreciated.

 

@Gaurav-J ,

 

All right,

20,000 rows isn't really that much.
Dataflow Gen2, as you say, would be one option. Another option would be the copy job. Have you checked that too?

Have you also checked the connection speed? It seems that your network connection is too slow for copying.

 

Best regards

 

@spaceman127 Yes, I have also tried the Copy job, and I have tried the same process on three different networks. I am still not able to copy the tables from the MySQL database.

 

Any guidance or suggestions to resolve this issue would be greatly appreciated.

All right.

 

If I were you, I would do the following: load your data in blocks until the table is completely loaded.
This should be relatively easy to implement.
I know it's not ideal, but at least it would give you a working solution.

This would also let you bypass any network issues.

 

However, an error message is not always accurate.

Try testing a different destination to rule out the Fabric Warehouse itself.


