Hey Team,
Just wanted to check if any of you have encountered this error while copying data from SFTP into a Fabric Lakehouse (Files section):
ErrorCode=UserErrorWriteFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file operation is failed, upload file failed at path: ''.,Source=Microsoft.DataTransfer.Common,''Type=Renci.SshNet.Common.SshOperationTimeoutException,Message=Socket read operation has timed out after 30000 milliseconds.,Source=Renci.SshNet,'
Any help would be appreciated. Thanks!
Hi @rsheltondavid,
Thank you for reaching out to the Microsoft Fabric Community Forum.
This error, “Socket read operation has timed out after 30000 milliseconds”, means the SFTP server did not respond within the expected time while Fabric (or Data Factory) was trying to read or write files. It is typically caused by network latency, server slowness, or a firewall/VPN dropping the connection, not by a Fabric service issue.
Test SFTP connectivity from the Integration Runtime (IR) machine using:
sftp -P 22 username@sftp-host
If it’s slow or times out, it’s a network/server issue.
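If you'd rather script that check, here is a minimal sketch using only the Python standard library. It measures whether the SFTP port even answers at the TCP level; the hostname and port you pass in are placeholders for your own server, and this only tests reachability, not SFTP authentication:

```python
import socket
import time


def check_sftp_reachability(host: str, port: int = 22, timeout: float = 5.0):
    """Attempt a TCP connection to the SFTP endpoint and measure latency.

    Returns the connect time in seconds, or None if the connection timed
    out or was refused (pointing to a network/firewall problem rather
    than a Fabric issue).
    """
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        # Covers timeouts, connection refused, and DNS failures alike.
        return None
```

Run it from the same machine that hosts your Integration Runtime; a latency of several seconds, or a None result, confirms the problem sits between that machine and the SFTP server.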
Check which IR you’re using: If you’re using Azure Integration Runtime, ensure your SFTP server allows connections from the Azure region’s outbound IP addresses. If your SFTP server is on-premises or behind a firewall, try using a Self-hosted Integration Runtime (SHIR) on the same network to avoid public network latency.
Increase the timeout in your Copy activity: Go to your Copy Data activity → SFTP settings → Advanced. Increase the “Operation timeout (minutes)” value to a higher number (e.g., 10 or 15 minutes). Optionally, disable “Upload with temp file (rename)” if your SFTP server doesn’t support file renaming properly.
In Performance settings, lower the degree of copy parallelism to reduce pressure on the SFTP server. Set Retry count (for example, 3) and Retry interval (for example, 30 seconds) in the activity settings; this helps recover from transient timeouts automatically. Ask your SFTP admin to check the server logs around the failure timestamp; they can reveal whether the connection was dropped, queued, or refused.
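To illustrate what those Retry count / Retry interval settings do, here is a small Python sketch of the same retry loop. This is an illustration only, not Fabric's internal implementation; `copy_fn` is a hypothetical stand-in for the Copy activity, assumed to raise `TimeoutError` on a transient failure:

```python
import random
import time


def copy_with_retry(copy_fn, retry_count: int = 3, retry_interval: float = 30.0):
    """Re-run a copy operation on transient timeouts.

    Mirrors the pipeline's Retry count / Retry interval settings:
    retry_count is the number of *additional* attempts after the first,
    retry_interval is the wait (in seconds) between attempts.
    """
    attempt = 0
    while True:
        try:
            return copy_fn()
        except TimeoutError:
            attempt += 1
            if attempt > retry_count:
                raise  # retries exhausted: surface the timeout to the caller
            # Small proportional jitter avoids hammering the server
            # at a perfectly fixed cadence.
            time.sleep(retry_interval + random.uniform(0, retry_interval * 0.1))
```

The point of the sketch: retries only mask *transient* timeouts. If every attempt times out, the error still surfaces, which is why the connectivity check above comes first.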
This timeout means your SFTP server did not respond quickly enough to the Fabric Data Pipeline’s request. Start by confirming connectivity from your Integration Runtime, then increase the operation timeout and retry settings; those steps resolve this issue in most cases.
Refer to these links:
1. https://learn.microsoft.com/en-us/azure/data-factory/connector-troubleshoot-ftp-sftp-http?
2. https://learn.microsoft.com/en-gb/azure/data-factory/connector-sftp?tabs=data-factory
3. https://learn.microsoft.com/en-gb/azure/data-factory/concepts-integration-runtime
4. https://learn.microsoft.com/en-gb/azure/data-factory/copy-activity-performance
Hope this clears it up. Let us know if you have any doubts regarding this. We will be happy to help.
Thank you for using the Microsoft Fabric Community Forum.
Hi @rsheltondavid,
Just checking in to see if the issue has been resolved on your end. If the earlier suggestions helped, that’s great to hear! And if you’re still facing challenges, feel free to share more details; happy to assist further.
Thank you.
Hi @rsheltondavid,
Just wanted to follow up. If the shared guidance worked for you, that’s wonderful; hopefully it also helps others looking for similar answers. If there’s anything else you'd like to explore or clarify, don’t hesitate to reach out.
Thank you.