March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
Trying to copy data from an on-prem SQL Server to a Fabric Warehouse and getting this error:
ErrorCode=DWCopyCommandOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message='DataWarehouse' Copy Command operation failed with error ''Content of directory on path 'ADLS Gen2 Endpoint' cannot be listed.'.,Source=Microsoft.DataTransfer.Connectors.MSSQLImport,''Type=Microsoft.Data.SqlClient.SqlException,Message=Content of directory on path 'ADLS Gen2 Endpoint' cannot be listed.,Source=Framework Microsoft SqlClient Data Provider,'
When we set access on our ADLS account to 'Enabled from all networks', this process works fine, but we currently have it set to 'Enabled from selected virtual networks and IP addresses'. We have added both the source and destination VNets to the enabled list.
This is our Staging Account connection setup:
Hi @Anonymous ,
I think you can also try the steps below:
1. Ensure that the NSG is properly configured to allow traffic from the specified IP address.
2. Double-check the firewall rules on the storage account to ensure that they are not inadvertently blocking access.
3. Verify that the DNS settings for the private links are properly configured and resolve to the appropriate private endpoints.
4. Ensure that the private endpoints are associated with the correct private DNS zones.
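A quick way to sanity-check steps 3 and 4 from inside the VNet is a short Python probe (a sketch; the storage hostname is a placeholder) that confirms the endpoint resolves to a private IP rather than a public one:

```python
import socket
import ipaddress

def resolves_to_private(hostname: str) -> bool:
    """True if every IPv4 address the name resolves to is private/loopback."""
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    addresses = {info[4][0] for info in infos}
    return all(ipaddress.ip_address(a).is_private for a in addresses)

# Placeholder account name -- run this from the network that hosts the
# integration runtime. Behind a working private endpoint the name should
# resolve to a private address (e.g. 10.x.x.x), not a public one.
# print(resolves_to_private("<account>.dfs.core.windows.net"))
```

If this returns False from inside the VNet, the private DNS zone is likely not linked to that VNet.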
Best Regards
Yilong Zhou
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Hi v-yilong-msft, we have added all the IPv4 addresses to the storage account, and access is set to both Storage Account Contributor and Storage Blob Data Contributor. We have also enabled Private Link but did not block public access.
What else can we try?
We have done all of that and it is still not resolved. We ended up opening the storage account to all networks. Note that pipelines don't allow Service Principal authentication: only SAS, anonymous, or account key.
I haven't set this up before, so I'm just guessing.
To my inexperienced eyes, that looks like it is supposed to be the connection to the source, which is your on-prem SQL Server, but the values appear to be for an Azure Data Lake Storage Gen2 account.
It is correct as it is; you only have the option to set it to ADLS or Blob. I'm assuming the issue is related to a firewall or network setting, but I have no idea what it could be.
Hi @Anonymous ,
This error message indicates that an attempt to list directory contents in Azure Data Lake Storage Gen2 failed. Specifically, the error code DWCopyCommandOperationFailed means a problem occurred while the Data Warehouse COPY command was listing the contents of the directory on the specified path.
So I think there are two things you can do to try to find the cause and fix the problem:
1. As you mentioned above, if your ADLS Gen2 account is using private endpoints or firewall rules, make sure your network configuration allows access. You may need to check your firewall settings to ensure that the IP address ranges used by the service in question are allowed.
2. Ensure that the service principal or managed identity used to access ADLS Gen2 has sufficient permissions. Check that these identities have the Storage Blob Data Reader role or higher.
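The two checks above can be sketched with the Azure CLI (all names and IDs below are placeholders to substitute with your own values):

```shell
# Grant the identity used by the pipeline the Storage Blob Data Reader role
# on the ADLS Gen2 account (placeholder IDs -- substitute your own):
az role assignment create \
  --assignee "<principal-or-managed-identity-object-id>" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"

# Inspect the storage firewall to see the default action and current IP rules:
az storage account show \
  --name "<account>" --resource-group "<rg>" \
  --query "networkRuleSet.{default:defaultAction,ipRules:ipRules}"
```

If the first command reports `defaultAction: Deny` and the pipeline's outbound IPs are not in `ipRules`, the listing will fail exactly as shown in the error.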
You can also read the following topic for more information: unable to connect adls gen storage from Purview - Microsoft Q&A
Best Regards
Yilong Zhou
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Hi, so where would I find the IP address range that Fabric pipelines use?
Hi @Anonymous ,
You can find the IP address ranges used by Fabric pipelines on the Azure Integration Runtime IP addresses page. It lists the ranges used for data movement, pipeline, and external activity executions.
If you need to allow-list these IP addresses for security purposes, use the ranges specified there. See this document for further details: Azure Integration Runtime IP addresses - Azure Data Factory | Microsoft Learn
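Once you have downloaded the published ranges, a small stdlib Python check can confirm whether a given outbound IP falls inside them before you add firewall rules. The CIDR blocks below are illustrative placeholders, not the actual Azure Integration Runtime ranges; always take the current values from the official download:

```python
import ipaddress

# Placeholder CIDRs -- replace with the current Azure IR ranges for your region.
ALLOWED_RANGES = [
    ipaddress.ip_network("20.37.64.0/19"),
    ipaddress.ip_network("40.74.0.0/18"),
]

def is_allowed(ip: str) -> bool:
    """True if the IP falls inside any allow-listed range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in ALLOWED_RANGES)

print(is_allowed("20.37.70.5"))  # True  -- inside the first example range
print(is_allowed("8.8.8.8"))     # False -- not in any listed range
```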
Best Regards
Yilong Zhou
If this post helps, please consider accepting it as the solution to help other members find it more quickly.