Hi Team,
I am encountering an issue while attempting to copy multiple files from an Azure Blob container to a Lakehouse using the wildcard file path option. The process fails with the following error message:
"ErrorCode=CopyCommandFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException, Message=Copy operation failed with an internal error, Source=Microsoft.DataTransfer.ImportCommand.AzCopy'."
Any suggestions or guidance from the team would be greatly appreciated.
Thank you,
Divesh Wadhwa
@diveshwadhwa - I am not sure if this translates to the Pipeline side, but when I use wildcard paths in Notebooks (whether connecting to ADLS Gen2, a Fabric Lakehouse, etc.), I usually still have to provide the full wildcard path to get the files.
For example, if my files were stored in the hierarchy Files/CRM/Sales/*.json, my wildcard path was '/Files/*/*/*.json'.
So in Pipelines you may need to make your path a dynamic parameter and set it to: "/Excel/*/*/*.extension"
Jeremy
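The point above, that each folder level in the hierarchy needs its own `*`, can be illustrated with a small local sketch using Python's standard `glob` module (the `Files/CRM/Sales` hierarchy is borrowed from the example above; this only demonstrates the matching semantics, not the pipeline itself):

```python
import glob
import os
import tempfile

# Build a toy hierarchy mirroring Files/CRM/Sales/*.json
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "Files", "CRM", "Sales"))
open(os.path.join(root, "Files", "CRM", "Sales", "leads.json"), "w").close()

# One '*' per folder level: matches the nested file
deep = glob.glob(os.path.join(root, "Files", "*", "*", "*.json"))

# A single-level wildcard does NOT descend into subfolders
shallow = glob.glob(os.path.join(root, "Files", "*.json"))

print(len(deep), len(shallow))  # → 1 0
```

The same logic applies to the pipeline wildcard: a pattern like `/Files/*.json` only matches files directly under `Files`, so files two folders deep need `/Files/*/*/*.json`.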
When I send individual files, the process completes successfully with no errors, and I can see the file in the Lakehouse as expected.
However, when I attempt to send an entire folder using a wildcard in the file path, I receive an error. I have attached a picture for your reference that shows the settings for the Azure Data Factory pipeline.
Could you please advise on the correct method to send multiple files to the Lakehouse, particularly when using wildcards and file paths?
Thank you in advance for your assistance.
Best regards,
Divesh Wadhwa
Hi @diveshwadhwa ,
Incorrect file paths, malformed wildcard patterns, or insufficient permissions can all lead to this error, so see if the following documentation helps:
Copy from Azure Blob Storage to Lakehouse - Microsoft Fabric | Microsoft Learn
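For reference, wildcard settings in a Copy activity live in the source's store settings rather than in the plain file path. The following is only an illustrative sketch of what that source block can look like for Blob Storage (the `Excel` folder and `*.xlsx` pattern are hypothetical placeholders; adjust the source `type` to match your file format):

```json
{
  "source": {
    "type": "DelimitedTextSource",
    "storeSettings": {
      "type": "AzureBlobStorageReadSettings",
      "recursive": true,
      "wildcardFolderPath": "Excel/*/*",
      "wildcardFileName": "*.xlsx"
    }
  }
}
```

When `wildcardFolderPath` / `wildcardFileName` are set, they replace the static folder and file name, which is why mixing a fixed file path with a wildcard in the same field tends to fail.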
Best Regards,
Adamk Kong
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.