diveshwadhwa
New Member

Copying multiple Files from Azure Blob to Lakehouse

Hi Team,

I am encountering an issue while attempting to copy multiple files from an Azure Blob container to a Lakehouse using the wildcard file path and file path options. The process fails with the following error message:

"ErrorCode=CopyCommandFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException, Message=Copy operation failed with an internal error, Source=Microsoft.DataTransfer.ImportCommand.AzCopy'."

Any suggestions or guidance from the team would be greatly appreciated.

Thank you,
Divesh Wadhwa

1 ACCEPTED SOLUTION

@diveshwadhwa - I am not sure if this translates to the Pipeline side, but when I use wildcard paths in Notebooks (whether connecting to ADLS Gen2, Fabric Lakehouse, etc.), I usually still have to provide the full wildcard path to get the files:

 

If my files were stored in the following hierarchy: Files/CRM/Sales/*.json, my wildcard path was '/Files/*/*/*.json'.
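As a minimal sketch, that pattern in a Fabric Notebook looks something like the PySpark snippet below (the folder names mirror the example hierarchy above and are placeholders, not paths from this thread):

    # 'spark' is the session Fabric Notebooks provide by default; the relative
    # "Files/..." path resolves against the notebook's attached default Lakehouse.
    # Read every JSON file two folder levels under Files/,
    # e.g. Files/CRM/Sales/orders.json.
    df = spark.read.json("Files/*/*/*.json")
    df.show()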

 

So in Pipelines you may need to make your path a dynamic parameter and set it to: "/Excel/*/*/*.extension"
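For comparison, pipeline Copy activities usually split wildcard matching into a wildcard folder path and a wildcard file name rather than one combined path. A hypothetical sketch of what the Copy activity source settings could look like in the pipeline JSON, assuming delimited text files in Blob storage (every type name and pattern below is a placeholder, not taken from the failing pipeline):

    {
      "source": {
        "type": "DelimitedTextSource",
        "storeSettings": {
          "type": "AzureBlobStorageReadSettings",
          "recursive": true,
          "wildcardFolderPath": "Excel/*/*",
          "wildcardFileName": "*.csv"
        }
      }
    }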

 

Jeremy


3 REPLIES
diveshwadhwa
New Member

When I send individual files, the process completes successfully with no errors, and I can see the files in the Lakehouse as expected.

However, when I attempt to send an entire folder using both a wildcard and a file path as the path, I receive an error. I have attached a picture for your reference that shows the settings for the Azure Data Factory pipeline.

Could you please advise on the correct method to send multiple files to the Lakehouse, particularly when using wildcards and file paths?

Thank you in advance for your assistance.

Best regards,

Divesh Wadhwa

(Attachment: Pipeline_error.jpg)


v-kongfanf-msft
Community Support

Hi @diveshwadhwa ,

 

Incorrect file paths, wildcard formatting, and insufficient permissions can all lead to this error, so see whether the following documentation helps:

Copy from Azure Blob Storage to Lakehouse - Microsoft Fabric | Microsoft Learn
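To rule out the path and permission causes quickly, it can help to list the blobs directly before re-running the pipeline. A minimal sketch using the azure-storage-blob Python package (the connection string, container name, and prefix are all placeholders):

    # List blobs under a prefix to confirm the path exists and the credentials work.
    from azure.storage.blob import ContainerClient

    client = ContainerClient.from_connection_string(
        conn_str="<your-connection-string>",  # placeholder
        container_name="<your-container>",    # placeholder
    )
    for blob in client.list_blobs(name_starts_with="CRM/Sales/"):
        print(blob.name)

If nothing is printed, the folder path or prefix is likely wrong; an authorization error points to permissions instead.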

 

Best Regards,
Adamk Kong

 

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
