diveshwadhwa
New Member

Copying multiple Files from Azure Blob to Lakehouse

Hi Team,

I am encountering an issue while attempting to copy multiple files from an Azure Blob container to a Lakehouse using the wildcard file path and filepath option. The process is resulting in the following error message:

"ErrorCode=CopyCommandFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException, Message=Copy operation failed with an internal error, Source=Microsoft.DataTransfer.ImportCommand.AzCopy'."

Any suggestions or guidance from the team would be greatly appreciated.

Thank you,
Divesh Wadhwa

1 ACCEPTED SOLUTION

@diveshwadhwa - I am not sure if this translates to the Pipeline side, but when I use wildcard paths in Notebooks (whether connecting to ADLS Gen2, a Fabric Lakehouse, etc.), I usually still have to provide the full wildcard path to get the files:

 

If my files were stored in the following hierarchy: Files/CRM/Sales/*.json, my wildcard path was '/Files/*/*/*.json'

 

So in Pipelines you may need to make your path a dynamic parameter and set it to: "/Excel/*/*/*.extension"
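The "one wildcard per folder level" idea above can be sketched in plain Python with `fnmatch` (the file names and hierarchy here are made up for illustration, and note that `fnmatch` semantics can differ slightly from Spark/pipeline glob matching, e.g. in whether `*` crosses folder boundaries):

```python
from fnmatch import fnmatch

# Hypothetical blob paths mirroring the Files/CRM/Sales layout from this thread
paths = [
    "Files/CRM/Sales/jan.json",
    "Files/CRM/Sales/feb.json",
    "Files/CRM/Leads/raw.json",
    "Files/CRM/Sales/notes.txt",
]

# Full wildcard pattern: one '*' per folder level, plus the file-name pattern
pattern = "Files/*/*/*.json"

# Only the .json files at that depth match; notes.txt is filtered out
matched = [p for p in paths if fnmatch(p, pattern)]
print(matched)
```

The point is that the pattern spells out every level of the hierarchy down to the files, rather than relying on a bare `*.json` at the container root.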

 

Jeremy


3 REPLIES
diveshwadhwa
New Member

When I send individual files, the process completes successfully with no errors, and I can see the file in the Lakehouse as expected.

However, when I attempt to send an entire folder using both a wildcard and a file path as the path, I receive an error. I have attached a picture for your reference that shows the settings for the Azure Data Factory pipeline.

Could you please advise on the correct method to send multiple files to the Lakehouse, particularly when using wildcards and file paths?

Thank you in advance for your assistance.

Best regards,

Divesh Wadhwa

Pipeline_error.jpg


Anonymous
Not applicable

Hi @diveshwadhwa ,

 

Incorrect file paths, malformed wildcard patterns, and insufficient permissions can all lead to this error, so see whether the following documentation helps:

Copy from Azure Blob Storage to Lakehouse - Microsoft Fabric | Microsoft Learn

 

Best Regards,
Adamk Kong

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
