
JiHin
Frequent Visitor

Bulk copy files/ excel files to lakehouse with Dataflow

Hi all,


Is there a way to bulk copy files (namely Excel files with different schemas) into a lakehouse using Dataflows? For example, I have an Ingest folder, and Excel files with different schemas are periodically added to it. Can we schedule a bulk copy of all the files to the lakehouse and store them as lakehouse tables while capturing each file's new schema? Or is a Pipeline needed for this operation?

 

Another approach is to ingest the raw Excel files, then use a Notebook to loop through them, infer the schema of each one, and load them into the lakehouse. I'm not sure whether this is feasible either.
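The Notebook loop described above can be sketched in Python. This is a minimal outline, not a tested Fabric implementation: the folder path, the table-naming rule, and the commented read/load calls (pandas `read_excel` plus a Spark `saveAsTable`) are assumptions about how you would wire it up in your own notebook.

```python
from pathlib import Path
import re

def table_name_for(file_path: Path) -> str:
    # Derive a valid lakehouse table name from the file name:
    # lower-case the stem and replace non-alphanumerics with underscores.
    stem = file_path.stem.lower()
    return re.sub(r"[^0-9a-z]+", "_", stem).strip("_")

def ingest_folder(folder: str):
    # Loop over every Excel file in the Ingest folder. Each file may
    # have a different schema, so each one becomes its own table.
    for f in sorted(Path(folder).glob("*.xlsx")):
        name = table_name_for(f)
        # In a Fabric notebook the read/load step would go here, e.g.:
        #   df = pandas.read_excel(f)  # schema inferred per file
        #   spark.createDataFrame(df).write.mode("overwrite").saveAsTable(name)
        yield f, name
```

Because each file is read independently, schema differences between files are handled naturally — one table per file, with columns inferred at read time. A scheduled run of the notebook would re-scan the folder and pick up newly added files.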

 

Thanks

1 ACCEPTED SOLUTION
miguel
Community Admin

As a general rule, if you want to move files from location A to location B, a copy job or a pipeline is the best way to do so.

Dataflow Gen2 is only able to load data to SharePoint as a CSV, meaning it can't truly copy an xlsx file from one place to another, so we wouldn't recommend using Dataflow Gen2 for copying data.


I don't believe a SharePoint files connector exists today in pipelines / copy job, and perhaps shortcuts could be an option in the future. I'd definitely encourage you to post the idea at https://aka.ms/FabricIdeas

 

Alternatively, if copying the files as-is is not a hard requirement, you could simply use Dataflow Gen2 to connect to the files, transform them, and then load them into a table in a Lakehouse.

 

Hope this helps!


