Hi all,
Is there a way to bulk copy files (namely Excel files with different schemas) into a lakehouse using Dataflows? For example, I have an Ingest folder to which Excel files with different schemas are periodically added. Can we schedule a bulk copy of all the files into the lakehouse and store them as lakehouse tables, while also capturing each file's new schema? Perhaps a Pipeline is needed for this operation?
Another approach would be to ingest the raw Excel files, then use a Notebook to loop through them, infer the schema of each one, and load them into the lakehouse, as in the sketch below. I'm not sure if this is feasible either.
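Here's a rough sketch of what I had in mind for the notebook approach, assuming the raw files have already landed in the default Lakehouse's Files area (the Ingest path and the table-naming rule are just placeholders):

```python
import os
import re

import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Ingest folder in the default Lakehouse's Files area (mounted at
# /lakehouse/default in a Fabric notebook).
ingest_dir = "/lakehouse/default/Files/Ingest"

for file_name in os.listdir(ingest_dir):
    if not file_name.lower().endswith(".xlsx"):
        continue

    # pandas infers each file's schema from its contents, so files with
    # different column sets can all be handled by the same loop.
    pdf = pd.read_excel(os.path.join(ingest_dir, file_name))

    # Sanitize column names so they are valid Delta column names.
    pdf.columns = [re.sub(r"[^0-9a-zA-Z]+", "_", str(c)).strip("_") for c in pdf.columns]

    # Derive a table name from the file name, e.g. "Sales Q1.xlsx" -> "sales_q1".
    table_name = re.sub(r"[^0-9a-zA-Z]+", "_", os.path.splitext(file_name)[0]).strip("_").lower()

    # Write to the Lakehouse as a Delta table; overwriteSchema lets a reload
    # pick up schema changes in the source file.
    spark.createDataFrame(pdf) \
        .write.format("delta") \
        .mode("overwrite") \
        .option("overwriteSchema", "true") \
        .saveAsTable(table_name)
```

The notebook could then be scheduled, or called from a pipeline after a copy step, to cover the periodic part.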
Thanks
As a general rule, if you want to move files from location A to location B, a copy job or a pipeline is the best way to do so.
Dataflow Gen2 can only load data to SharePoint as CSV, meaning it can't truly copy an xlsx file from one place to another, and we wouldn't recommend using Dataflow Gen2 for copying data.
I don't believe that a SharePoint files connector exists today in pipelines / copy job; perhaps shortcuts could be an option in the future. I'd definitely encourage you to post the idea at https://aka.ms/FabricIdeas
Alternatively, if copying the data is not a hard requirement, you could simply leverage Dataflow Gen2 to connect to the files, transform them, and then load them into a table in a Lakehouse.
Hope this helps!