Hello community,
I am new to Fabric, and what I thought was a simple task turns out to be quite complicated.
I need to process files in a SharePoint Online folder, transform each file's data with Power Query in a Dataflow (I believe), store the transformed data in a database, and, after the successful import, move the file to an Archive folder in the same SharePoint library.
I followed this guide to set up the connection to SharePoint and retrieve the files via a pipeline:
https://www.syntera.ch/blog/2022/10/10/copy-files-from-sharepoint-to-blob-storage-using-azure-data-f...
The part I cannot follow anymore is everything after "6. Iterate through list of files and copy each file to ADLS".
Is there no easier way to simply access the file and then transform it?
As a test I already created a Dataflow Gen2, which works, but I need to handle each file separately and also move the processed files to an Archive folder.
What options do I have?
Thank you very much.
Hello @tonyclifton
Data pipeline would be a simple solution as @lbendlin suggested.
1. Create a new data pipeline in Microsoft Fabric.
2. Add a Lookup activity to get the list of files from your SharePoint folder (see the sketch after this list).
3. Follow this with a ForEach activity to iterate through each file.
4. Inside the ForEach loop, add:
   - A Dataflow activity to transform the data
   - A Copy Data activity to load the data into your database
   - A procedure or activity to move the file to an archive folder
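To make step 2 concrete, here is a hedged sketch of the same "list files" call made directly against the SharePoint REST API from Python (the site URL, folder path and token handling are assumptions; in the pipeline a Web or Lookup activity would issue this request instead):

```python
import requests

# Minimal sketch, assuming an Azure AD access token for SharePoint is already
# available and that the site/folder values below are placeholders for your library.
SITE_URL = "https://contoso.sharepoint.com/sites/DataTest"
FOLDER = "/sites/DataTest/Shared Documents/Incoming"
TOKEN = "<access token>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Accept": "application/json;odata=verbose",
}

# List the files in the folder; each entry carries Name and ServerRelativeUrl,
# which is what the ForEach loop would iterate over.
resp = requests.get(
    f"{SITE_URL}/_api/web/GetFolderByServerRelativeUrl('{FOLDER}')/Files",
    headers=headers,
)
resp.raise_for_status()
for f in resp.json()["d"]["results"]:
    print(f["Name"], f["ServerRelativeUrl"])
```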
Hello @nilendraFabric
Thanks for your process structure. The part I struggle with starts at number 4 in your listing.
So I have the ForEach loop and get the binary file content in a Web activity like so:
/_api/web/getfilebyserverrelativeurl('/sites/DataTest/@{item().Name}')/$value
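For comparison, here is the same request made directly from Python (the site URL, file name and token handling are placeholders; in the pipeline the Web activity issues this call with @{item().Name} substituted in):

```python
import requests

SITE_URL = "https://contoso.sharepoint.com/sites/DataTest"  # placeholder
TOKEN = "<access token>"                                    # assumption: token already obtained
file_name = "example.xlsx"                                  # stands in for @{item().Name}

# $value returns the raw binary content of the file
url = (
    f"{SITE_URL}/_api/web/getfilebyserverrelativeurl"
    f"('/sites/DataTest/{file_name}')/$value"
)

resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

with open(file_name, "wb") as out:
    out.write(resp.content)
```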
What I don't understand is how to then pass this binary data to the Dataflow:
What do I put as "Dataflow ID"? My guess was that it needs to be the output of the previous step:
But with that value filled in as the Dataflow ID, I can't even open the dataflow. Or do I need to create this dataflow differently and pass the output to it later on?
Thank you.
@nilendraFabric I am not sure whether the Lookup activity supports SharePoint folders. You might need to use a Web activity to get the list of all files within that folder.
Also, as of today, based on my understanding, Dataflow inputs cannot be parameterized and need to be hardcoded. So if you use a Dataflow, you would have to set up multiple Dataflows for different files and transformations.
So the flow can be as below, within a data pipeline:
a) Web activity to get the list of files
b) Copy activity to land the files in a lakehouse/database as-is
c) Do the transformations via SQL
d) Move the file to another location within the lakehouse (see the notebook sketch after this list)
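If step d) runs in a notebook activity, a minimal sketch could look like the following (the lakehouse paths are placeholders, and the Archive folder is assumed to already exist):

```python
# Fabric notebook cell: move a processed file into an Archive folder
# within the same lakehouse (paths are placeholders).
from notebookutils import mssparkutils  # available by default in Fabric notebooks

src = "Files/Incoming/example.xlsx"
dst = "Files/Archive/example.xlsx"

# Moves the file; the Archive folder is assumed to already exist.
mssparkutils.fs.mv(src, dst)
```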
"and after the successful import to the DB the file should be moved to an Archive folder of the same SharePoint library"
That is not something you can do in a dataflow. Look at pipelines.
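A pipeline can cover that archive step with another Web activity against the SharePoint REST API. A hedged sketch of the equivalent call from Python (site URL, folder paths and token handling are placeholders):

```python
import requests

SITE_URL = "https://contoso.sharepoint.com/sites/DataTest"  # placeholder
TOKEN = "<access token>"                                    # assumption: token already obtained
file_name = "example.xlsx"                                  # stands in for @{item().Name}

# MoveTo relocates the file within the site; flags=1 overwrites a file
# of the same name if it already exists in the Archive folder.
url = (
    f"{SITE_URL}/_api/web/getfilebyserverrelativeurl"
    f"('/sites/DataTest/Shared Documents/{file_name}')/moveto"
    f"(newurl='/sites/DataTest/Shared Documents/Archive/{file_name}',flags=1)"
)

resp = requests.post(
    url,
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/json;odata=verbose",
    },
)
resp.raise_for_status()
```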