I have the pipeline below, which works great. It handles a single process and uses fixed paths etc. for that process.
However, I need to use this pipeline for five processes, where some settings differ on each run, such as the source path, the file type, and the destination tables.
So my solution:
Add a lookup activity which looks up a json file stored in my lakehouse. This json file has all the settings for each of the processes:
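As an illustration (the property names here are hypothetical, not taken from the original post), such a settings file could look like this, with one object per process that the Lookup activity returns as an array:

```json
[
  {
    "processName": "sales",
    "sourcePath": "Files/incoming/sales",
    "fileType": "csv",
    "destinationTable": "dbo.Sales"
  },
  {
    "processName": "inventory",
    "sourcePath": "Files/incoming/inventory",
    "fileType": "parquet",
    "destinationTable": "dbo.Inventory"
  }
]
```

Each object then becomes one `@item()` in the ForEach iteration, so a setting is read as, for example, `@item().sourcePath`.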
Then I would place a copy of the whole pipeline mentioned earlier inside a ForEach activity, so the correct settings are applied on each iteration. This would look like the below:
However, the problem is that in Data Factory it is not possible to nest a ForEach activity inside another ForEach activity, and I really do need those inner ForEach activities to copy multiple files etc.
So my question is: how to do this in an elegant, dynamic way?
A workaround would be to copy the whole pipeline and paste it five times, one after another. But then any change would have to be made five times.
You can trigger an Execute Pipeline activity from within a ForEach activity.
So you can keep your main framework (with its own inner ForEach activities) in one pipeline, and call that pipeline from a ForEach activity in a separate orchestrator pipeline.
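A minimal sketch of the orchestrator's ForEach activity, assuming the lookup activity is named `LookupSettings`, the reusable pipeline is named `ProcessPipeline`, and it exposes `sourcePath`, `fileType`, and `destinationTable` as pipeline parameters (all of these names are assumptions for illustration):

```json
{
  "name": "ForEachProcess",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@activity('LookupSettings').output.value",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "RunProcessPipeline",
        "type": "ExecutePipeline",
        "typeProperties": {
          "pipeline": {
            "referenceName": "ProcessPipeline",
            "type": "PipelineReference"
          },
          "parameters": {
            "sourcePath": "@item().sourcePath",
            "fileType": "@item().fileType",
            "destinationTable": "@item().destinationTable"
          },
          "waitOnCompletion": true
        }
      }
    ]
  }
}
```

Because the nesting limit applies per pipeline, the inner ForEach activities inside `ProcessPipeline` are unaffected, and a change to the framework only has to be made once.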