Hi
Currently my Dev and Prod pipelines point at the same source. My ingestion pipeline and EDW pipeline run daily in both environments, so the same source is read twice: once by Prod and then again by Dev.
To avoid this double read of the source (once by Dev and once by Prod), is there a way to dynamically point the Copy activity at a source based on a flag?
Below is a diagram of the existing setup; the second part shows the idea of breaking the ingestion pipeline apart and dynamically deciding where to pick up data from.
If not, is there any other way to achieve this? Is there a better way to do this in Fabric?
Activities used in the pipeline: a Lookup activity to fetch table names from a reference table, then a ForEach loop with a Copy activity inside that populates the staging tables.
Hi @AJAJ,
I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We're always here to support you.
Best Regards,
Community Support Team
@cengizhanarslan tagging you in case you have come across the question above.
I'm wondering if there is any way to have one notebook copy data from different sources using a foreach loop and push the data into different destinations or the same destination (refer to the Excel tab above, which lists source and destination). Some sources are ODBC, some are SQL Server, and some are SFTP folders. I just need to loop and copy based on the query. In future a Col4 might be added to s1_tableA.
Importantly, things should run in parallel across sources, and sometimes 10 extractions should run in parallel within the same source. A rough sketch of what I'm imagining is below.
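Plain Python sketch for a Fabric notebook (the connection strings, hosts, queries, and table names are made-up placeholders, and it assumes pyodbc and paramiko are available in the environment):

from concurrent.futures import ThreadPoolExecutor, as_completed
import pandas as pd
import pyodbc
import paramiko

# Control list standing in for the source/destination Excel tab: each entry says
# where to read from, what to read, and which staging table to load.
copy_jobs = [
    {"type": "sqlserver",
     "conn": "Driver={ODBC Driver 18 for SQL Server};Server=placeholder;Database=db1;UID=user;PWD=***",
     "query": "SELECT * FROM dbo.s1_tableA", "dest": "staging.s1_tableA"},
    {"type": "odbc", "conn": "DSN=placeholder_dsn;UID=user;PWD=***",
     "query": "SELECT * FROM s2_tableB", "dest": "staging.s2_tableB"},
    {"type": "sftp", "host": "sftp.example.com", "user": "user", "password": "***",
     "remote_path": "/outbound/s3_fileC.csv", "dest": "staging.s3_fileC"},
]

def copy_one(job):
    # Read the source into a pandas DataFrame.
    if job["type"] in ("sqlserver", "odbc"):
        conn = pyodbc.connect(job["conn"])
        try:
            pdf = pd.read_sql(job["query"], conn)
        finally:
            conn.close()
    elif job["type"] == "sftp":
        transport = paramiko.Transport((job["host"], 22))
        transport.connect(username=job["user"], password=job["password"])
        sftp = paramiko.SFTPClient.from_transport(transport)
        with sftp.open(job["remote_path"]) as f:
            pdf = pd.read_csv(f)
        transport.close()
    else:
        raise ValueError(f"unknown source type: {job['type']}")

    # Write whatever columns came back, so a new Col4 on s1_tableA is picked up;
    # mergeSchema lets the Delta staging table evolve with it.
    # 'spark' is the built-in SparkSession a Fabric notebook provides.
    (spark.createDataFrame(pdf)
          .write.mode("append")
          .option("mergeSchema", "true")
          .saveAsTable(job["dest"]))
    return job["dest"]

# Run extractions in parallel; max_workers caps how many run at once
# (could also be split per source system if one can only take so many connections).
with ThreadPoolExecutor(max_workers=10) as pool:
    futures = [pool.submit(copy_one, job) for job in copy_jobs]
    for fut in as_completed(futures):
        print("loaded", fut.result())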
Hi @AJAJ,
Thanks for reaching out to the Microsoft Fabric Community Forum.
At the moment, Fabric pipelines don’t support dynamically switching the source system of a Copy activity at runtime based on a flag. While parameters exist, using them to change sources like this isn’t a supported or reliable pattern.
The real issue is that both Dev and Prod are reading from the source systems. In Fabric, the recommended approach is to read from the source only once, usually in Prod, land the data into Raw/Staging, and then let both Dev and Prod pipelines read from that staged data.
If Dev needs to test transformations, it should use the data already available in Raw/Staging (or a subset of it), instead of reading from the source again. This avoids double reads, reduces load on source systems, and keeps the setup simpler and more stable.
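As a rough illustration of that pattern in a notebook (the workspace, lakehouse, and table names below are placeholders, and it assumes the Prod raw layer is reachable from Dev, for example through a OneLake shortcut):

# 'spark' is the built-in SparkSession in a Fabric notebook.
env = "dev"   # e.g. passed in as a notebook or pipeline parameter

# Both environments read the same raw copy that Prod landed once;
# only the write target differs, so the source system is never hit twice.
RAW_TABLE = "abfss://Prod_WS@onelake.dfs.fabric.microsoft.com/Raw_LH.Lakehouse/Tables/s1_tableA"
STAGING_TABLE = {"dev": "dev_staging.s1_tableA", "prod": "staging.s1_tableA"}[env]

df = spark.read.format("delta").load(RAW_TABLE)   # no call back to the source system
df.write.mode("overwrite").saveAsTable(STAGING_TABLE)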
Best Regards,
Community Support Team