The current inability to parameterize connections in pipeline activities means the same identity must be used to run pipeline activities across the dev/test/prod environments.
That identity therefore needs write access to all three environments.
This creates the risk that code executed in dev writes data to prod, because the identity has write access everywhere.
Please make it possible to parameterize the connection of all pipeline activity types, so we can use a separate identity per environment and make it structurally impossible for a dev pipeline activity to write data to the prod environment.
Here's an overview based on my trial and error:
Activities that do have a "Use dynamic content" option for the connection:
- Copy activity
- Stored procedure
- Lookup
- Get metadata
- Script
- Delete data
- KQL
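
For the activities that do support it, "Use dynamic content" lets the connection be driven by a pipeline parameter instead of being hard-coded. A rough sketch of what that looks like in the pipeline JSON (the parameter name `ConnectionId` and the exact surrounding property shape are illustrative assumptions, not the official schema):

```json
{
  "name": "Copy data",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "datasetSettings": {
        "externalReferences": {
          "connection": {
            "value": "@pipeline().parameters.ConnectionId",
            "type": "Expression"
          }
        }
      }
    }
  }
}
```

If each environment's deployment supplies its own `ConnectionId` value, a pipeline run in dev can only ever reach the dev connection, which is exactly the isolation requested above.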
Activities that do not have a "Use dynamic content" option for the connection:
- Semantic model refresh activity
- Copy job
- Invoke pipeline
- Web
- Azure Databricks
- WebHook
- Functions
- Azure HDInsight
- Azure Batch
- Azure Machine Learning