
Pipeline: parameterize connection in all activity types

 

The current inability to parameterize connections in all pipeline activity types means we need to use the same identity to run pipeline activities across the dev/test/prod environments.

 

This means the same identity needs write access to all environments (dev/test/prod).

 

This creates a risk that code executed in dev writes data to prod, because the identity has write access to all environments.

 

Please make it possible to parameterize the connection in all pipeline activity types, so we can isolate the identities for dev/test/prod and make it physically impossible for a dev pipeline activity to write data to the prod environment.

 

Here's an overview based on my trial and error:

 

Activities that do have the "Use dynamic content" option for the connection (see the sketch after this list):

 

- Copy activity 

- Stored procedure 

- Lookup

- Get metadata

- Script

- Delete data 

- KQL

 
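For the activities in the list above, the connection can be bound to a pipeline parameter through the dynamic content editor, so only the parameter value has to change per environment. Below is a minimal sketch of a deployment-time step that swaps that parameter's default per environment by patching an exported pipeline definition. The parameter name ConnectionId, the JSON layout, and the placeholder connection IDs are assumptions for illustration only, not a documented Fabric schema.

```python
# Hypothetical deployment-time helper: patch the default value of a
# "ConnectionId" pipeline parameter in an exported pipeline definition
# so that dev/test/prod each resolve to their own connection.
# The JSON structure and parameter name are assumptions for illustration.
import json
from pathlib import Path

# One connection per environment (placeholder IDs, not real GUIDs).
CONNECTION_IDS = {
    "dev":  "<dev-connection-guid>",
    "test": "<test-connection-guid>",
    "prod": "<prod-connection-guid>",
}

def patch_connection_parameter(definition_path: Path, environment: str) -> None:
    """Overwrite the ConnectionId parameter default for the target environment."""
    definition = json.loads(definition_path.read_text())
    parameters = definition.setdefault("properties", {}).setdefault("parameters", {})
    parameters["ConnectionId"] = {
        "type": "String",
        "defaultValue": CONNECTION_IDS[environment],
    }
    definition_path.write_text(json.dumps(definition, indent=2))

if __name__ == "__main__":
    # Example: point the exported pipeline at the dev connection.
    patch_connection_parameter(Path("pipeline-content.json"), "dev")
```

Keeping the environment-specific connection reference out of the activity itself is what would let each environment's identity be denied access to the others, which is exactly what the activities in the next list currently prevent.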

Activities that do not have the "Use dynamic content" option for the connection:

 

- Semantic model refresh activity 

- Copy job

- Invoke pipeline

- Web

- Azure Databricks 

- WebHook

- Functions

- Azure HDInsight 

- Azure Batch 

- Azure Machine Learning

Status: New