nick9one1
Helper III

Dynamically getting a Key Vault secret in a production pipeline

Hi,

We currently have around 250 dashboards whose semantic model connections are configured with one of two developer accounts.

I want to change this to a service principal so that when staff leave and their accounts get disabled, we don't have to re-authenticate all the reports with a new account.

We use Databricks and have dev, test, and prod environments.
The Databricks connection settings are stored in a key vault, and the service principal can access the vault.

I've been testing with Fabric pipelines and have created one that returns a chosen secret from one of three vaults (dev, test, prod).

e.g. if I pass 'DatabricksDevopsClientSecret' and 'Dev', it returns the corresponding secret from the dev key vault.

I then want to use this value dynamically in my development pipelines, but this is where I'm getting stuck.

Can anyone point me in the right direction?


thank you

1 ACCEPTED SOLUTION
Vinodh247
Super User

The missing piece, I believe, is that Fabric pipelines cannot directly “flow” a returned secret into downstream activity credentials the way you expect. The supported pattern is: resolve the secret at runtime from Azure Key Vault, store it in a pipeline variable or parameter, then pass that value into activities that support expression-based auth fields (for example Notebook, Web, or Databricks activities). For Databricks connections, instead of embedding secrets in the semantic model or connection UI, parameterise the workspace URL, client ID, and secret, and bind them through pipeline parameters that read from Key Vault using the managed identity or service principal.
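For the Notebook route, here is a minimal sketch of the runtime lookup. The vault names (kv-databricks-dev etc.) and the environment mapping are assumptions for illustration, not from the original post; the fetch itself uses the azure-identity and azure-keyvault-secrets packages and runs under whatever identity the notebook executes as:

```python
# Sketch: resolve a Databricks secret for a given environment at runtime.
# Vault names below are placeholders -- substitute your own vault URLs.

VAULTS = {
    "dev":  "https://kv-databricks-dev.vault.azure.net",
    "test": "https://kv-databricks-test.vault.azure.net",
    "prod": "https://kv-databricks-prod.vault.azure.net",
}


def vault_url(environment: str) -> str:
    """Map an environment name ('Dev', 'Test', 'Prod') to its Key Vault URL."""
    try:
        return VAULTS[environment.lower()]
    except KeyError:
        raise ValueError(f"Unknown environment: {environment!r}")


def get_databricks_secret(secret_name: str, environment: str) -> str:
    """Fetch a secret from the environment's vault.

    Requires the azure-identity and azure-keyvault-secrets packages and a
    credential (managed identity or service principal) with 'get' permission
    on secrets.
    """
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    client = SecretClient(vault_url=vault_url(environment),
                          credential=DefaultAzureCredential())
    return client.get_secret(secret_name).value
```

A notebook activity would then receive `secret_name` and `environment` as pipeline parameters, e.g. `get_databricks_secret("DatabricksDevopsClientSecret", "Dev")`, so no secret is ever stored in the pipeline definition itself.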

 

In practice this means: use one central pipeline to fetch the secret -> call child pipelines with that value as a parameter -> inject it into activity auth fields via expressions. That is the only scalable way to rotate credentials across environments without re-publishing 250 models.
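As a config sketch of that hand-off (activity and parameter names here are placeholders, not from the original post): a Web activity first calls the vault's REST endpoint, e.g. `GET https://<vault-name>.vault.azure.net/secrets/<secret-name>?api-version=7.4` authenticated with the pipeline's identity, and an Invoke Pipeline activity then forwards the returned value via an expression:

```json
{
  "name": "Invoke Child Pipeline",
  "type": "InvokePipeline",
  "typeProperties": {
    "parameters": {
      "databricksClientSecret": "@activity('Get Secret').output.value"
    }
  }
}
```

Where the activities support it, enable secure input/output so the secret is masked in pipeline run logs rather than echoed in plain text.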

 

Please 'Kudos' and 'Accept as Solution' if this answered your query.

Regards,
Vinodh
Microsoft MVP [Fabric]
LI: https://www.linkedin.com/in/vinodh-kumar-173582132
Blog: vinsdata.in


4 REPLIES
v-pnaroju-msft
Community Support

Hi nick9one1,

We are following up to see if what we shared solved your issue. If you need more support, please reach out to the Microsoft Fabric community.

Thank you.

v-pnaroju-msft
Community Support

Thank you, @Vinodh247, for your response.

Hi nick9one1,

We appreciate your inquiry through the Microsoft Fabric Community Forum.

We would like to check whether you have had a chance to review the solution provided by @Vinodh247. We hope the information provided helps resolve the query. Should you have any further questions, please feel free to contact the Microsoft Fabric community.

Thank you.

