Hi,
We currently have around 250 dashboards configured with one of two developer accounts in the semantic model connection.
I want to change this to a service principal so that when staff leave and their accounts get disabled, we don't have to re-authenticate all the reports with a new account.
We use Databricks and have dev, test, and prod environments.
The Databricks connection settings are stored in a key vault, and the service principal can access the vault.
I've been testing with Fabric pipelines and have created one that returns a chosen secret from one of three vaults (dev, test, prod).
For example, if I pass 'DatabricksDevopsClientSecret' and 'Dev', it returns the corresponding secret from the dev key vault.
I then want to use this dynamically in my development pipelines, but this is where I'm getting stuck.
Can anyone point me in the right direction?
Thank you
The missing piece, I believe, is that Fabric pipelines cannot directly "flow" a returned secret into downstream activity credentials the way you expect. The supported pattern is to resolve the secret at runtime from Azure Key Vault, store it in a pipeline variable or parameter, and then pass that value into activities that support expression-based authentication fields (for example, Notebook, Web, or Databricks activities). For the Databricks connections, instead of embedding secrets in the semantic model or the connection UI, parameterise the workspace URL, client ID, and secret, and bind them through pipeline parameters that read from Key Vault using the managed identity or service principal.
In practice this means: use one central pipeline to fetch the secret -> call child pipelines with that value as a parameter -> inject it into activity authentication fields via expressions. That is the only scalable way to rotate credentials across environments without re-publishing 250 models.
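As a rough sketch of the fetch step, a Web activity can call the Key Vault REST API directly, with the environment and secret name supplied as pipeline parameters. The vault naming convention `kv-databricks-<env>` below is an assumption for illustration; substitute your own vault names. This is Data Factory-style activity JSON; in a Fabric pipeline the authentication is typically configured on the Web connection rather than inline:

```json
{
  "name": "GetDatabricksSecret",
  "type": "WebActivity",
  "typeProperties": {
    "url": "@concat('https://kv-databricks-', pipeline().parameters.env, '.vault.azure.net/secrets/', pipeline().parameters.secretName, '?api-version=7.4')",
    "method": "GET",
    "authentication": { "type": "MSI", "resource": "https://vault.azure.net" }
  },
  "policy": { "secureOutput": true }
}
```

A downstream activity can then reference the retrieved value with the expression `@activity('GetDatabricksSecret').output.value`. Setting `secureOutput` to true keeps the secret out of the pipeline run logs.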
Hi nick9one1,
We are following up to see if what we shared solved your issue. If you need more support, please reach out to the Microsoft Fabric community.
Thank you.
Thank you, @Vinodh247, for your response.
Hi nick9one1,
We appreciate your inquiry through the Microsoft Fabric Community Forum.
We would like to check whether you have had a chance to review the solution provided by @Vinodh247 to resolve the issue. We hope the information provided helps to clear the query. Should you have any further queries, please feel free to contact the Microsoft Fabric community.
Thank you.