ChrisPerrin
New Member

Databricks and Deployment Pipelines

We have a Power BI report that uses a Databricks dataset. We also use deployment pipelines to publish reports from Dev -> Staging -> Prod.

 

Our problem is changing the dataset, or a connection within the dataset, for each new environment. The issue is that we can't use parameters in a Databricks connection string, and we don't want to copy the dataset for each environment.

 

Has anyone run across this?

 

Note: we are using the Azure Databricks Connector.

Nimrod_Shalit
Power BI Team

You can manage the dataset outside the pipeline and keep only the reports in it.

This way, when you copy the report across stages, it keeps its original connection to the same dataset.

You will then have three instances of the report, all connected to one dataset in a separate workspace.
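
If you want to sanity-check that setup, here is a minimal Python sketch using the Power BI REST API (Reports - Get Reports In Group). The access token and workspace IDs are placeholders; it simply lists each stage's report copy and the dataset it is bound to, so you can confirm all three copies still point at the single shared dataset:

```python
import requests

# Placeholders - substitute your own Azure AD access token and workspace IDs.
ACCESS_TOKEN = "<azure-ad-access-token>"
STAGE_WORKSPACES = {
    "Dev": "<dev-workspace-id>",
    "Staging": "<staging-workspace-id>",
    "Prod": "<prod-workspace-id>",
}

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

for stage, workspace_id in STAGE_WORKSPACES.items():
    # Reports - Get Reports In Group
    url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/reports"
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()
    for report in resp.json()["value"]:
        # Each report copy should show the same datasetId if they all
        # point at the one shared dataset that lives outside the pipeline.
        print(f"{stage}: report '{report['name']}' -> dataset {report['datasetId']}")
```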

lbendlin
Super User

You can point all pipeline workspaces to the same data source. It defeats the purpose a little bit, but you gotta do what you gotta do.
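
For what it's worth, a quick way to verify that every stage really is hitting the same Databricks source is to compare the data source details of the dataset in each workspace. A rough Python sketch (workspace/dataset IDs and the token are placeholders) using the Power BI REST API (Datasets - Get Datasources In Group):

```python
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"
# Dataset copies per stage (placeholders).
STAGE_DATASETS = {
    "Dev": ("<dev-workspace-id>", "<dev-dataset-id>"),
    "Staging": ("<staging-workspace-id>", "<staging-dataset-id>"),
    "Prod": ("<prod-workspace-id>", "<prod-dataset-id>"),
}

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

for stage, (workspace_id, dataset_id) in STAGE_DATASETS.items():
    # Datasets - Get Datasources In Group
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
           f"/datasets/{dataset_id}/datasources")
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()
    for ds in resp.json()["value"]:
        # connectionDetails carries the Databricks connection info;
        # all three stages should report the same values.
        print(f"{stage}: {ds['datasourceType']} -> {ds.get('connectionDetails')}")
```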
