We have a Power BI report that uses a Databricks dataset. We also use deployment pipelines to publish reports from Dev -> Staging -> Prod.
Our problem is changing the dataset, or a connection within the dataset, for each new environment. The issue is that we can't use parameters in a Databricks connection string, and we don't want to copy the dataset for each environment.
Has anyone run across this?
Note: we are using the Azure Databricks Connector.
You can manage the dataset outside the pipeline and keep only the reports in it.
That way, when you copy a report across stages, it keeps its original connection to the same dataset.
You end up with three instances of the report, all connected to one dataset that lives in a workspace outside the pipeline.
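If you go this route, the per-stage copies of the report can be pointed at the shared dataset with the Power BI REST API's Reports - Rebind Report In Group endpoint. A minimal Python sketch follows; the workspace, report, and dataset IDs and the access token are placeholders you would supply, not values from this thread.

```python
import json
import urllib.request

# Power BI REST API: Reports - Rebind Report In Group
# POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports/{reportId}/Rebind
# All IDs and the token below are illustrative placeholders.

def build_rebind_request(group_id: str, report_id: str,
                         dataset_id: str, token: str) -> urllib.request.Request:
    """Build the HTTP request that rebinds a report to a shared dataset."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/reports/{report_id}/Rebind")
    body = json.dumps({"datasetId": dataset_id}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Example: point the Prod copy of the report at the single shared dataset.
req = build_rebind_request(
    group_id="<prod-workspace-id>",
    report_id="<report-id>",
    dataset_id="<shared-dataset-id>",
    token="<aad-access-token>",
)
print(req.full_url)
print(req.data.decode("utf-8"))
```

Running this once per stage (Dev, Staging, Prod) after deployment keeps all three report copies bound to the one dataset outside the pipeline.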
You can also point all pipeline workspaces to the same data source. It defeats the purpose a little, but you gotta do what you gotta do.