Hi Developers,
I’m working with multiple databases, Dev (Dev_WH), QA (QA_WH), and Prod (Prod_WH), that all contain the same datasets.
Currently, when I deploy a Power BI report from one environment to another (e.g., from Dev to QA to Prod), I have to manually change the data source in Power BI Desktop, republish the report, and then update the data source rules in the Power BI Service for each environment.
Is there a way to automate this data source switching process during deployment without using Dataflows?
If there are any video tutorials or step-by-step guides on how to achieve this setup, I’d greatly appreciate your recommendations.
Thank you!
Yes, you can automate data source switching across your Dev, QA, and Prod environments in Power BI without relying on Dataflows. This can be achieved with Deployment Pipelines in conjunction with either Parameters or Data Source Rules, depending on your setup.
The parameter approach is the more versatile of the two and also works for data sources that Deployment Pipelines do not natively support: define ServerName and DatabaseName parameters in Power Query, reference them in your connection, and override their values per stage with parameter rules:
Source = Sql.Database(ServerName, DatabaseName)
Note: After each deployment, refresh the dataset so the new parameter values take effect.
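For reference, a parameter created through Manage Parameters in Power BI Desktop is stored internally as a small M query of roughly this shape. This is a sketch for illustration only; the default value below is a hypothetical Dev server name, not something from the original post:

```m
// "ServerName" parameter query as Power BI stores it.
// The default value is what the Dev stage uses; a parameter rule
// in the deployment pipeline overrides it for QA and Prod.
"dev-sql.database.windows.net" meta [
    IsParameterQuery = true,
    Type = "Text",
    IsParameterQueryRequired = true
]
```

A matching DatabaseName parameter is defined the same way with the Dev database as its default.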
If your data sources are among those supported by Deployment Pipelines (e.g., Azure SQL Database, SQL Server, Oracle), you can use Data Source Rules:
Note: Data Source Rules require that the data source types remain consistent across environments.
For a visual walkthrough, refer to the following tutorial:
Update Data Sources within Deployment Pipelines
Implementing these strategies will streamline your deployment process and reduce manual intervention. If you need assistance with setting up parameters or configuring deployment pipelines, feel free to ask!
✔️ If my message helped solve your issue, please mark it as Resolved! 👍 If it was helpful, consider giving it a Kudos!
Hi @Jozuna
Thanks for your post.
When working with deployment pipelines across Dev, QA, and Prod environments, the recommended approach to handle automatic data source switching is by using parameters, combined with deployment rules.
Overview of the setup:
Define Parameters in Power BI Desktop
Create parameters for server and database names (e.g. ServerName, DatabaseName) and use them in your data source connection.
Publish to the Deployment Pipeline (Dev Stage)
Once the report is published to a workspace assigned to the pipeline, you can manage rules between stages.
Configure Deployment Rules in Power BI Service
Open the deployment pipeline, select the gear icon between stages, and add Data Source Rules or Parameter Rules to change the values used in each environment.
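The steps above can be sketched in M as follows. The Source step references the two parameters from step 1, and the parameter rule configured in step 3 swaps their values per stage; the schema and table names below are hypothetical placeholders:

```m
let
    // ServerName and DatabaseName are the parameters from step 1;
    // a parameter rule (step 3) overrides their values in QA and Prod,
    // so the same query connects to a different warehouse per stage.
    Source = Sql.Database(ServerName, DatabaseName),
    // Navigate to a table — "dbo"/"Sales" are placeholder names.
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data]
in
    Sales
```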
Additional resources you might find helpful:
Create deployment rules for Fabric's ALM - Microsoft Fabric | Microsoft Learn
Solved: Deployment pipeline based parameters for datasets ... - Microsoft Fabric Community
Power BI — deployment pipelines — auto-binding and deployment rules | by Michal Molka | Medium
Changing the source Lakehouse of Power BI Direct Lake models in deployment pipelines
Update Data Sources within Deployment Pipelines - YouTube
Hope this helps. Please reach out for further assistance.
If this post helps, please consider accepting it as the solution so other members can find it more quickly; a kudos would also be appreciated.
Thank you.
Hi @SolomonovAnton, @v-veshwara-msft
My report is in a Trial workspace and uses Import mode.
The thing is: for Dev I use the Fabric_DEV_WH database, for QA it's Fabric_QA_WH, and in Prod it's Fabric_PROD_WH.
let
    Source = Sql.Database(Server_Name, DB_Name)
in
    Source
It's still not working; when I try to deploy to QA, I get the following error.
Dev to QA Deployment error:
Can't start the deployment
When deploying the items below, any related items must be included in the deployment or must already exist in the target folder.
Please check the technical details for more information. If you contact support, please provide these details.
Why is this happening, and how can I resolve it?