Good afternoon,
I have a set of workspaces that I created for the purpose of housing my dataflows: one each for dev, test, and prod. Is there any reason I would want to set these dataflows up in a deployment pipeline? I wouldn't want the dataflows to be overwritten by the pipeline, so what is the purpose?
For example, my Test and Dev workspaces point to my Dev database, and my Prod workspace points to my Prod database. If the workspaces were in a deployment pipeline, wouldn't my Prod workspace just get overwritten with the Dev configuration?
In a deployment pipeline, you can set deployment rules for the data source and parameters of each item, including dataflows. So when you deploy from dev to test to prod, you can change the connection string for the database connection at each stage.
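For a concrete example, here is a minimal sketch in Power Query M, assuming a SQL Server source; the parameter, server, schema, and table names are placeholders, not anything from this thread:

// Parameters created in Power Query Online ("Manage parameters"); a parameter
// rule in the deployment pipeline can override their values per stage.
ServerName = "dev-sql.contoso.com" meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]
DatabaseName = "SalesDev" meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]

// Dataflow query that reads from whichever server/database the current
// stage's rule resolves the parameters to.
let
    Source = Sql.Database(ServerName, DatabaseName),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data]
in
    Orders

With the connection parameterized like this, there is only one copy of the dataflow to maintain; the pipeline's rules (or the item's settings in each stage's workspace) swap the Dev connection for the Prod one on deployment, so Prod is never left pointing at the Dev database.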
How would you actually go about doing this? When I look in the deployment rules, the "Data source rules" option is greyed out, so it's a bit confusing to me. It says I can go to "artifact settings" to change the data source settings, but when I go there, I don't see an option to change the actual data source.
Thank you, I'm going through that process and I do see those options. It looks like it's asking me to modify the settings from the artifact settings. I will go through this process and see how it works. Thanks!