I have a dataflow that we have published in a workspace that is part of a deployment pipeline (DEV-QA-PROD).
I have defined parameters to allow changing the workspace Id and dataflow Id in Power BI Service and in the deployment pipeline rules.
The issue I have is that one specific dataset does not follow the parameter values and strictly points to PROD.
So for example, I have this parameter for the workspace Id:
/// DEV: workspace_dev_id
/// QA: workspace_qa_id
/// PRD: workspace_prod_id
expression WS_WORKSPACE_NAME = "workspace-dev-id" meta [IsParameterQuery=true, List={"workspace-dev-id", "workspace-qa-id", "workspace-prod-id"}, DefaultValue=..., Type="Text", IsParameterQueryRequired=true]
And this parameter for the dataflow Id:
/// DEV: dataflow_dev_id
/// QA: dataflow_qa_id
/// PRD: dataflow_prod_id
expression DF_DATAFLOW_NAME = "dataflow-dev-id" meta [IsParameterQuery=true, List={"dataflow-dev-id", "dataflow-qa-id", "dataflow-prod-id"}, DefaultValue=..., Type="Text", IsParameterQueryRequired=true]
Now, when I want a query to use those parameters, I create the request using the "Get data" wizard and then replace the `workspaceId` and `dataflowId` values with the parameter names:
let
    Source = PowerPlatform.Dataflows([]),
    Workspaces = Source{[Id="Workspaces"]}[Data],
    workspaceId = Workspaces{[workspaceId=WS_WORKSPACE_NAME]}[Data],
    dataflowId = workspaceId{[dataflowId=DF_DATAFLOW_NAME]}[Data],
    entity = dataflowId{[entity="Entity Name"]}[Data]
in
    entity
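For reference, an equivalent navigation that filters explicitly on the parameter values, rather than relying on the key lookups, would look something like the sketch below (it assumes the workspaceId / dataflowId column names exposed by the connector):
let
    Source = PowerPlatform.Dataflows([]),
    Workspaces = Source{[Id="Workspaces"]}[Data],
    // Filter the navigation tables on the parameter values explicitly
    SelectedWorkspace = Table.SelectRows(Workspaces, each [workspaceId] = WS_WORKSPACE_NAME){0}[Data],
    SelectedDataflow = Table.SelectRows(SelectedWorkspace, each [dataflowId] = DF_DATAFLOW_NAME){0}[Data],
    Entity = SelectedDataflow{[entity="Entity Name"]}[Data]
in
    Entity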
I just realized that whatever value I choose in the list for either parameter, the query still points to the PROD values.
Even after it is deployed to Power BI Service it stays that way: I can change the dataset's parameter values, but it keeps reading from PROD.
I even tried creating new parameters, but it still returns PROD values.
Any ideas what I should look for?
*edit*: The biggest issue is that even if the workspace parameter is set to connect to DEV, and the dataflow Id is one from QA, it still returns data from DEV.
Hi, I'm having the issue in Power BI Desktop. Power BI Service is set up correctly for parameter overrides.
Hi @FireFighter1017 ,
Thanks for reaching out to the Microsoft fabric community forum.
This behavior occurs because Power BI deployment pipelines automatically override dataset parameters based on the rules defined in the deployment pipeline's environment configuration.
So even if:
- You change the parameter in Power BI Desktop, or
- You republish with DEV/QA values manually,
the deployment pipeline automatically replaces the parameter values with the values configured for the current environment (which for you appears to be PROD).
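To illustrate the effect (a sketch of what the override amounts to, not the exact text Power BI stores): a QA parameter rule only swaps the parameter's current value and leaves the list and metadata untouched, so the deployed model behaves as if the expression were:
expression WS_WORKSPACE_NAME = "workspace-qa-id" meta [IsParameterQuery=true, List={"workspace-dev-id", "workspace-qa-id", "workspace-prod-id"}, DefaultValue=..., Type="Text", IsParameterQueryRequired=true]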
Do the following in Power BI Service:
1. Navigate to the deployment pipeline that contains your dataset.
2. Go to the middle environment (e.g., QA) or DEV.
3. In the Deployment Settings, look at the Parameter rules.
4. You will likely see the parameters WS_WORKSPACE_NAME and DF_DATAFLOW_NAME pointing to PROD values.
5. Change them to the appropriate DEV/QA values for that environment.
6. Save the rules.
If you do not want deployment pipelines to override your parameters:
In Deployment Settings - under each parameter, uncheck "Override during deployment".
However, it’s usually better to configure them properly per environment, not disable them.
To test locally in Power BI Desktop:
1. Set the parameter to a QA/DEV value.
2. Confirm that the query editor preview pulls data from QA/DEV.
3. Publish to a specific workspace (DEV or QA).
4. In Power BI Service, immediately go to dataset settings and check the parameter values after publishing.
5. If the dataset is part of a pipeline, make sure the pipeline isn't overriding them on publish or deployment.
This will ensure that your workspace and dataflow IDs are correctly referenced per environment and that parameters behave as expected.
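As a quick sanity check, you could also load a small diagnostic query into the model that surfaces the current parameter values, so you can see at a glance which environment each copy of the dataset is actually pointing at (a sketch; the parameter names are the ones from your post):
let
    // One row per parameter, showing the values the model is currently using
    CurrentParameterValues = #table(
        type table [Parameter = text, Value = text],
        {
            {"WS_WORKSPACE_NAME", WS_WORKSPACE_NAME},
            {"DF_DATAFLOW_NAME", DF_DATAFLOW_NAME}
        }
    )
in
    CurrentParameterValues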
If the response has addressed your query, please Accept it as a solution and give a 'Kudos' so other members can easily find it
Best Regards,
Sreeteja.
Community Support Team
Also, DEV can't have deployment rules. Even though I change the parameter values in the dataset settings in Power BI Service for DEV, it still reads data from PROD. I also tested with the workspace parameter set to QA's GUID and the dataflow parameter set to DEV's GUID, and it still returns data from PROD.
As I mentioned, the biggest issue is that we did your "extra validation" and we still see this behavior in Power BI Desktop.
Because we are using Git integration with GitHub, we save our reports as PBIP. So when I open the report in Power BI Desktop, I do it from the local repository.
The report in the Power BI Service DEV workspace is updated through Git integration from GitHub.
Hi @FireFighter1017 ,
Thanks for reaching out to the Microsoft fabric community forum.
You're absolutely right: this doesn't look like a problem with parameter overrides in the Power BI Service or deployment rules. Instead, the real issue seems to be happening in Power BI Desktop, particularly when you're using the PBIP format with Git integration and connecting to data through PowerPlatform.Dataflows().
What's likely happening:
Even though your M code is correctly using parameters for the workspace and dataflow IDs, Power BI Desktop seems to cache the last successful connection made via the PowerPlatform.Dataflows() connector. This can cause unexpected behavior in PBIP projects, because:
The connections.json file might still store old PROD connection details, even after you’ve changed the parameters.
The PowerPlatform.Dataflows() connector doesn’t always re-check the parameters unless it’s explicitly forced to.
So, even if your query is fully dynamic, Power BI Desktop might silently keep using the old PROD data behind the scenes.
This caching behavior can make it look like your parameter changes aren’t working when in fact, Desktop is just clinging to what it connected to last.
Try these steps:
1. Clear Data Source Settings in Desktop:
- Go to File > Options and Settings > Data Source Settings.
- Clear permissions for Power Platform and remove cached connections.
2. Inspect PBIP Files in Git:
- Check parameters.json – confirm the right values are set.
- Check connections.json – this may have hardcoded workspace/dataflow IDs that override parameters.
3. Regenerate PBIP Cleanly:
- Try opening a clean .pbix (not from PBIP), and test parameters for DEV.
- Once confirmed working, export it as PBIP again.
- Compare the new connections.json with your repo version.
4. Consider switching to OData.Feed():
Instead of using PowerPlatform.Dataflows(), try:
let
    url = "https://api.powerbi.com/powerbi/globaldataflows/v1.0/myorg/groups/" & WS_WORKSPACE_NAME & "/dataflows/" & DF_DATAFLOW_NAME,
    data = OData.Feed(url)
in
    data
This gives full control over the connection URI and respects parameters reliably.
The issue is most likely caused by how PowerPlatform.Dataflows() behaves in PBIP projects: it's not always reactive to parameter changes due to caching and metadata binding. Using OData.Feed() or cleaning out connections.json may help regain control.
If the response has addressed your query, please Accept it as a solution and give a 'Kudos' so other members can easily find it
Best Regards,
Sreeteja.
Community Support Team
hi @v-sshirivolu ,
I have good knowledge of how deployment rules can change dataset parameters in Power BI Service.
My issue is that even in Power BI Desktop, if I use a workspace parameter set to workspace-prod-id and a dataflow parameter set to dataflow-dev-id, the query still works and returns whatever functional connection I had before (e.g. workspace-prod-id, dataflow-prod-id).