Hello,
We are using deployment pipelines in Fabric with two environments (DEV and PROD).
For notebooks, pipelines, and semantic models, deployment rules make it easy to map resources between environments. However, Dataflows Gen2 are not supported in these rules.
After deployment, the PROD dataflow still writes to the DEV Lakehouse, and we need to manually update the destination in PROD.
👉 Is there any solution or mechanism to automate this modification during deployment (for example using parameters, API, or another approach)?
Thank you very much for your help!
Thank you, @blopez11, for your response.
Hi mustaphaben,
Thank you for reaching out via the Microsoft Fabric Community Forum.
Based on my understanding, Dataflow Gen2 destination settings, such as the Lakehouse and workspace IDs, are currently stored within the dataflow definition (mashup.pq) and are not supported by deployment rules. Therefore, after deploying from DEV to PROD, the dataflow continues to point to the DEV Lakehouse unless the destination is updated manually.
Please consider the approaches covered in the links below, which may help resolve the issue:
Use public parameters in Dataflow Gen2 (Preview) - Microsoft Fabric | Microsoft Learn
Use Fabric variable libraries in Dataflow Gen2 (Preview) - Microsoft Fabric | Microsoft Learn
Items - Get Dataflow Definition - REST API (Dataflow) | Microsoft Learn
Items - Update Dataflow Definition - REST API (Dataflow) | Microsoft Learn
Dataflow definition - Microsoft Fabric REST APIs | Microsoft Learn
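To illustrate the REST-API route above, here is a minimal Python sketch of the definition rewrite step. It assumes the DEV Lakehouse GUID appears verbatim inside the base64-encoded definition parts; the surrounding Get/Update Dataflow Definition calls, endpoint paths, and authentication are not shown and should be taken from the linked documentation:

```python
import base64

def swap_lakehouse_id(definition: dict, dev_id: str, prod_id: str) -> dict:
    """Return a copy of a dataflow definition with every occurrence of
    the DEV Lakehouse GUID replaced by the PROD one.

    `definition` is the JSON body returned by the Get Dataflow Definition
    API: {"parts": [{"path": ..., "payload": <base64-encoded text>}, ...]}.
    """
    new_parts = []
    for part in definition["parts"]:
        text = base64.b64decode(part["payload"]).decode("utf-8")
        text = text.replace(dev_id, prod_id)  # swap the destination GUID
        new_parts.append(
            {**part, "payload": base64.b64encode(text.encode("utf-8")).decode("ascii")}
        )
    return {**definition, "parts": new_parts}
```

In a release pipeline, you would fetch the PROD dataflow's definition via the Get Dataflow Definition endpoint, run a rewrite like this, and post the result back via Update Dataflow Definition (a service principal token for authentication is assumed).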
We hope the above information helps to resolve the issue. If you have any further queries, please feel free to contact the Microsoft Fabric Community.
Thank you.
Hi mustaphaben,
Thank you for the follow up.
Based on my understanding, since Fabric deployment pipelines currently do not automatically switch Dataflow Gen2 destinations from the DEV Lakehouse to the PROD Lakehouse, full automation in Azure DevOps can be achieved by using either the fabric-cicd library (or fabric-cli) or the Terraform Provider for Microsoft Fabric. Both approaches eliminate the need for manual edits and enable integration of Fabric deployment into a standard CI/CD process.
For further reference, please see the following links:
fabric-cli
GitHub - microsoft/fabric-cicd: Jumpstart CICD deployments in Microsoft Fabric
Terraform Provider for Microsoft Fabric (Generally Available) | Microsoft Fabric Blog | Microsoft Fa...
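As an illustration of the fabric-cicd route: per its documentation, the library supports a parameter.yml file whose find_replace rules swap environment-specific values (such as the DEV Lakehouse GUID) during publishing. A sketch with hypothetical placeholder GUIDs — please verify the exact schema against the version of fabric-cicd you install:

```yaml
# parameter.yml placed alongside the repository items
find_replace:
    - find_value: "11111111-1111-1111-1111-111111111111"   # DEV Lakehouse ID
      replace_value:
          PROD: "22222222-2222-2222-2222-222222222222"     # PROD Lakehouse ID
```

The deployment itself is then a call to the package's publish function inside an Azure DevOps task, with the target environment name selecting which replace_value is applied.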
We hope that the information provided will help to resolve the issue. Should you have any further queries, please feel free to contact the Microsoft Fabric community.
Thank you.
Hi mustaphaben,
We would like to follow up and see whether the details we shared have resolved your problem.
If you need any more assistance, please feel free to connect with the Microsoft Fabric community.
Thank you.
Hi, we are currently modifying the workspaces manually, but we are looking for a CI/CD solution with Azure DevOps. If deployment pipelines cannot address this need, is there an alternative you would recommend? Thank you.
Hi @mustaphaben,
We have faced the exact same challenge when moving from DEV → PROD using Deployment Pipelines with Dataflows Gen2.
Here’s what we found:
Deployment rules today support notebooks, pipelines, semantic models, and connections — but not Dataflows Gen2.
The source part can be parameterized in Power Query M (using query parameters), which lets you switch between DEV/PROD data sources automatically.
The destination (sink) of a Dataflow Gen2 is not currently configurable via deployment rules, so after deployment it still writes to the DEV Lakehouse.
As workarounds, we either manually update the destination Lakehouse in PROD after each deployment, or we keep Dataflows Gen2 in a separate workspace that is not included in the deployment pipeline, to avoid overwriting the PROD configuration with the DEV one.
Hopefully this will be added in an upcoming update, since it's a common need when using Fabric at scale across multiple environments.
Best regards,
Antoine
Yes, in my opinion this is the last remaining pain point in CI/CD support for Dataflows Gen2. I would be interested in an easy way to do this as well, as I am not aware of one at this point.
May want to take a look at this thread: