mustaphaben
Frequent Visitor

Automating Dataflow Gen2 destination changes in deployment pipelines

Hello,

 

We are using deployment pipelines in Fabric with two environments (DEV and PROD).

For notebooks, pipelines, and semantic models, deployment rules make it easy to map resources between environments. However, Dataflows Gen2 are not supported in these rules.

After deployment, the PROD dataflow still writes to the DEV Lakehouse, and we need to manually update the destination in PROD.

 

👉 Is there any solution or mechanism to automate this modification during deployment (for example using parameters, API, or another approach)?

 

Thank you very much for your help!


7 REPLIES
v-pnaroju-msft
Community Support

Hi mustaphaben,

Thank you for the follow up.

Based on my understanding, since Fabric deployment pipelines currently do not automatically switch Dataflow Gen2 destinations from the DEV Lakehouse to the PROD Lakehouse, full automation in Azure DevOps can be achieved by using one of the following methods:

  1. Install the Fabric CLI in the Azure DevOps pipeline and use the fabric-cicd toolkit to publish items from the Git repository to the target workspace, with parameter substitution for the PROD values (see the sketch after this list).
  2. Use the Microsoft Fabric Terraform provider in the pipeline to declaratively provision and manage workspaces, dataflows, and destinations. Run terraform plan and terraform apply as part of the release so that PROD dataflows automatically point to the correct Lakehouse.
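
To make option 1 more concrete, here is a minimal sketch of a deployment step using the fabric-cicd Python library, following the pattern from its README. The workspace ID, repository path, and item type list are placeholders, and the parameter.yml rules that map the DEV IDs to PROD values are assumed to exist in the repository; please verify the exact option names against the library's documentation.

```python
# Minimal sketch of an Azure DevOps deployment step with fabric-cicd.
# Assumes the Fabric items were synced to Git from the DEV workspace, and that
# a parameter.yml in the repository defines find/replace rules that swap the
# DEV workspaceId/lakehouseId for the PROD values. All IDs and paths below
# are placeholders.
from fabric_cicd import FabricWorkspace, publish_all_items

target_workspace = FabricWorkspace(
    workspace_id="<prod-workspace-guid>",   # placeholder: PROD workspace to deploy into
    environment="PROD",                     # selects the PROD values in parameter.yml
    repository_directory="./workspace",     # placeholder: root folder of the synced items
    item_type_in_scope=["Dataflow"],        # check the docs for the exact Dataflow Gen2 type name
)

# Publish all in-scope items to the PROD workspace, applying the
# environment-specific find/replace substitutions on the way out.
publish_all_items(target_workspace)
```

Run inside the release pipeline (for example, after pip install fabric-cicd), this replaces the manual destination edit with a repeatable, reviewable step.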

Both approaches eliminate the need for manual edits and enable integration of Fabric deployment into a standard CI/CD process.

For further reference, please see the following links:
fabric-cli
GitHub - microsoft/fabric-cicd: Jumpstart CICD deployments in Microsoft Fabric
Terraform Provider for Microsoft Fabric (Generally Available) | Microsoft Fabric Blog | Microsoft Fa...

We hope that the information provided will help to resolve the issue. Should you have any further queries, please feel free to contact the Microsoft Fabric community.

Thank you.

v-pnaroju-msft
Community Support

Hi mustaphaben,

We would like to follow up and see whether the details we shared have resolved your problem.
If you need any more assistance, please feel free to connect with the Microsoft Fabric community.

Thank you.

mustaphaben

Hi,

We are currently modifying the workspaces manually. We are looking for a CI/CD solution with Azure DevOps. If deployment pipelines cannot address this need, is there an alternative you would recommend?

Thank you.

AntoineW
Memorable Member

Hi @mustaphaben

 

We have faced the exact same challenge when moving from DEV → PROD using Deployment Pipelines with Dataflows Gen2.

Here’s what we found:

  • Deployment rules today support notebooks, pipelines, semantic models, and connections — but not Dataflows Gen2.

  • The source part can be parameterized in Power Query M (using query parameters), which lets you switch between DEV/PROD data sources automatically.

  • The destination (sink) of a Dataflow Gen2 is not currently configurable via deployment rules, so after deployment it still writes to the DEV Lakehouse.

What we did as a workaround:

  • We manually update the destination Lakehouse in PROD after each deployment, or

  • We keep Dataflows Gen2 in a separate workspace that is not included in the deployment pipeline, to avoid overwriting the PROD configuration with the DEV configuration.

 

Hopefully this will be added in an upcoming update, since it's a common need when using Fabric at scale with multiple environments.

 

Best regards,

Antoine

v-pnaroju-msft
Community Support

Thank you, @blopez11, for your response.

Hi mustaphaben,

Thank you for reaching out via the Microsoft Fabric Community Forum.

Based on my understanding, Dataflow Gen2 destination settings, such as the Lakehouse and workspace IDs, are currently stored within the dataflow definition (mashup.pq) and are not supported by deployment rules. Therefore, after deploying from DEV to PROD, the dataflow continues to point to the DEV Lakehouse unless the destination is updated manually.

Please consider the following approaches that may help resolve the issue:

  1. Store the target workspaceId and lakehouseId in a Variable Library and reference them in the dataflow using Variable.ValueOrDefault("$(/**/LibraryName/VariableName)"). Assign different values for DEV and PROD, and the correct Lakehouse will be applied automatically during deployment.
  2. After deployment, call the Get Dataflow Definition API to download the dataflow definition, replace the DEV workspaceId and lakehouseId with the PROD values, and then push the changes back using the Update Dataflow Definition API (see the sketch after this list).
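
To illustrate option 2, below is a minimal sketch of the post-deployment swap using the Get/Update Dataflow Definition REST APIs. All GUIDs and the bearer token are placeholders, the base64 part handling assumes the definition is returned in the same parts format as other Fabric item APIs, and long-running-operation (202) polling is omitted for brevity.

```python
# Minimal sketch: rewrite a deployed dataflow's destination via the Fabric REST
# APIs. GUIDs and the token below are placeholders; in production, acquire the
# token via azure-identity and handle 202 long-running-operation responses.
import base64
import requests

API = "https://api.fabric.microsoft.com/v1"
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder token

workspace_id = "<prod-workspace-guid>"    # placeholder
dataflow_id = "<dataflow-guid>"           # placeholder
dev_lakehouse = "<dev-lakehouse-guid>"    # placeholder: ID to replace
prod_lakehouse = "<prod-lakehouse-guid>"  # placeholder: replacement ID

# 1. Download the dataflow definition (Items - Get Dataflow Definition).
resp = requests.post(
    f"{API}/workspaces/{workspace_id}/dataflows/{dataflow_id}/getDefinition",
    headers=HEADERS,
)
resp.raise_for_status()
definition = resp.json()["definition"]

# 2. Decode each base64 part (e.g. mashup.pq), swap the DEV lakehouse ID for
#    the PROD one, and re-encode.
for part in definition["parts"]:
    text = base64.b64decode(part["payload"]).decode("utf-8")
    text = text.replace(dev_lakehouse, prod_lakehouse)
    part["payload"] = base64.b64encode(text.encode("utf-8")).decode("utf-8")

# 3. Upload the edited definition (Items - Update Dataflow Definition).
resp = requests.post(
    f"{API}/workspaces/{workspace_id}/dataflows/{dataflow_id}/updateDefinition",
    headers=HEADERS,
    json={"definition": definition},
)
resp.raise_for_status()
print("Dataflow now targets the PROD Lakehouse.")
```

The same replacement can also cover the DEV workspaceId; running a script like this as the final stage of the release removes the manual PROD edit.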

Please refer to the links below for additional information:
Use public parameters in Dataflow Gen2 (Preview) - Microsoft Fabric | Microsoft Learn
Use Fabric variable libraries in Dataflow Gen2 (Preview) - Microsoft Fabric | Microsoft Learn
Items - Get Dataflow Definition - REST API (Dataflow) | Microsoft Learn
Items - Update Dataflow Definition - REST API (Dataflow) | Microsoft Learn
Dataflow definition - Microsoft Fabric REST APIs | Microsoft Learn

We hope the above information helps to resolve the issue. If you have any further queries, please feel free to contact the Microsoft Fabric Community.

Thank you.

blopez11
Super User

Yes, this is, IMO, the last remaining pain point in CI/CD support for Dataflows Gen2. I would be interested in an easy way to do this as well, as I am not aware of any at this point.
