I have a Dataflow Gen2 in my dev workspace, and have created a deployment pipeline to copy this into my test and prod workspaces.
The dataflows in the deployed workspaces still output to lakehouse tables in my dev workspace.
Is there a way to update the deployed Dataflows in bulk so that they output to their local lakehouses?
Hi @dolphinantonym,
Just wanted to check in on your question. We haven't heard back and want to make sure you're not stuck. If you need anything else or have updates to share, we're here to help!
Let us know if you need any additional support.
Thank you.
Hi @dolphinantonym,
We wanted to follow up to see if our suggestion was helpful. Please let us know how things are progressing and if you are still encountering any issues.
Thank you.
Hi @dolphinantonym,
Since we haven't received a response, could you confirm whether the reply above clarified your issue, or let us know if there is anything else we can help with?
Your understanding and patience will be appreciated.
Hi @dolphinantonym,
When deploying Dataflows Gen2 from Dev to Test or Prod using deployment pipelines, the dataflows retain their original connections and destinations. As a result, they continue to write data to the Lakehouse in your Dev workspace after deployment.
Currently, Fabric does not offer a built-in feature to automatically update all Dataflow outputs in bulk during deployment. The deployment pipeline simply copies the dataflow without modifying the destination path.
If you have only a few dataflows, you can manually update the data destination in each dataflow within the Test or Prod workspace to point to the local Lakehouse. While this works for a small number, it is not efficient for many dataflows.
For larger deployments, consider using parameters or variable libraries. By defining variables such as the Lakehouse or table name with environment-specific values, you can ensure dataflows write to the correct Lakehouse after deployment.
If you already have many dataflows, the Fabric REST API allows you to update them in bulk. You can use a script to loop through all dataflows in your Test or Prod workspace and change their sink settings to the local Lakehouse, making this process much faster and suitable for automation after deployment.
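To make that concrete, here is a minimal sketch (not an official sample) of the "loop through all dataflows" step using the core Fabric Items REST API in Python. The bearer token, workspace ID, and the exact item type string used for Dataflow Gen2 are placeholders and assumptions you would need to verify against the current REST documentation:

```python
# Hedged sketch: list the Dataflow items in a target workspace so a script can
# loop over them. Assumes you already have an Entra ID bearer token with the
# Fabric API scope; the workspace ID is a placeholder, and the item type string
# for Dataflow Gen2 should be checked against the current docs.
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def list_dataflows(workspace_id: str, token: str) -> list[dict]:
    """Return the Dataflow items in the given workspace."""
    headers = {"Authorization": f"Bearer {token}"}
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items"
    resp = requests.get(url, headers=headers, params={"type": "Dataflow"}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("value", [])

if __name__ == "__main__":
    token = "<bearer-token>"            # acquire via MSAL / azure-identity (not shown)
    test_workspace = "<test-workspace-id>"
    for df in list_dataflows(test_workspace, token):
        print(df["id"], df["displayName"])
```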
CI/CD and ALM solution architectures for Dataflow Gen2 - Microsoft Fabric | Microsoft Learn
Dataflow Gen2 data destinations and managed settings - Microsoft Fabric | Microsoft Learn
Overview of Fabric deployment pipelines - Microsoft Fabric | Microsoft Learn
Thank you.
By defining variables such as the Lakehouse or table name with environment-specific values, you can ensure dataflows write to the correct Lakehouse after deployment.
By "environment-specific", do you mean using the deployment pipeline to set the variables correctly for test and prod environments as part of deployment?
If so, is there a way to do the same thing for feature branches that are created from source control rather than via deployment pipelines?
Also, do you know where to insert the variable into the dataflow? I can't find where it is editable, and the only place I can see the output Workspace being referenced is in a file called "mashup.pq" when I look at my Dataflow in GitHub, but I can't see where to find/edit it from within Fabric.
Hi @dolphinantonym,
When deploying Dataflows Gen2 using deployment pipelines, the dataflows keep their original connection and destination settings. As a result, even after moving from Dev to Test or Prod, they continue to write to the Lakehouse in the Dev workspace. Currently, Fabric does not provide an automated way to update all dataflow outputs in bulk during deployment.
To handle this, you can use Variable Libraries to maintain environment-specific configurations. Each stage in the pipeline (Dev, Test, Prod) can have its own active value set in the same variable library. These values let you reference environment-specific parameters, such as workspace IDs, Lakehouse names, or table identifiers, without manual changes after each deployment. At runtime, the dataflow uses the active value set for its environment, so the correct configuration is applied.
If you use Git-based source control, Variable Libraries are Git-enabled and versioned with your dataflows. When you create a workspace from a feature branch, you can select the active value set for that branch or environment, allowing flexible testing and management of configurations.
In the dataflow’s mashup.pq file, you can reference variables using Power Query functions like Variable.Value() or Variable.ValueOrDefault(). For example, Variable.Value("$(/**/My Library/Workspace ID)") gets the workspace ID dynamically. However, Variable Libraries can only control parameters used inside the query logic and cannot rebind the output sink set in the dataflow UI.
For larger environments with many dataflows, you can use Fabric REST APIs to automate rebinding. A script can loop through dataflows in the target workspace and update their destination properties, providing scalable automation to ensure deployed dataflows write to the correct Lakehouses after deployment.
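As an illustration only, the sketch below shows what that rebinding step could look like in Python, assuming the Dataflow Gen2 (CI/CD) items in your workspace expose the generic getDefinition/updateDefinition item APIs and that the Dev workspace and Lakehouse GUIDs appear verbatim in mashup.pq. Both calls can return 202 long-running operations, so treat this as a starting point and test it against a non-production workspace first:

```python
# Hedged sketch: rebind one deployed dataflow by rewriting the workspace and
# lakehouse GUIDs inside its definition, then pushing the change back. Relies on
# the generic item getDefinition/updateDefinition APIs; production code should
# also handle 202 responses by polling the long-running operation. All IDs are
# placeholders.
import base64
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def rebind_dataflow(workspace_id: str, item_id: str, token: str,
                    id_map: dict[str, str]) -> None:
    """Replace old workspace/lakehouse GUIDs with new ones in the item definition."""
    headers = {"Authorization": f"Bearer {token}"}
    base = f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}"

    # 1. Download the item definition (parts are base64-encoded text files such
    #    as mashup.pq and queryMetadata.json).
    resp = requests.post(f"{base}/getDefinition", headers=headers, timeout=60)
    resp.raise_for_status()
    parts = resp.json()["definition"]["parts"]

    # 2. Rewrite any part that still references the Dev GUIDs.
    for part in parts:
        text = base64.b64decode(part["payload"]).decode("utf-8")
        for old_id, new_id in id_map.items():
            text = text.replace(old_id, new_id)
        part["payload"] = base64.b64encode(text.encode("utf-8")).decode("ascii")

    # 3. Upload the patched definition.
    resp = requests.post(f"{base}/updateDefinition", headers=headers,
                         json={"definition": {"parts": parts}}, timeout=60)
    resp.raise_for_status()

# Usage idea: map the Dev workspace/lakehouse GUIDs to the Prod ones, then call
# rebind_dataflow() for every dataflow returned by the listing call shown
# earlier in this thread, e.g.
# rebind_dataflow("<prod-workspace-id>", "<dataflow-item-id>", "<bearer-token>",
#                 {"<dev-workspace-guid>": "<prod-workspace-guid>",
#                  "<dev-lakehouse-guid>": "<prod-lakehouse-guid>"})
```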
Lifecycle Management of the Microsoft Fabric Variable library - Microsoft Fabric | Microsoft Learn
Use Fabric variable libraries in Dataflow Gen2 (Preview) - Microsoft Fabric | Microsoft Learn
Thank you.
Right now, deployment pipelines don’t automatically rebind Dataflows Gen2 outputs. When you deploy from Dev to Test/Prod, the dataflows still point to the original Dev lakehouse.
There isn’t a bulk update option yet. The workarounds are:
1. Manually update the output lakehouse in each deployed dataflow
2. Or use the Dataflow API / JSON export to script a find-and-replace of the lakehouse/workspace IDs before importing into Test/Prod
So for now it’s either manual changes or some custom automation. Hopefully Microsoft will add support for re-binding dataflow outputs in deployment pipelines in the future.
Hi @dolphinantonym,
Deployment rules don't currently support Dataflow Gen2: Create deployment rules for Fabric's ALM - Microsoft Fabric | Microsoft Learn
You might be able to use variable libraries to do this however:
Get Started with Variable Libraries - Microsoft Fabric | Microsoft Learn
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.