YashnaChinta
New Member

Deployment using Dataflow Gen2

Hello,

I need some guidance regarding Fabric deployment pipelines. When deploying from development to production, how can we automate the configuration of the data destination for Dataflow Gen2, given that the lakehouses have different names?

5 REPLIES
AntoineW
Responsive Resident

Hello @YashnaChinta,

I was facing this problem too, and from what I've seen you can't directly automate the data destination: you have to change it manually after deployment, which is a shame.
However, in Power Query M, you can still access identifiers such as workspaceId or lakehouseId, and those can be parameterized.

📌Useful reference: Dataflow parameters
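
To illustrate, here is a minimal Power Query M sketch of that navigation pattern, assuming two hypothetical text parameters named WorkspaceIdParam and LakehouseIdParam (the parameter names are mine, not from the product):

    let
        // Lakehouse.Contents(null) returns the navigation table of
        // workspaces and lakehouses the dataflow can reach
        Source = Lakehouse.Contents(null),
        // navigating by ID rather than display name lets the same query
        // run in Dev and Prod once the parameters are overridden
        Workspace = Source{[workspaceId = WorkspaceIdParam]}[Data],
        TargetLakehouse = Workspace{[lakehouseId = LakehouseIdParam]}[Data]
    in
        TargetLakehouse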

A few options to make this more flexible:

  • Use a variable library to define parameter values for each environment (Dev → Prod); it works well together with deployment pipelines: variable library

  • Integrate your Dataflow Gen2 with public parameters in a data pipeline, so the entire processing run can be automated.

This way, you gain both reusability and consistency across environments.

Hope this helps!

Best regards,

Antoine

v-achippa
Community Support

Hi @YashnaChinta,

Thank you for reaching out to the Microsoft Fabric Community.

Thank you @shashiPaul1570_ and @tayloramy for the prompt responses.

As we haven't heard back from you, we wanted to kindly follow up and check whether the solutions the other users provided worked for you, or let us know if you need any further assistance.

Thanks and regards,

Anjan Kumar Chippa

Hi @YashnaChinta,

Thank you @AntoineW for the response.

As we haven't heard back from you, we wanted to kindly follow up and check whether the solution the user provided worked for you, or let us know if you need any further assistance.

Thanks and regards,

Anjan Kumar Chippa

tayloramy
Skilled Sharer


Hi @YashnaChinta,

Deployment pipelines don’t currently have a rule that automatically remaps a Dataflow Gen2 destination to a different Lakehouse name when moving from Dev -> Test -> Prod. Pipelines will copy the item and keep the same destination if the Lakehouse name is consistent across environments, but there isn’t a way to swap it during deployment.

You have a few options:

Keep Lakehouse names consistent across stages - if the Lakehouse items in Dev, Test, and Prod share the same name (just in different workspaces), the pipeline can pair them directly without manual edits.

Use Dataflow Gen2 CI/CD & Git integration - Dataflow Gen2 now supports CI/CD and Git deployment pipelines, but the destination (Lakehouse or Warehouse) still has to be set correctly in each environment if the names differ.

Automate a post-deploy update - if standardizing names isn't possible, you can script a post-deployment step with the Fabric REST APIs or PowerShell to retarget the Dataflow Gen2 destination to the right Lakehouse in Prod (the sketch after this list shows what such a script would need to rewrite). This is a common workaround until Microsoft extends deployment rules to cover destinations.

Manual retargeting (last resort) - open the deployed Dataflow Gen2 in the Prod workspace and manually re-select the Lakehouse destination if nothing else is feasible.
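
For context, a Lakehouse destination inside a Dataflow Gen2 mashup ultimately boils down to an M navigation over hard-coded identifiers, roughly like the sketch below (the query name, table name, and GUIDs are placeholders, not taken from any real item); a post-deployment script would rewrite the workspaceId and lakehouseId values to the Prod targets:

    // minimal sketch of how a Lakehouse destination can appear in an
    // exported Dataflow Gen2 mashup; every name and GUID below is a
    // placeholder that a post-deploy script would replace
    shared Orders_DataDestination = let
        Source = Lakehouse.Contents(null),
        // hard-coded Dev identifiers - the values a script must retarget
        WorkspaceNav = Source{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
        LakehouseNav = WorkspaceNav{[lakehouseId = "11111111-1111-1111-1111-111111111111"]}[Data],
        TableNav = LakehouseNav{[Id = "Orders", ItemKind = "Table"]}[Data]
    in
        TableNav;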



 

If you found this helpful, consider giving kudos. If I answered your question or solved your problem, mark this post as the solution. 

shashiPaul1570_
Resolver III

Hi @YashnaChinta,

Great question — this is a common scenario with Dataflow Gen2 in Fabric deployment pipelines. By default, the dataflow carries over the Dev Lakehouse destination when promoted, so you’ll need to configure a rule to swap it for Prod.

You have a few options:

  1. Deployment Rules (Recommended)

    • In your deployment pipeline, set up a rule so that when you promote, the Lakehouse destination in the dataflow is replaced automatically.

    • Example: Dev → DevLakehouse, Prod → ProdLakehouse.

  2. Parameterize the Lakehouse

    • In Dataflow Gen2, define a parameter (e.g., LakehouseName) for the destination.

    • Override the parameter in each environment via deployment pipeline rules (see the sketch after this list for how such a parameter is declared).

  3. Consistent Naming

    • If your Lakehouses have the same name in Dev/Test/Prod (e.g., all called MainLakehouse but in different workspaces), then the deployment pipeline will map correctly without extra rules.

👉 This way, you won’t need to manually edit the dataflow after deployment — the pipeline ensures it always points to the right Lakehouse for that environment.
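
As a rough illustration of option 2: a required text parameter in Power Query M is declared with a meta record along the lines below. The parameter name and default value are illustrative, and the meta fields mirror what the Power Query editor typically generates, so treat this as a sketch rather than canonical syntax:

    // minimal sketch: a text parameter that deployment pipeline rules
    // (or a variable library) would override per environment; the name
    // "LakehouseName" and its default value are illustrative only
    shared LakehouseName = "DevLakehouse" meta [
        IsParameterQuery = true,
        Type = "Text",
        IsParameterQueryRequired = true
    ];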

For more information, you can also follow these links:
https://www.red-gate.com/simple-talk/blogs/microsoft-fabric-moving-dataflows-gen-2-to-different-work...

https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/create-rules

Hope this helps 🙌. If it does, please give a Kudos 👍 and mark it as Accepted Solution so others can benefit.

Best regards,
Shashi Paul
