
mikesmall
Advocate I

Gen2 dataflows and deployment pipelines

Hi

 

We are looking to move from legacy dataflows to Gen2, but in my testing I can't see any Gen2 dataflows in the deployment pipeline. 

 

I wonder if it is because lakehouses are not supported yet: Gen2 dataflows require staging in order to be published, so they need the system-generated staging lakehouse. I didn't set a destination in the dataflow, to keep testing simple. I also tried recreating the dataflows in Dev and Test, hoping that if the same dataflow existed in both it might then deploy (as the staging lakehouses had been created), but I still can't see any Gen2 artifacts in the pipeline.

 

Has anyone managed to use deployment pipelines with Gen2 dataflows?

 

Thanks.

1 ACCEPTED SOLUTION
Poojara_D12
Super User

Hi @mikesmall 

As of now, Dataflows Gen2 are not fully supported in Power BI deployment pipelines, which is likely why you're not seeing them show up in your pipeline stages during testing. Deployment pipelines currently focus on semantic models, reports, and dashboards, and do not automatically include Gen2 dataflows, especially those not tied to a specific destination such as a Lakehouse or Warehouse. Since Gen2 dataflows rely on Microsoft Fabric's OneLake infrastructure, their integration with deployment pipelines is still evolving and subject to future roadmap updates.

Furthermore, Gen2 dataflows often require a staging destination (usually a Lakehouse or Warehouse) to properly materialize and store their output; this is not just a convenience, it's essential to how they function and persist data. If you left the destination unset for simplicity, the dataflow wouldn't create or associate with any deployable artifact (like a Lakehouse table), meaning there's nothing concrete for the pipeline to pick up or compare across environments.

In practice, if you're testing Gen2 dataflows and want to prepare for future deployment pipeline integration, it's recommended to assign an explicit destination (e.g., a Lakehouse in your Fabric workspace). Also, monitor the Fabric release roadmap or official Microsoft documentation closely, as support for deploying Gen2 dataflows through pipelines is a highly requested feature and may be included in future updates.

In the meantime, deployment of Gen2 dataflows across environments typically needs to be done manually (e.g., exporting and importing definitions), or automated using APIs or Fabric CLI once such tools support the feature.
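If you go down the API route, here is a minimal sketch in Python of the export half, using the Power BI REST API's "Get Dataflow" endpoint. Note the hedges: this endpoint is documented for classic dataflows, so whether it returns Gen2 definitions is something to verify, and the token and GUIDs below are placeholders.

```python
import json
import requests

# Placeholders: supply an Azure AD access token with the
# Dataflow.Read.All scope, plus the workspace and dataflow GUIDs.
TOKEN = "<aad-access-token>"
WORKSPACE_ID = "<source-workspace-guid>"
DATAFLOW_ID = "<dataflow-guid>"

headers = {"Authorization": f"Bearer {TOKEN}"}

# "Get Dataflow" returns the dataflow definition (model.json).
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/dataflows/{DATAFLOW_ID}"
)
resp = requests.get(url, headers=headers)
resp.raise_for_status()

# Save the definition so it can be imported into the target workspace.
with open("model.json", "w", encoding="utf-8") as f:
    json.dump(resp.json(), f, indent=2)
```

The saved model.json can then be imported into the target workspace; for Gen2 specifically, keep an eye on the Fabric REST APIs for an equivalent definition endpoint.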

 


Kind Regards,
Poojara - Proud to be a Super User
Data Analyst | MSBI Developer | Power BI Consultant
Consider subscribing to my YouTube channel for beginner/advanced concepts: https://youtube.com/@biconcepts?si=04iw9SYI2HN80HKS


16 REPLIES

Thanks Poojara.

 

Since that post I have started using Pipelines to extract data into a bronze Lakehouse and Notebooks to transform data; even API calls are done in Notebooks wherever possible. I don't think I actually use any Dataflows anymore.

 

I do have some on-prem API calls that need to use a Gateway, so they can't be done in a Notebook (I need to vote for Notebooks being able to use Gateways!), but I can still run them as a copy job in a Pipeline.
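For anyone weighing the same switch, here is a minimal sketch of what one of those transform Notebooks can look like (PySpark); all Lakehouse, table, and column names are invented for illustration.

```python
# Bronze -> silver transform sketch. In a Fabric notebook `spark` is
# predefined; the builder line also makes this runnable elsewhere.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read the raw table that the Pipeline copy activity landed in the
# bronze Lakehouse (hypothetical names).
bronze_df = spark.read.table("BronzeLakehouse.sales_raw")

# Representative cleanup: enforce types, drop duplicate orders, and
# stamp the load time.
silver_df = (
    bronze_df
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
    .withColumn("_loaded_at", F.current_timestamp())
)

# Persist the result as a Delta table in the silver Lakehouse.
silver_df.write.mode("overwrite").saveAsTable("SilverLakehouse.sales")
```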

mrozzano
Advocate I

Good news. Dataflows Gen2 with CI/CD (preview) are now supported in deployment pipelines.

 

"Dataflow Gen2 now supports Continuous Integration/Continuous Deployment (CI/CD) and Git integration [...] Additionally, you can use the deployment pipelines feature to automate the deployment of dataflows from your workspace to other workspaces. [...]"

 

See Dataflow Gen2 with CI/CD and Git integration - Microsoft Fabric | Microsoft Learn
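For what it's worth, that kind of deployment can also be scripted. Below is a minimal sketch (Python) that triggers a stage-to-stage promotion via the Power BI deployment pipelines REST API's "Deploy All" endpoint; the token and pipeline GUID are placeholders, and whether Gen2 dataflows come along for the ride depends on the CI/CD preview support described above.

```python
import requests

# Placeholders: an Azure AD token with the Pipeline.Deploy scope and
# the deployment pipeline's GUID.
TOKEN = "<aad-access-token>"
PIPELINE_ID = "<deployment-pipeline-guid>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
}

# "Deploy All" promotes everything from one stage to the next,
# e.g. Development (stage order 0) to Test (stage order 1).
url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll"
body = {
    "sourceStageOrder": 0,
    "options": {
        "allowCreateArtifact": True,
        "allowOverwriteArtifact": True,
    },
}

resp = requests.post(url, headers=headers, json=body)
resp.raise_for_status()
# Deployment is a long-running operation; a 202 means it was accepted.
print("Deployment started:", resp.status_code)
```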

Hi

Are there deployment pipeline stage rules (Test/Prod) to modify source or target connections, and to modify Dataflow Gen2 parameters?

How about:

- copy job changes
- Power BI paginated report source changes
- shortcut target changes

in deployment pipelines?

Data Factory pipelines can use the new variable libraries (in preview).

Just a shame you can't refresh these CI/CD flows from a pipeline yet... pretty limiting when you have two dozen flows with all kinds of dependencies (which is exactly what pipelines are so neat for).

Joelon
New Member

Is there any alternative to overcome this shortcoming?

abpgupta
Continued Contributor

I am wondering, then, what the approach will be for separating Development from Production for all related objects. Some managed with deployment pipelines and some with Git integration?

cyibbs
Frequent Visitor

Did you ever figure this out? I'm in the same boat. Trying to transition to Gen2 DFs and just realized I can't see them in my existing deployment pipeline. 😕

As of June 2024, Gen2 dataflows are still not supported in deployment pipelines. I also cannot find any mention of this on a roadmap.


Can anyone confirm when this feature is expected? We cannot take a system into production that uses Gen2 flows without deployment pipeline support.


Thanks!

Second this - could not recommend moving to production until there is full support for deployment (one way or another) for DFG2.

Just adding this from Reddit, where the same question was asked: https://www.reddit.com/r/MicrosoftFabric/comments/1dogga0/dataflows_gen2_with_deployment_pipelines/

TL;DR: Gen2 dataflows with deployment pipelines aren't in the current roadmap, which goes up until October. Check back then for the next six-month roadmap (which will hopefully include this feature!).

I was in some Microsoft Azure training on Tuesday and asked the question; they confirmed it was not possible and that there was no timeline. This seems to be a massive oversight and a real barrier to moving to Fabric.

Exactly the same situation here. Instead of the Dataflow Gen2, the deployment pipeline shows the lakehouse used by the DF Gen2 for staging purposes:

[screenshot: alfBI_0-1700381752061.png]

 

What is more, when we recreated the deployment pipeline, the following error message appeared:

[screenshot: alfBI_0-1700382496674.png]

So it seems clear that DF Gen2 is not supported yet (no idea if it will be available after the end of the preview phase).

No, my guess is this is something they are working on for GA.

 

There is obviously an issue with the auto-generated staging lakehouses for Gen2 dataflows (e.g. DataflowsStagingLakehouse), as they shouldn't be visible, so I suspect that could be the reason the artifacts can't be deployed with a pipeline yet.

abpgupta
Continued Contributor

This is still not available after GA.  

I have had a look at the roadmap, and in Q2 2024 they are going to provide Git integration for dataflows; I wonder if this is the route they will go down for deploying content. It would be useful to know whether including Gen2 DFs in deployment pipelines was even a consideration, though.
