
mebg123
Helper I

Dataflow gen2 deployment best practices

Hello experts, I'm having a little trouble figuring out how to properly deploy a Dataflow Gen2 item using Deployment Pipelines. I know it is currently unsupported by Azure DevOps, GitHub, and Fabric's own Deployment Pipelines alike, so I wanted to know if there are any best practices for deploying these items.

 

I've tried using Power Query templates, but that only gets me halfway, because the Data Destination set in the queries is lost. This creates potential for human error when pointing the queries back at the correct destination.

 

Also, when deploying a pipeline, say from Dev to Test, all the Dataflow Gen2 items it calls still reference the previous workspace (in this case Development), which means more manual work. Is there a way to define the workspace dynamically, i.e. to make the pipeline invoke the dataflow in the current workspace?

 

Best regards

1 ACCEPTED SOLUTION
OktayPamuk80
Resolver II

Hi,

As far as I know, no, it is not possible yet, but at FabCon they said Dataflow Gen2 support will come by the end of the year.

I suppose native support will come within the next 2 months. The question is: can you work directly in Prod, skipping DEV and Test, and later reverse engineer the whole thing?

Regards,

Oktay

 

Did I answer your question? Then please mark my post as the solution.

If I helped you, click on the Thumbs Up to give Kudos.


6 REPLIES
OktayPamuk80
Resolver II

I'm afraid we need to wait and see. Try to stick to notebooks, with the Data Wrangler capabilities now available for Python as well (before, it was only for pandas). There you have far more capabilities.
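As a hedged sketch of the notebook alternative mentioned above: a transformation that would otherwise be authored as a Power Query step in a Dataflow Gen2 can be written in plain Python/pandas in a Fabric notebook. The table, column names, and filter below are illustrative assumptions, not from this thread:

```python
import pandas as pd

# Illustrative stand-in for a lakehouse table; in a real Fabric notebook
# you would typically load data with spark.read or a connector instead.
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "amount": [120.0, 35.5, 980.0, 15.0],
    "region": ["EU", "US", "EU", "APAC"],
})

# The kind of filter + aggregate step that would otherwise live in a
# Dataflow Gen2 query: total order amount for one region.
eu_total = orders.loc[orders["region"] == "EU", "amount"].sum()
print(eu_total)  # 1100.0
```

Because this is ordinary code, it versions cleanly in Git and deploys through pipelines without the destination-loss issue described above.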

OktayPamuk80
Resolver II

Hi,

The way we did it, to have a dynamic data source based on DEV, TEST, or PROD, is the following:

- Use a Switch activity and, as its expression, use the workspace ID (delivered by @pipeline().DataFactory)

- Create a case for each of DEV, TEST, and PROD, with the corresponding workspace ID to check

- As the activity in each case, run the same processing, but connected to a different source

 

You can create this in DEV, and when tested in TEST and PROD it should access the corresponding sources. Not a nice way, but it works.
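The steps above can be sketched as a Switch activity in the pipeline definition. This is a minimal, hedged sketch, not a definitive definition: the workspace GUIDs are placeholders, the activity names are made up, and the inner activity type for invoking a Dataflow Gen2 is an assumption about how your pipeline is set up.

```json
{
  "name": "RunDataflowForEnvironment",
  "type": "Switch",
  "typeProperties": {
    "on": {
      "value": "@pipeline().DataFactory",
      "type": "Expression"
    },
    "cases": [
      {
        "value": "<dev-workspace-guid>",
        "activities": [
          { "name": "Run DEV dataflow", "type": "RefreshDataflow" }
        ]
      },
      {
        "value": "<test-workspace-guid>",
        "activities": [
          { "name": "Run TEST dataflow", "type": "RefreshDataflow" }
        ]
      },
      {
        "value": "<prod-workspace-guid>",
        "activities": [
          { "name": "Run PROD dataflow", "type": "RefreshDataflow" }
        ]
      }
    ],
    "defaultActivities": []
  }
}
```

Each case points at a dataflow whose source and destination are already wired for that environment, so only the case matching the current workspace ID runs.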

Regards,
Oktay

 


So to do this, you insert dynamic content in the pipeline that resolves the current workspace ID. But how do you do it for the dataflow's ID? Because when you go from DEV to TEST, you'll need to create a new dataflow, hence a new (and unknown) dataflow ID.

No, in DEV you already create all three dataflows, for dev, test, and prod. In the pipeline you then use a Switch to execute the relevant one. For example, you create a Switch in the pipeline and check the workspace ID: if it matches dev, it runs the dataflow that has the dev source and target in it. The same goes for test and prod.

The workspace ID you can get from the URL when you select a workspace (try dev and test; you can see that only the segment with the GUID changes).
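For reference, the workspace GUID appears as a path segment in the browser URL when a workspace is open. The placeholder below stands in for the actual GUID, and the exact host and trailing path may differ in your tenant:

```
https://app.fabric.microsoft.com/groups/<workspace-id>/...
```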

 

OktayPamuk80
Resolver II

Hi,

As far as I know, no, it is not possible yet, but at FabCon they said Dataflow Gen2 support will come by the end of the year.

I suppose native support will come within the next 2 months. The question is: can you work directly in Prod, skipping DEV and Test, and later reverse engineer the whole thing?

Regards,

Oktay

 


Thanks! That's a relief to know they're adding support for Dataflows Gen2.

 

Also, is there any best practice to be aware of when calling dataflows inside a pipeline? When deploying to another workspace, the workspace referenced in the dataflow activity stays the same. Is there a way to change it dynamically so it references the current workspace?

 

Thanks!
