Matthew_G
Frequent Visitor

Dynamic pipelines to use across workspaces

Hi fellow Fabricators,

 

The set-up:

Currently you can't use a deployment pipeline to deploy data pipelines. We have three workspaces, essentially dev, test, and prod; we set it up this way to make use of the deployment pipeline, but we are running into some issues.

I have a data pipeline that executes a Notebook and then runs some stored procedures to process data from the Lakehouse into the Warehouse. Each workspace has its own Lakehouse and Warehouse with the same names.

 

The challenge:

I don't want to re-create the pipeline multiple times, so I thought of creating it once and using variables/scoped variables instead.

 

For a Notebook it's easy enough to use a variable; I know I could use @pipeline().DataFactory for the Workspace object ID instead. In any case this seems to work fine.
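For reference, the idea above can be sketched as plain data: the workspace reference in the Notebook activity becomes the system variable @pipeline().DataFactory, which resolves at run time to the object ID of the workspace the pipeline is running in, so one definition works unchanged across dev/test/prod. The property and activity names below are illustrative assumptions, not a confirmed Fabric schema.

```python
# Hedged sketch (assumed property names, not the exact Fabric pipeline
# schema): a Notebook activity whose workspace reference is the dynamic
# expression @pipeline().DataFactory instead of a hard-coded GUID.
notebook_activity = {
    "name": "ProcessLakehouseData",  # illustrative activity name
    "type": "TridentNotebook",       # assumed activity type name
    "typeProperties": {
        "workspaceId": {
            "value": "@pipeline().DataFactory",  # resolves per workspace
            "type": "Expression",
        },
    },
}
```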

[Screenshot: Matthew_G_0-1701941696651.png]

 

For a stored procedure, however, I would need the Warehouse ID, and I'm not sure how to get that.

[Screenshot: Matthew_G_1-1701941868604.png]

 

Has anyone found a solution to this? Or is there a better way to apply changes across workspaces?

 

Thanks!

Regards,

Matt

 

2 ACCEPTED SOLUTIONS
v-nikhilan-msft
Community Support

Hi @Matthew_G 
Thanks for using Fabric Community.
You need to select the "Add dynamic content" option within the drop-down and pass in the GUID.

[Screenshot: vnikhilanmsft_4-1701967261838.png]

 


You need to create a new parameter for the stored procedure activity.
The parameter value should be the Lakehouse/Data Warehouse/KQL Database object ID. To get this ID, open your Lakehouse/Data Warehouse/KQL Database in your workspace; the ID is the segment after /lakehouses/, /datawarehouses/, or /databases/ in the URL.

 

[Screenshot: vnikhilanmsft_3-1701967205140.png]

 


Please refer to this link for more information: Link1
Hope this helps. Please let me know if you have any further questions.


Thanks, it looks a bit different for the Stored Procedure activity.

When you add dynamic content for the Warehouse, it also requests a SQL connection string.

I then added two parameters, the connection string and the warehouse GUID, and set them as dynamic content.

This now works.

 

[Screenshot: Matthew_G_1-1702028794775.png]
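To summarise the accepted setup as plain data: two pipeline parameters (the warehouse object ID and the SQL connection string), each referenced in the Stored Procedure activity as a dynamic-content expression. The property names below are illustrative assumptions, not a confirmed Fabric schema; the pattern is just "nothing workspace-specific is hard-coded".

```python
# Hedged sketch only: property names are assumptions, not the exact
# Fabric pipeline schema. Both values are supplied per workspace at
# run time via pipeline parameters.
pipeline_parameters = {
    "WarehouseId": {"type": "String", "defaultValue": ""},
    "SqlConnectionString": {"type": "String", "defaultValue": ""},
}

# The Stored Procedure activity references both as expressions:
stored_proc_settings = {
    "warehouseId": {
        "value": "@pipeline().parameters.WarehouseId",
        "type": "Expression",
    },
    "sqlConnectionString": {
        "value": "@pipeline().parameters.SqlConnectionString",
        "type": "Expression",
    },
}
```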

 

 

 


9 REPLIES
JerryPan
New Member

@Matthew_G , @v-nikhilan-msft ,

I added the same parameters (Warehouse ID, SQL connection string) and used them in a Stored Procedure activity. It works well in the current Dev workspace, but when I use a release pipeline to deploy it to the next-stage workspace (UAT), the parameters still point to the Dev workspace warehouse. How can we change these parameters during deployment?

 

Thanks

 

jerry

 

Hi Jerry, yes, you will need to change the parameters in each pipeline, and each time you deploy they will be overwritten. Unfortunately, even in the deployment pipeline you can't set the parameters. What you could do is have two pipelines in each workspace, where pipeline 1 kicks off pipeline 2 and passes the parameters. That way, when you deploy changes you can just deploy pipeline 2 and it shouldn't break anything. I'm sure there is a better solution out there.
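The two-pipeline workaround above can be sketched as follows: a small, rarely-redeployed pipeline 1 in each workspace holds the environment-specific values and invokes pipeline 2 (the one you keep deploying) with them. All names, GUIDs, and endpoints here are made-up placeholders.

```python
# Sketch of the workaround: pipeline 1 lives in each workspace and is
# rarely redeployed; it carries the environment-specific values and
# passes them to pipeline 2. All values below are placeholders.
ENV_PARAMS = {
    "dev":  {"WarehouseId": "dev-warehouse-guid",
             "SqlConnectionString": "dev-sql-endpoint"},
    "test": {"WarehouseId": "test-warehouse-guid",
             "SqlConnectionString": "test-sql-endpoint"},
    "prod": {"WarehouseId": "prod-warehouse-guid",
             "SqlConnectionString": "prod-sql-endpoint"},
}

def invoke_parameters(env: str) -> dict:
    """Parameters that pipeline 1 in workspace `env` would pass when
    invoking pipeline 2, so redeploying pipeline 2 never overwrites them."""
    return ENV_PARAMS[env]
```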

HappyFabric
Employee

 

I would like to do something similar with referencing my local lakehouse. I want to have lakehouses of the same name in each of my testing, staging, and prod workspaces. How can I reference the lakehouse named "My_Lakehouse" in the local workspace? Something like My_Lakehouse.@pipeline().DataFactory?

Found an example for Lakehouse here: Parameters - Microsoft Fabric | Microsoft Learn

paul_hasell
New Member

Hi, have you found a way to handle upgrades to the pipeline(s)? Although you can copy across workspaces, you can't overwrite an existing pipeline, so with this, and not being able to 'release' Dataflows, we're currently sticking with ADF/Synapse for all but the simplest requirements (we have some customer integrations with >50 complex pipelines).

Unfortunately, not really. For the moment I make the changes to the pipeline in dev, then rename/delete the previous version in the test etc. environments, save it as the new pipeline, and update the variables in the pipeline. So there is still quite a bit of admin needed. Ideally, it would be great if you could deploy data pipelines, or at least have the ability to create workspace environment variables and use them in the pipelines.

I think it's going to get caught in the trap of referencing by ID/GUID rather than by name, so there will always be an issue when moving between workspaces. They may need to take a leaf out of the Power Apps book and have some kind of 'solution' package, so that related items are managed together to maintain referential 'integrity'.

