
Aj_Singh
Regular Visitor

Invoke Pipeline (Preview) - Managing Pipeline IDs Across Workspaces

Scenario:

We have a master pipeline in Microsoft Fabric that dynamically calls worker pipelines by passing the Pipeline ID (GUID), which is retrieved from a SQL Server database. This setup works fine when running within the same workspace. I am using the Invoke Pipeline (Preview) activity in the master pipeline to call the worker pipeline, and it uses dynamic content to pass the correct Pipeline ID (GUID):

[Screenshot: MasterPipeline.JPG]
Issue:

When the changes are propagated to other workspaces via Git, the Object ID (GUID) of each worker pipeline changes, causing the master pipeline to fail because the stored ID no longer matches the worker pipeline in the new workspace.

 

Concerns:

  • The Pipeline ID changes when worker pipelines move to a different workspace.
  • We need a way to dynamically pass the correct Pipeline ID for each workspace without hardcoding or constant manual updates.

How can we dynamically resolve and pass the correct Pipeline ID for worker pipelines across different workspaces? Are there best practices or automated ways to fetch Pipeline IDs dynamically (via API, Azure DevOps, etc.) for cross-workspace invocations?

 

1 ACCEPTED SOLUTION
Aj_Singh
Regular Visitor

Thanks for the input. I applied a workaround: I capture the workspace and pipeline details using the REST API, store that information in a backend table, and use it to pass the correct ID to the Invoke Pipeline (Preview) activity dynamically.
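A minimal sketch of that workaround, assuming the Fabric "List Items" REST API (`GET .../v1/workspaces/{workspaceId}/items?type=DataPipeline`). The authentication, the HTTP call itself, and the backend-table write are out of scope here, so only the response-parsing step is shown; all names and IDs below are made up for illustration:

```python
# Sketch of the workaround described above:
#   1. Call the Fabric REST API to list the data pipelines in a workspace.
#   2. Store the name -> ID mapping in a backend table.
#   3. The master pipeline looks the worker's ID up from that table at run time.

def build_pipeline_lookup(items_payload: dict) -> dict:
    """Map pipeline displayName -> item id from a Fabric 'List Items' response.

    The payload shape follows
    GET https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items?type=DataPipeline
    which returns {"value": [{"id": ..., "displayName": ..., "type": ...}, ...]}.
    """
    return {
        item["displayName"]: item["id"]
        for item in items_payload.get("value", [])
        if item.get("type") == "DataPipeline"
    }

# Example response fragment (hypothetical names and IDs):
sample = {
    "value": [
        {"id": "1111-aaaa", "displayName": "WorkerPipeline_Sales", "type": "DataPipeline"},
        {"id": "3333-cccc", "displayName": "SomeLakehouse", "type": "Lakehouse"},
    ]
}

lookup = build_pipeline_lookup(sample)
# lookup["WorkerPipeline_Sales"] == "1111-aaaa"; the Lakehouse item is filtered out.
```

Each workspace would run this against its own workspace ID, so the backend table always holds the IDs that are valid in that workspace.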

View solution in original post

3 REPLIES
Anonymous
Not applicable

Hi @Aj_Singh ,

 

Thanks for the solution idea, it's valuable.

 

Best regards,

Adamk Kong

spencer_sa
Super User
Super User

I can think of two or three ways of maintaining pipeline IDs across workspaces; this is very similar to the problem of making sure lakehouse IDs move properly during deployments when you have DEV, TEST, and PROD environments.

 

  1. Probably the worst case from a maintenance point of view: have a table in each workspace that holds lookups between pipeline name and pipeline ID. Create a sub-pipeline that takes a pipeline name as a parameter and returns a pipeline ID. The table could be periodically populated using API calls similar to method 2, without the filtering, with the output written to a table.
  2. Have a notebook that uses the sempy_labs library, specifically the function sempy_labs.admin.list_items(), to get a list of items, and filter on Type = DataPipelines. Have the notebook take the pipeline name as a parameter and output the pipeline ID as its exitValue.

    [Screenshot: spencer_sa_1-1730974780531.png]

    https://github.com/microsoft/semantic-link-labs

  3. Same as 2, but write the API calls yourself (mostly for when you can't install sempy_labs). I won't demo this here, but sempy_labs just wraps these API calls, so it works the same way.
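Method 2 above can be sketched roughly as follows. This is a sketch under assumptions: sempy_labs.admin.list_items() is assumed to return a pandas DataFrame, but the exact column names (and the exact Type value, "DataPipeline" vs "DataPipelines") depend on your sempy_labs version, so check them in your environment. Since sempy_labs and mssparkutils only run inside a Fabric notebook, the lookup is factored into a plain-pandas function and the Fabric-only calls appear as comments; the demo DataFrame at the bottom uses made-up names and IDs:

```python
import pandas as pd

def resolve_pipeline_id(items_df: pd.DataFrame, pipeline_name: str) -> str:
    """Return the ID of the data pipeline whose name matches pipeline_name.

    Assumes items_df has 'Item Name', 'Type', and 'Item Id' columns,
    approximately what sempy_labs.admin.list_items() returns; verify the
    actual column names in your sempy_labs version.
    """
    match = items_df[
        (items_df["Type"] == "DataPipeline")
        & (items_df["Item Name"] == pipeline_name)
    ]
    if match.empty:
        raise ValueError(f"No DataPipeline named {pipeline_name!r} found")
    return match["Item Id"].iloc[0]

# Inside a Fabric notebook the flow would look like this (not runnable outside Fabric):
#   import sempy_labs
#   items = sempy_labs.admin.list_items()
#   pipeline_id = resolve_pipeline_id(items, pipeline_name)  # name passed in as a parameter
#   mssparkutils.notebook.exit(pipeline_id)                  # surfaces as the exitValue

# Standalone demonstration with a stand-in DataFrame:
items = pd.DataFrame(
    {
        "Item Name": ["WorkerPipeline_Sales", "SomeLakehouse"],
        "Type": ["DataPipeline", "Lakehouse"],
        "Item Id": ["1111-aaaa", "3333-cccc"],
    }
)
```

The master pipeline would then read the notebook's exitValue and feed it to the Invoke Pipeline (Preview) activity as dynamic content.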
