Hi Fabric Community,
I’m working on a real-time orchestration scenario where I need to run multiple pipelines (e.g., Pipeline A → B → C) only if the previous one succeeds. If any pipeline fails, the chain should stop automatically.
Constraints:
Hi @Suryyyaaaa ,
Thanks for reaching out to the Microsoft Fabric community forum.
You can use the Invoke pipeline activity to execute another Microsoft Fabric pipeline, which lets you orchestrate one or more pipelines from within a single parent pipeline.
With that architecture, pipeline 2 will run only once pipeline 1 succeeds, and pipeline 3 will run only when pipelines 1 and 2 have both succeeded.
Invoke pipeline activity - Microsoft Fabric | Microsoft Learn
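For illustration, chaining two Invoke pipeline activities with an on-success connector looks roughly like this in the parent pipeline's definition JSON (a minimal sketch; the activity names are placeholders and the exact `type` string may differ between the Fabric and ADF flavors of the activity):

```json
{
  "activities": [
    {
      "name": "Invoke Pipeline 1",
      "type": "InvokePipeline"
    },
    {
      "name": "Invoke Pipeline 2",
      "type": "InvokePipeline",
      "dependsOn": [
        {
          "activity": "Invoke Pipeline 1",
          "dependencyConditions": [ "Succeeded" ]
        }
      ]
    }
  ]
}
```

The `"Succeeded"` dependency condition is what stops the chain: if "Invoke Pipeline 1" fails, "Invoke Pipeline 2" is skipped rather than run.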
I hope this information helps. Please do let us know if you have any further queries.
Thank you
Hi @Suryyyaaaa
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.
Thank you.
What I need is for a pipeline to run automatically when the group 1 pipeline run succeeds, so I don't have to schedule anything manually. If I trigger the group 1 pipeline, the chain should proceed sequentially: group 1 passes, then group 2 runs, and so on. If any group's pipeline fails, the chain must stop. Please understand that and give me an answer.
Regarding your idea of invoking pipelines: as I understand it, with the Invoke pipeline concept, if I have three pipelines, each representing a group, pipeline 2 runs on *completion* of pipeline 1. What I need is for pipeline 2 to run only when pipeline 1 is *successful*. Do you see what I'm asking?
Hi @Suryyyaaaa ,
In the architecture I shared with you, pipeline 2 will run only when pipeline 1 succeeds, because we are using the on-success connector.
I hope this information helps. Please do let us know if you have any further queries.
Thank you
Hi @Suryyyaaaa,
You can achieve this by calling Fabric pipelines from ADF using a Web activity.
You can authenticate with either a Service Principal or the Managed Identity of ADF.
1. In the Fabric admin portal, allow service principals to call Fabric APIs.
2. Add the ADF managed identity/SPN as a Contributor in the Fabric workspace.
3. In ADF, use a Web activity and build the URL below with your workspaceId and pipelineId.
Authentication: System Assigned Managed Identity
Resource: https://api.fabric.microsoft.com/
4. After a wait, check the status of the Fabric pipeline run using the format below. (This is the web2 activity, which checks the status of the previous call.)
Flow:
web1->until_web2_gets_output(wait->web2)->onsuccess->web3(2nd pipeline)
You can trigger the next pipeline based on the output of the previous Web activity, which checks the status of the first pipeline run, and take advantage of ADF pipeline scheduling.
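The web1 → until(wait → web2) → on-success → web3 flow above can be sketched in Python against the Fabric Job Scheduler REST API (run-on-demand job instances). This is a sketch under assumptions: the token, workspace, and pipeline IDs are placeholders, and in ADF the same calls would live inside Web/Until activities rather than a script.

```python
import json
import time
import urllib.request

BASE = "https://api.fabric.microsoft.com/v1"


def run_pipeline(token: str, workspace_id: str, pipeline_id: str) -> str:
    """web1: start the pipeline as an on-demand job; returns the status-poll URL."""
    req = urllib.request.Request(
        f"{BASE}/workspaces/{workspace_id}/items/{pipeline_id}/jobs/instances"
        "?jobType=Pipeline",
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # 202 Accepted on success
        return resp.headers["Location"]        # job-instance URL to poll


def is_final(status: str) -> bool:
    """True once the job instance has reached a terminal state."""
    return status in ("Completed", "Failed", "Cancelled", "Deduped")


def wait_for_completion(token: str, poll_url: str, interval: int = 30) -> str:
    """until(wait -> web2): poll the job instance until it finishes."""
    while True:
        req = urllib.request.Request(
            poll_url, headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            status = json.load(resp)["status"]
        if is_final(status):
            return status
        time.sleep(interval)


# web3: run pipeline 2 only if pipeline 1 succeeded; otherwise the chain stops.
# status = wait_for_completion(token, run_pipeline(token, ws_id, pipeline1_id))
# if status == "Completed":
#     run_pipeline(token, ws_id, pipeline2_id)
```

The on-success gate is the final `if status == "Completed"` check: any other terminal status (Failed, Cancelled, Deduped) simply ends the chain, which matches the requirement that a failure stops everything downstream.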
Hope this helps. If you find this useful, do give a Kudos, and let me know if you have any doubts.
Please mark as answer if this solves your problem.
Thanks & regards,
Bharath Kumar S
Why do you have to do this? Is there a special reason to chain data pipelines (ETL)? Everything runs on a Linux operating system (Delta Lake or Fabric Data Factory).
In a Lakehouse, all ETL is performed through a single Delta table at a time, and it is a Linux system.
Fabric Data Factory = Fabric Data Warehouse = Semantic Data Model = Delta Lake (Lakehouse)
Also, it is only possible through scheduling. Without scheduling or a file-based trigger, it is not feasible.