bhavya5903
Advocate I

How to Call Another Pipeline on Failure and Store Error Details in Lakehouse?

Question:

I am working with Microsoft Fabric Data Pipelines and need to implement a failure-handling mechanism.


Scenario:

  • I have a main pipeline with multiple activities (e.g., Copy Data, Set Variable, etc.).
  • If any activity fails, I want to trigger another pipeline (FailureHandlerPipeline).
  • I need to pass failure details dynamically, including:
    • FailedActivityName
    • ErrorMessage
    • ErrorCode
    • PipelineName
  • The FailureHandlerPipeline should store these details in a Lakehouse table for logging and troubleshooting.

Challenges:

  1. How do I dynamically capture failure details from multiple activities and pass them to the failure-handling pipeline?
  2. What’s the best approach to store these details in Lakehouse using a low-code method?
  3. Can I handle multiple failures at the same time? If multiple activities fail, will the failure-handling pipeline execute multiple times, or is there a way to batch the errors?

I appreciate any guidance, best practices, or alternative approaches!

1 ACCEPTED SOLUTION
nilendraFabric
Community Champion

Hello @bhavya5903

Use an individual `Set Variable` activity on each activity's `Upon Failure` path to collect the error details:

• `@activity('ActivityName').error.message` (ErrorMessage)
• `@activity('ActivityName').error.errorCode` (ErrorCode)
• `@activity('ActivityName').name` (FailedActivityName)
• `@pipeline().Pipeline` (PipelineName)
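For example, the Set Variable dynamic content for one failure could use string interpolation to build a single error object (the activity name `Copy Data` is illustrative, not from the original post):

```json
{
  "FailedActivityName": "@{activity('Copy Data').name}",
  "ErrorMessage": "@{activity('Copy Data').error.message}",
  "ErrorCode": "@{activity('Copy Data').error.errorCode}",
  "PipelineName": "@{pipeline().Pipeline}"
}
```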

 

Add an Execute Pipeline activity on the main pipeline's `Upon Failure` path and pass the collected `errorDetails` array as a parameter to `FailureHandlerPipeline`.
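As a rough sketch (the activity name, variable name, and exact JSON shape are illustrative; check the actual pipeline definition in Fabric), the Execute Pipeline activity would map the variable onto the child pipeline's parameter like this:

```json
{
  "name": "InvokeFailureHandler",
  "type": "ExecutePipeline",
  "typeProperties": {
    "pipeline": { "referenceName": "FailureHandlerPipeline" },
    "parameters": {
      "errorDetails": "@variables('errorDetails')"
    }
  }
}
```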

In `FailureHandlerPipeline`, invoke a notebook that does something like this:

 

import json

# error_json: notebook parameter holding the errorDetails JSON passed from the pipeline (parameter name assumed)
error_batch = json.loads(error_json)
error_df = spark.createDataFrame(error_batch)
error_df.write.mode("append").saveAsTable("error_logs")
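Because `errorDetails` typically arrives in the notebook as a JSON string parameter, a small helper (names assumed, not from the original post) can normalize it into rows before `createDataFrame`. Accepting either a single object or a list also covers the case where several failed activities were batched into one invocation:

```python
import json

# Expected columns in the Lakehouse error_logs table
REQUIRED_FIELDS = ("FailedActivityName", "ErrorMessage", "ErrorCode", "PipelineName")

def parse_error_details(error_json: str) -> list:
    """Normalize the errorDetails pipeline parameter into rows for the Lakehouse table."""
    payload = json.loads(error_json)
    # Accept either a single error object or a list of them (batched failures)
    errors = payload if isinstance(payload, list) else [payload]
    # Keep only the expected columns; default missing values to empty strings
    return [{k: str(e.get(k, "")) for k in REQUIRED_FIELDS} for e in errors]
```

The resulting list of uniform dicts can be passed straight to `spark.createDataFrame`, which keeps the table schema stable even when some fields are missing from a given error payload.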

 

 

Hope this helps.

Thanks

4 REPLIES

Hi @bhavya5903 ,

Has the response from @nilendraFabric resolved your issue?

If it did, please consider marking the helpful reply as the accepted solution—this helps other community members with similar questions find answers more easily.

 

Thank you for being a valued member of the Microsoft Fabric Community Forum!

Hi @bhavya5903 ,

 

Has the answer posted by @nilendraFabric resolved the issue? If so, kindly mark the helpful reply as the solution, and feel free to share your own solution as well. More people will benefit from the thread.

 

Should you have any further questions, feel free to reach out.

Thank you for being a part of the Microsoft Fabric Community Forum!

