alejandroferre
Regular Visitor

Azure Databricks Job activity in Data Pipeline doesn't change status from Pending to Failed when the job run in Databricks finishes in status "Succeeded with failures"


1 ACCEPTED SOLUTION
v-csrikanth
Community Support

Hi @alejandroferre 

To make the pipeline recognize "Succeeded with failures" as a failure, you can add logic to check the Databricks job run's detailed status after the activity completes, and explicitly fail the pipeline if that status is "Succeeded with failures." Alternatively, you can make the Databricks job itself fail whenever one of its tasks fails, as described in the steps below.
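For reference, the "detailed status" mentioned above is exposed by the Jobs API 2.1 runs/get endpoint, which reports a result_state per task even when the overall run ends as "Succeeded with failures." Below is a minimal Python sketch of that pipeline-side check; it could run in whatever step your pipeline can invoke (a small notebook, a function, etc.), and the three environment variables are hypothetical stand-ins for however you pass in the workspace URL, a token, and the job run id from the activity output:

    import os
    import requests

    # Hypothetical inputs: supply these however your pipeline surfaces them.
    host = os.environ["DATABRICKS_HOST"]        # e.g. the workspace URL
    token = os.environ["DATABRICKS_TOKEN"]
    run_id = os.environ["DATABRICKS_RUN_ID"]    # job run id from the activity output

    resp = requests.get(
        f"{host}/api/2.1/jobs/runs/get",
        headers={"Authorization": f"Bearer {token}"},
        params={"run_id": run_id},
        timeout=30,
    )
    resp.raise_for_status()
    run = resp.json()

    # A run that ends "Succeeded with failures" still reports FAILED on the
    # individual tasks, so inspect each task's result_state.
    failed_tasks = [
        t["task_key"]
        for t in run.get("tasks", [])
        if t.get("state", {}).get("result_state") == "FAILED"
    ]
    if failed_tasks:
        raise Exception(f"Run {run_id} succeeded with failures: {failed_tasks}")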

Modify Databricks Job to Fail on Task Failures

 

  1. Add the following Python code to your Databricks job (e.g., in the last notebook/task) to check for task failures and fail the job:
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service.jobs import RunResultState

    w = WorkspaceClient()
    # Run id from the notebook context; in a multi-task job the parent
    # (job) run id may be needed for jobs.get_run instead.
    run_id = dbutils.notebook.entry_point.getDbutils().notebook().getContext().runId().get()
    run = w.jobs.get_run(run_id=run_id)
    # result_state is an enum in the SDK, so compare with RunResultState.FAILED.
    has_failures = any(
        task.state.result_state == RunResultState.FAILED for task in (run.tasks or [])
    )
    if has_failures:
        raise Exception("Job succeeded with failures.")
  2. Keep your existing Databricks activity (e.g., RunDatabricksJob) in the pipeline unchanged.
  3. Run the pipeline.
    • If the Databricks job run has any task failures, the code above raises an exception and the job run fails.
    • The pipeline then marks the activity as "Failed," and the pipeline run fails.
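
Note the trade-off between the two approaches: failing inside the job means every caller sees a hard failure, but the run will also show as Failed in the Databricks Jobs UI rather than "Succeeded with failures." If the job's own status must stay unchanged, use the pipeline-side check sketched above instead.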


If the above information is helpful, please give us Kudos and mark the response as the Accepted Solution.

Best Regards,
Community Support Team _ C Srikanth.

 


4 REPLIES
v-csrikanth
Community Support

Hi @alejandroferre 

We haven't heard from you since the last response and just wanted to check whether the provided solution worked for you. If it did, please mark it as the Accepted Solution so that others in the community can benefit.
Thank you.

If the above information is helpful, please give us Kudos and mark the response as the Accepted Solution.
Best Regards,
Community Support Team _ C Srikanth.

v-csrikanth
Community Support

Hi @alejandroferre 

It's been a while since I heard back from you, so I wanted to follow up. Have you had a chance to try the solutions that were offered?
If the issue has been resolved, could you mark the post as resolved? If you're still experiencing challenges, please feel free to let us know and we'll be happy to keep helping!
Looking forward to your reply!

Best Regards,
Community Support Team _ C Srikanth.

v-csrikanth
Community Support

Hi @alejandroferre 

I wanted to follow up since I haven't heard from you in a while. Have you had a chance to try the suggested solutions?
If your issue is resolved, please consider marking the post as solved. However, if you're still facing challenges, feel free to share the details, and we'll be happy to assist you further.
Looking forward to your response!

Best Regards,
Community Support Team _ C Srikanth.

