anawast
Microsoft Employee

mssparkutils.notebook.runMultiple doesn't return notebook status if a notebook fails

I would like to run multiple notebooks in parallel and collect the status of each notebook run. While using mssparkutils.notebook.runMultiple, as long as all notebooks succeed there are no issues.

[Screenshot: runMultiple output when all notebooks succeed]

However, if any one notebook fails, the entire command fails, so the results cannot be collected.

[Screenshots: the whole runMultiple command failing after one notebook fails]
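For reference, I'm invoking it roughly like this (a minimal sketch; the notebook names are the ones used later in this thread):

# Run both notebooks in parallel. If any of them fails, the whole
# runMultiple call throws, and no per-notebook statuses come back.
mssparkutils.notebook.runMultiple(["Notebook 2", "Notebook 8"])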


Is there a way to use mssparkutils.notebook.runMultiple and still collect the status of each notebook, even if some of the notebooks running in parallel fail?

P.S.: I would like to avoid wrapping all commands in the child notebooks in try/except blocks and calling notebook.exit on every exception.


4 REPLIES
anawast
Microsoft Employee
ACCEPTED SOLUTION

These don't run in parallel, but I guess I can multithread for that. Essentially, use something like what is documented here: https://learn.microsoft.com/en-us/azure/databricks/_extras/notebooks/source/parallel-notebooks.html
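A minimal sketch of that idea, assuming a ThreadPoolExecutor wrapped around mssparkutils.notebook.run (the timeout and notebook names are just examples):

from concurrent.futures import ThreadPoolExecutor

def run_notebook(notebook_path):
    # notebook.run raises if the child notebook fails, so catching here
    # lets every run report a status instead of killing the whole batch.
    try:
        exit_value = mssparkutils.notebook.run(notebook_path, 3600)
        return {"notebook": notebook_path, "status": "Success", "exit_value": exit_value}
    except Exception as e:
        return {"notebook": notebook_path, "status": "Failed", "error": str(e)}

notebooks = ["Notebook 2", "Notebook 8"]

# One worker per notebook, so the child notebooks run concurrently.
with ThreadPoolExecutor(max_workers=len(notebooks)) as pool:
    results = list(pool.map(run_notebook, notebooks))

for result in results:
    print(result)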


v-huijiey-msft
Community Support

Hi @anawast ,

 

Is this a workaround?

 

Can you accept your own answer as a solution? This would be a great help to people with similar problems.

 

Thank you for your cooperation.

 

Best Regards,
Yang
Community Support Team

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

v-huijiey-msft
Community Support

Hi @anawast ,

 

You can run each child notebook from the main notebook using mssparkutils.notebook.run and capture the status of each run in the main notebook.

 

The code is as follows:

def run_notebook(notebook_path):
    # notebook.run raises an exception if the child notebook fails,
    # so each run can report its own status.
    try:
        # The second argument is the timeout in seconds.
        mssparkutils.notebook.run(notebook_path, 3600)
        return {"notebook": notebook_path, "status": "Success"}
    except Exception as e:
        return {"notebook": notebook_path, "status": "Failed", "error": str(e)}

notebooks = ["Notebook 2", "Notebook 8"]
results = [run_notebook(nb) for nb in notebooks]

# Print the status of each run
for result in results:
    print(result)

 

Here is a screenshot of my successful run:

[Screenshot: both notebooks reporting Success]

 

I've modified the code of Notebook 8 slightly so that it fails. Here is a screenshot of one notebook running and reporting an error; click on Notebook 2 to see the result of the successful run.

[Screenshots: one notebook reporting an error while Notebook 2 succeeds]

 

If you have any other questions please feel free to contact me.

 

Best Regards,
Yang
Community Support Team

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

This will not run concurrently, will it?

The benefit of runMultiple is that it runs the notebooks in parallel.
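For reference, the DAG form of runMultiple is what gives that parallelism; a minimal sketch, assuming the activity fields documented for Fabric's mssparkutils:

dag = {
    "activities": [
        {
            "name": "Notebook 2",           # unique activity name
            "path": "Notebook 2",           # notebook to run
            "timeoutPerCellInSeconds": 600,
        },
        {
            "name": "Notebook 8",
            "path": "Notebook 8",
            "timeoutPerCellInSeconds": 600,
            # "dependencies": ["Notebook 2"],  # uncomment to force ordering
        },
    ]
}

# Activities with no dependencies between them are scheduled in parallel,
# but one failure still fails the whole call.
mssparkutils.notebook.runMultiple(dag)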
