AnushShetty
Frequent Visitor

Retrieving Detailed Notebook-Level Error Logs When Triggering Notebooks Directly via REST API / FAB CLI

We have started working on Microsoft Fabric for our enterprise solution.

 

At present, we are able to successfully trigger Fabric pipelines and notebooks (when wrapped within pipelines) using both the REST API and FAB CLI. Through these approaches, we can execute, monitor, and capture detailed error logs without any issues.

 

Challenge:

 

When triggering Fabric notebooks directly using the REST API or FAB CLI, we are able to start and monitor the execution. However, when a run fails, we only receive a generic job-instance-level error message: {'errorCode': 'JobInstanceStatusFailed', 'message': 'Job instance failed without detail error'}. The actual notebook-level error details are not returned.
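For reference, the direct trigger and poll we use looks roughly like the sketch below. The endpoint shape follows the Fabric on-demand item job API (jobType=RunNotebook); the workspace/item IDs and token acquisition are placeholders, so verify the details against the current REST docs.

```python
import json
import time
import urllib.request

API_BASE = "https://api.fabric.microsoft.com/v1"

def run_notebook_url(workspace_id: str, item_id: str) -> str:
    # On-demand job endpoint for a notebook item.
    return (f"{API_BASE}/workspaces/{workspace_id}/items/{item_id}"
            "/jobs/instances?jobType=RunNotebook")

def trigger_notebook(workspace_id: str, item_id: str, token: str) -> str:
    """Start the notebook job; the job-instance URL comes back in the Location header."""
    req = urllib.request.Request(
        run_notebook_url(workspace_id, item_id),
        method="POST",
        data=b"",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # expect 202 Accepted
        return resp.headers["Location"]

def wait_for_job(instance_url: str, token: str, interval_s: int = 10) -> dict:
    """Poll the job instance until it reaches a terminal state. On failure,
    only the generic job-level error quoted above is available here."""
    while True:
        req = urllib.request.Request(
            instance_url, headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            state = json.load(resp)
        if state.get("status") in ("Completed", "Failed", "Cancelled"):
            return state  # state.get("failureReason") carries the generic error
        time.sleep(interval_s)
```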

 

Is there currently a supported way to retrieve detailed notebook-level error logs when executing notebooks directly via REST API or FAB CLI?

Or is this a known limitation when notebooks are triggered outside of a pipeline context?

Any guidance or documentation references would be greatly appreciated.


Workaround Implemented:
As a workaround, we modified the notebook to explicitly capture and store detailed error logs into a Lakehouse location.

We then accessed these logs using Storage APIs, and this approach is working for us.

However, this feels like a custom solution rather than a native capability.
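The workaround is essentially a try/except wrapper in the notebook that persists the full traceback to a Files path in the attached Lakehouse. A minimal sketch follows; the directory name and record schema are our own choices, not a Fabric convention.

```python
import json
import traceback
from datetime import datetime, timezone
from pathlib import Path

# In a Fabric notebook the default Lakehouse is mounted at /lakehouse/default;
# the subfolder name here is illustrative and configurable.
LOG_DIR = Path("/lakehouse/default/Files/notebook_error_logs")

def write_error_log(exc: BaseException, run_id: str, log_dir: Path = LOG_DIR) -> Path:
    """Persist a structured error record so it can later be read via Storage APIs."""
    log_dir.mkdir(parents=True, exist_ok=True)
    record = {
        "run_id": run_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "error_type": type(exc).__name__,
        "message": str(exc),
        "traceback": traceback.format_exc(),
    }
    path = log_dir / f"{run_id}.json"
    path.write_text(json.dumps(record, indent=2))
    return path

# Usage inside the notebook's main cell:
# try:
#     main()  # the notebook's actual work
# except Exception as exc:
#     write_error_log(exc, run_id="<job instance id>")
#     raise  # re-raise so the job still reports Failed
```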

1 ACCEPTED SOLUTION
deborshi_nag
Resident Rockstar

Hello @AnushShetty 

 

At the moment, if you use the REST API or FAB CLI to run a notebook directly, you won’t get all the detailed notebook error logs in-line. You’ll see the job or run status and a general failure message, but not the same detailed logs you’d see in the notebook interface or pipeline Monitoring. This is a limitation when notebooks are run outside a pipeline.

 

If you want to get driver logs and Spark event logs for a notebook run, you’ll need to find the Livy session ID and Spark application ID first, then use the Spark monitoring tools to access them.
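Concretely, that lookup can be sketched as below. The livySessions endpoint and the response field names are assumptions based on the Fabric Spark monitoring API surface; double-check them against the current REST reference before relying on this.

```python
import json
import urllib.request

API_BASE = "https://api.fabric.microsoft.com/v1"

def livy_sessions_url(workspace_id: str, notebook_id: str) -> str:
    # Spark monitoring surface: list Livy sessions for a notebook item.
    # Endpoint shape should be verified against the current Fabric REST docs.
    return f"{API_BASE}/workspaces/{workspace_id}/notebooks/{notebook_id}/livySessions"

def list_livy_sessions(workspace_id: str, notebook_id: str, token: str) -> list:
    """Return (livy_session_id, spark_application_id) pairs for a notebook's runs."""
    req = urllib.request.Request(
        livy_sessions_url(workspace_id, notebook_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        sessions = json.load(resp).get("value", [])
    # Field names below are assumed; inspect a real response to confirm them.
    # The IDs feed into the Spark monitoring endpoints for driver/event logs.
    return [(s.get("livyId"), s.get("sparkApplicationId")) for s in sessions]
```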

 

If your workspace is connected to Azure Log Analytics and the Spark diagnostic emitters are switched on, notebook logs, including custom Log4j messages, will be sent to Log Analytics; see the link below.

 

Monitor Apache Spark applications with Azure Log Analytics - Microsoft Fabric | Microsoft Learn
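For the Log Analytics route above, a minimal KQL query for pulling recent error-level Spark/notebook logs might look like this. The table name (SparkLoggingEvent_CL) comes from the Spark diagnostic emitter; column names can vary by schema version, so inspect the table in your workspace first.

```python
# KQL for error-level Spark/notebook logs emitted to Log Analytics.
# Table and column names are assumptions based on the Spark diagnostic
# emitter schema; verify them against your workspace before use.
ERROR_LOG_QUERY = """
SparkLoggingEvent_CL
| where TimeGenerated > ago(24h)
| where Level == "ERROR"
| order by TimeGenerated desc
| take 50
""".strip()

# Run this in the Log Analytics query editor, or programmatically with the
# azure-monitor-query package (LogsQueryClient.query_workspace).
print(ERROR_LOG_QUERY)
```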

 

I trust this will be helpful. If you found this guidance useful, you are welcome to acknowledge with a Kudos or by marking it as a Solution.


5 REPLIES
AnushShetty
Frequent Visitor

Hi @deborshi_nag  @v-veshwara-msft ,

 

Thank you for the detailed explanation. It was very helpful.

While we are currently following a slightly different approach to log the exact error as mentioned in the main post, I appreciate you sharing this method. It is quite insightful.

v-veshwara-msft
Community Support

Hi @AnushShetty 

We wanted to follow up on your query. If you need any further assistance, please reach out.
Thank you.


v-veshwara-msft
Community Support

Hi @AnushShetty ,
Thanks for reaching out to Microsoft Fabric Community.

 

Just wanted to check if the response provided by @deborshi_nag was helpful. If further assistance is needed, please reach out.


Thank you.

