Mirdula
Frequent Visitor

Notebook Status - Stopped (Session Timed Out)

Hello everyone,

I'd appreciate your thoughts on the issue below.

I have a notebook in the bronze layer that I use to ingest data from an ERP system. Later, I created a duplicate of this notebook and configured it the same way as the original. It runs perfectly and there are no code errors (I checked the output delta tables, and it ingests new data as well). However, when I check the recent runs of this notebook, it shows "Running" for a long time, even after the last cell has finished, and then the status changes to "Stopped (Session Timed Out)". The ingestion itself only takes a few minutes, and the Spark session timeout is set to 30 minutes.

I initially thought this had something to do with the duplicate file, but another notebook in the silver layer, which runs perfectly, faces the same situation. How can I get a "Success" status, given that there are no code errors and the ingestion completes successfully?

1 ACCEPTED SOLUTION
Rufyda
Impactful Individual

Hi Mirdula 😊

Thank you for the detailed description.

What you're experiencing is quite common in Spark-based environments like Microsoft Fabric. Here's a quick clarification that may help:

When a notebook completes all of its cells successfully but the Spark session is not closed explicitly, Fabric sometimes registers the run as "Stopped" instead of "Success", even though no error occurred.

What you can try:

To ensure the session ends gracefully, you can stop the Spark session explicitly at the end of your notebook:

spark.stop()

This helps Fabric detect that the notebook completed intentionally, rather than through timeout or inactivity.
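If spark.stop() alone doesn't change the status, a minimal sketch of a final cell, assuming your runtime exposes the built-in notebookutils package (formerly mssparkutils; verify the name against your runtime version), would be:

# Final cell: end the session explicitly instead of waiting for the idle timeout.
# Assumption: notebookutils is the built-in Fabric notebook utility
# (formerly mssparkutils); confirm the name for your runtime version.
try:
    notebookutils.session.stop()  # Fabric-native session shutdown
except NameError:
    spark.stop()  # plain Spark fallback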

Also, adding a cell like this at the end of your notebook might help signal a successful end:

print("Notebook completed successfully.")

While this doesn't guarantee a "Success" label in every case (it depends on backend handling), it improves the chances and keeps your logs cleaner.

Let me know if that helps!
And if the suggestion helped, even partially, feel free to mark it as the solution and give a kudo. 😄


5 REPLIES

Hi Rufyda,

This makes sense now and has cleared up my confusion.

Thank you so much 😊

Rufyda
Impactful Individual

If I took the time to answer your question and came up with a solution, please mark my post as the solution and/or give kudos for the effort.


 Thank you!

Proud to be a Super User!

Rufyda
Impactful Individual

 

1. Check Spark Session Timeout Settings:
Even though the Spark session timeout is set to 30 minutes, the default Spark configuration can sometimes cause the session to time out when there is no activity (such as reading or writing data). You can verify and adjust your session settings so they align with the workload you're running, as in the sketch below.
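For example, a minimal sketch that reads the current values before changing anything (the defaults in the comments are standard Spark defaults; confirm them for your Fabric runtime):

# Check current timeout-related settings before adjusting anything.
# spark.sql.broadcastTimeout is a runtime SQL config (seconds, default 300).
# spark.network.timeout is a core setting (default "120s"); in Spark 3.x it
# generally cannot be changed after the session has started.
print(spark.conf.get("spark.sql.broadcastTimeout"))
print(spark.conf.get("spark.network.timeout", "120s"))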

How to Adjust Spark Timeout Settings:

Try modifying the Spark configuration within your notebook to allow more leniency at runtime.

Use the following commands to set the timeouts explicitly within the notebook itself:

 

spark.conf.set("spark.sql.broadcastTimeout", "3600") # Set timeout to 1 hour spark.conf.set("spark.network.timeout", "3600") # Set network timeout to 1 hour

If this solution works for you, feel free to like it and confirm that it’s suitable for your case.

Hi Rufyda,

Thank you for your response. Now I can see that the status in the recent run list shows "Stopped". My question is: earlier it showed "Success", from yesterday it showed "Stopped (Session Timed Out)", and after applying the code you gave me it shows "Stopped".

Is there a way to fix this so it shows "Success" just like before? Have you faced anything similar? 🙂

 
