Hello everyone,
Would appreciate your thoughts on the below.
I had a notebook in the bronze layer, which I used to ingest data from an ERP system. Later, I created a duplicate of this notebook and configured it identically to the original. It runs perfectly and there are no code errors (I checked the output Delta tables, and it ingests new data as well). However, when I check the recent runs of this notebook, it shows "Running" for a long time, even after the last cell has finished, and then the status changes to "Stopped (Session Timed Out)". Yet the ingestion only takes a few minutes, and the Spark session timeout is set to 30 minutes.
I thought this had something to do with the duplicated file, but another notebook in the Silver layer, which also runs perfectly, shows the same behavior. How can I get a "Success" status, given that there are no code errors and the ingestion completes successfully?
Hi Mirdula 😊
Thank you for your detailed feedback.
What you're experiencing is quite common in Spark-based environments like Microsoft Fabric. Here's a quick clarification that may help:
When a notebook completes all cells successfully, but the Spark session doesn't close explicitly, the Fabric platform sometimes registers the session as "Stopped" instead of "Success" — even though no error occurred.
To ensure the session ends gracefully, you can stop the Spark session explicitly in the last cell of your notebook:

```python
spark.stop()
```

This helps Fabric detect that the notebook completed intentionally, not due to timeout or inactivity.
Also, adding a cell like this at the end of your notebook might help signal a successful end:

```python
print("Notebook completed successfully.")
```
While this doesn't guarantee a “Success” label in every case (as it depends on backend handling), it improves the chances — and keeps your logs cleaner.
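Putting the two pieces together, the last cell can be a small helper that logs completion and then closes the session. This is a minimal sketch, not an official pattern: the `finish_notebook` name is mine, and `spark` refers to the session object that Fabric notebooks provide automatically.

```python
# Hypothetical helper: log completion, then stop the Spark session so the
# platform sees a deliberate shutdown instead of an inactivity timeout.
def finish_notebook(session):
    msg = "Notebook completed successfully."
    print(msg)          # log before stopping, so the output is still captured
    session.stop()      # releases resources and closes the Spark session
    return msg

# In the notebook's final cell (Fabric provides `spark` automatically):
# finish_notebook(spark)
```

The ordering matters: print before stopping, since output emitted after the session closes may not be captured in the run logs.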
Let me know if that helps!
And if the previous suggestion worked partially, feel free to mark it as solution and give a Kudo if it helped. 😄
Hi Rufyda,
This makes more sense and has solved my confusion.
Thank you so much 😊
If I took the time to answer your question and came up with a solution, please mark my post as a solution and/or give kudos for the effort.
Thank you!
Proud to be a Super User!
1. Check Spark Session Timeout Settings:
Even though the Spark session timeout is set to 30 minutes, the default Spark configuration can sometimes cause the session to time out when there is no activity (such as reading or writing data). Verify and adjust your session settings so they align with the specific workload you're running.
How to Adjust Spark Timeout Settings:
Try modifying the Spark configuration within your notebook to allow for more leniency during the runtime.
Use the following command to explicitly set the timeout settings within the notebook itself:
If this solution works for you, feel free to like it and confirm that it’s suitable for your case.
Hi Rufyda,
Thank you for your response. After applying the code you gave me, the status in the recent run list shows "Stopped". My question is that it showed "Success" earlier, and since yesterday it has shown "Stopped (Session Timed Out)".
Is there a way to fix this so it shows "Success" just like before? Have you faced anything similar? 🙂