I have a Spark notebook that uses a custom environment. I've noticed the scheduled run duration has doubled recently. I checked the notebook run snapshot, and the actual execution time hasn't increased, so I suspect the longer duration is because the notebook connection time has grown significantly. How can I monitor notebook connection time over the past week?
Hi @Jeanxyz,
You’re seeing the scheduled run duration grow while the notebook run snapshot still shows similar execution time. That typically means extra time is being spent before the first cell actually runs (queueing + Spark session startup/attach, often impacted by custom environments).
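As far as I can tell there is no dedicated "connection time" metric for past runs, but you can approximate it: the Job Scheduler REST API returns each scheduled run's start and end times, so total wall-clock duration minus the execution time shown in the run snapshot is roughly the connection/queue overhead. A rough sketch, assuming the List Item Job Instances endpoint and placeholder workspace/notebook IDs and a bearer token you would supply yourself:

```python
# Sketch (not an official recipe): list this notebook's recent scheduled runs
# via the Fabric REST API and print each run's total wall-clock duration.
# Subtracting the execution time from the run snapshot approximates the
# connection time. WORKSPACE_ID, NOTEBOOK_ID, and TOKEN are placeholders.
from datetime import datetime, timedelta, timezone
import requests

WORKSPACE_ID = "<workspace-guid>"
NOTEBOOK_ID = "<notebook-guid>"
TOKEN = "<bearer-token-for-https://api.fabric.microsoft.com>"

def parse_utc(ts: str) -> datetime:
    # Trim fractional seconds and the trailing 'Z' so fromisoformat accepts it.
    return datetime.fromisoformat(ts.split(".")[0].rstrip("Z")).replace(tzinfo=timezone.utc)

url = (f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
       f"/items/{NOTEBOOK_ID}/jobs/instances")
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

cutoff = datetime.now(timezone.utc) - timedelta(days=7)
for run in resp.json().get("value", []):
    start, end = run.get("startTimeUtc"), run.get("endTimeUtc")
    if not (start and end) or parse_utc(start) < cutoff:
        continue
    print(run["id"], run["status"], "total duration:", parse_utc(end) - parse_utc(start))
```

If the per-run totals trend upward while snapshot execution times stay flat, the growth is in the connection/startup phase.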
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Hi @Jeanxyz
As we haven't heard back from you, we wanted to follow up and check whether the suggestions provided by the community members resolved the issue. Please feel free to reach out if you have any further questions.
Thanks and regards
Sorry for the late reply. I don't think there is a good solution from Microsoft. I have marked the answer from tayloramy as the solution.
Hi @Jeanxyz
May I check if this issue has been resolved? If not, please feel free to reach out with any further questions.
Thank you
Hi @Jeanxyz
To give a bit of context: the Spark monitoring view does not report the running duration of the job on its own. The total duration covers everything needed to complete the run, roughly queue time + session startup time + execution time, so it varies with many factors beyond your code. Also, follow general data-design best practices (Kimball methodology, fact/dimension table structures, etc.).
Hi @Jeanxyz
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.
Thank you.
Hi @Jeanxyz ,
Thanks for reaching out to the Microsoft Fabric community forum.
It sounds like a lot of the time is being spent starting the Spark session, which is shut down automatically after 20 minutes of inactivity (the default timeout).
You can adjust the session timeout at the notebook or workspace level.
To change the timeout settings, open a notebook and start a session from the Connect toolbar menu. Once the session is active, click the Session ready indicator in the lower-left corner of the notebook, and in the dialog that appears you can adjust the timeout duration as needed.
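If you would rather pin this in code than click through the UI each time, Fabric notebooks also accept session settings through the %%configure cell magic. A minimal sketch, run as the first cell before the session starts; note that sessionTimeoutInSeconds is my best recollection of the documented property name, so verify it against the current Fabric %%configure reference before relying on it:

```
%%configure -f
{
    "sessionTimeoutInSeconds": 3600
}
```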
I hope this information helps. Please do let us know if you have any further queries.
Thank you
Thanks @tayloramy. In the monitoring hub, I can only find the total duration, not the running duration.
As a quick fix, I created an empty notebook using the same Spark environment; its total duration is effectively the connection duration. It turns out the connection time is quite long, about 7 minutes, which is ironic because I have only installed one public package in the environment. It would actually save time to run pip install on each run rather than use a custom environment.
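If you want that quick fix to produce a history you can chart over the week, the canary notebook can log its own first-cell timestamp on every scheduled run. A minimal sketch, assuming a default lakehouse is attached; connection_log is just an example table name, and the gap between the schedule's trigger time and this timestamp approximates the connection overhead:

```python
# First (and only) cell of the canary notebook: record when the session became
# usable. "connection_log" is a made-up example table; a default lakehouse must
# be attached for saveAsTable to work.
from datetime import datetime, timezone

first_cell_utc = datetime.now(timezone.utc).isoformat()

spark.createDataFrame([(first_cell_utc,)], "first_cell_utc string") \
    .write.mode("append").saveAsTable("connection_log")
```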