Hi, I'm running Spark notebooks on a Fabric trial. It's been working great, with no timeout issues or severe delays in sessions starting.
But today, notebooks do not start the Spark session, returning this type of error:
"Livy session has failed. Error code: SparkCoreError/SessionDidNotEnterIdle. SessionInfo.State from SparkCore is Error: Session did not enter idle state after 10 minutes. Source: System."
I see from the forum that others have experienced similar issues, with severe delays in functionality.
Can anyone shed more light on this?
Thanks,
This issue persisted for a week. I updated to runtime 1.3 at the time and it had no effect on the issue. MS stated at the time that it was a resource allocation issue. Since 07-18-2024 I have not experienced this issue with my pipelines, so for now it seems MS has resolved it. (Central Canada)
Australia region here. I started getting this error in the morning when running a pipeline manually. The automated 5 am run was fine, but trying the same pipeline manually around 9 am caused the error.
Hi, I have the same issue today (30-JUL-2024) in the US South Central region ("Notebook execution failed at Notebook service with http status code - '200', please check the Run logs on Notebook, additional details - 'Error name - Exception, Error value - Failed to create Livy session for executing notebook.")
Hi again,
Today (31-JUL-2024) I have the same issue ("SparkCoreError/SessionDidNotEnterIdle: Livy session has failed. Error code: SparkCoreError/SessionDidNotEnterIdle. SessionInfo.State from SparkCore is Error: Session did not enter idle state after 10 minutes. Source: System.").
I don't know if this behavior is related to my capacity size (F2), but at run time the capacity is not executing any other processes.
Thank you.
I'm having the same error too. Hopefully they will fix this issue asap.
Hi, this problem persists on the stable Spark runtime (1.2); however, since I switched the runtime to 1.3 (preview), the notebooks now run without problems.
I sincerely hope the stable 1.2 version really does work stably, so we can keep it and run our production processes with confidence.
Hi, it might be related to the major issues affecting Azure network infrastructure worldwide:
https://azure.status.microsoft/it-it/status
Hi all,
We have reported this issue and submitted it to the product team.
They are aware of the issue and have fixed it. Please check again later.
Best Regards,
Ada Wang
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Hi @JFTxJ,
Thank you for sharing and we really look forward to your status updates.
Hi all,
The incident is still under investigation and the team is checking for changes or updates that may have caused the issue. Please be patient.
Best Regards,
Ada Wang
Hey everyone,
I just had some resolution on this issue by enabling runtime 1.3 (Preview) on my workspace. I have restarted a pipeline with hundreds of Notebook executions and they all seem to be running smoothly with the newer (Preview) runtime.
Of course this could just be a convenient glitch, so I will comment back tomorrow on the progress, but it looks promising to me as I was able to reproduce the issue with a manual execution of a notebook, and that has also been working since I made the config change.
If you are able to reproduce the issue while running Notebooks manually, I recommend experimenting with a custom Notebook environment set to runtime 1.3 and see if you get any better results.
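If you want to double-check which runtime the session actually picked up after changing the workspace or environment setting, here is a minimal sketch to run in any notebook cell (as I understand it, Runtime 1.3 maps to Spark 3.5 and Runtime 1.2 to Spark 3.4):

```python
# Quick check, run in a notebook cell once the session starts.
# Fabric Runtime 1.3 should report Spark 3.5.x; Runtime 1.2 reports 3.4.x.
print("Spark version:", spark.version)

# Optional: print a couple of session configs when comparing environments.
for key in ("spark.app.name", "spark.dynamicAllocation.enabled"):
    print(key, "=", spark.conf.get(key, "<not set>"))
```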
I will follow up tomorrow with a status update.
Thank you
I also applied the same solution, and my data pipeline now runs successfully on Spark 1.3.
I've just tried the 1.3 change and got a Pandas error. Further reading this morning suggests I'm hitting memory issues based on the size of the data frame I'm outputting... I've edited this post, as my code is the issue here, not Fabric 🙂
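Not related to the outage, but for anyone hitting the same pandas memory wall when outputting a large data frame: a minimal sketch, assuming the output is a Spark DataFrame named df and a default Lakehouse is attached (the Files/export path is a placeholder), that writes with Spark instead of collecting everything onto the driver with toPandas().

```python
# Risky on large data: materialises the whole DataFrame in driver memory.
# pdf = df.toPandas()
# pdf.to_csv("/lakehouse/default/Files/export/output.csv", index=False)

# Safer: let Spark write the files directly from the executors.
# Assumes `df` already exists and a default Lakehouse is attached,
# so the relative "Files/..." path resolves; adjust for your workspace.
(
    df.coalesce(1)                      # single output file; drop this for very large data
      .write.mode("overwrite")
      .option("header", True)
      .csv("Files/export/output_csv")   # folder of part files in the attached Lakehouse
)
```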
FYI: I got an update from MS last Friday that this issue should be resolved, so we should be able to revert back to runtime 1.2.
I have not tested this yet; I'm just passing on the message.
So, just to reply to myself... I've had other pipelines calling notebooks over the weekend - the default is now 1.3 - and they've worked OK (including calls to pandas to output files).
I've not yet reverted to 1.2 to test it, but can confirm that 1.3 is working.
Looks like something strange is going on with this particular notebook...
Anyone still experiencing this issue in North Europe?
Issue seems to be resolved for Central Canada.
Hope it's working for you guys too!
I'm having the same error too, from the Middle East.
I am having the same issue in Brazil
We have been experiencing the same problem since Monday. We struggle to launch a notebook, succeeding only once out of 20 attempts, even with multiple retries. We are really having issues with Fabric notebooks over the past two days (France).
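If anyone wants to avoid babysitting manual retries while this is ongoing, a minimal sketch of a retry wrapper around a child notebook run. This only helps with transient failures once an orchestrator session is actually up; if the session itself never starts, the retry has to live on the pipeline's notebook activity (its retry count and retry interval settings). It assumes mssparkutils is available in the Fabric notebook; the child notebook name "LoadData" and the retry/backoff values are placeholders.

```python
import time

# Sketch only: retries a child notebook run while executions are flaky.
# Assumes this runs in a Fabric notebook where mssparkutils is available;
# "LoadData" and the retry/backoff values are illustrative placeholders.
def run_with_retry(notebook_name, timeout_seconds=1800, max_attempts=5, backoff_seconds=120):
    last_error = None
    for attempt in range(1, max_attempts + 1):
        try:
            return mssparkutils.notebook.run(notebook_name, timeout_seconds)
        except Exception as err:  # e.g. a transient Livy/session failure
            last_error = err
            print(f"Attempt {attempt}/{max_attempts} failed: {err}")
            time.sleep(backoff_seconds)
    raise last_error

result = run_with_retry("LoadData")
```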
Response from Microsoft relating to my P1 ticket for the issue. They say it's mitigated, but we still have an issue.
Really struggling with Microsoft's shocking support.
North Europe region. Still not working as of 20 minutes ago.
STATUS:
Mitigated
SUMMARY OF IMPACT:
Our production team has applied the mitigation steps, and the error trend came down around 7/18 1:30 AM UTC.
We will be continuously monitoring the trend and act accordingly if needed. We'll send an update once we know more.
In the meanwhile, please contact us if you still encounter the same issue.