I am working in a Fabric notebook using PySpark and have recently run into an error when running a specific code cell that normally works: the kernel simply disconnects, and I can't proceed with my code.
Since I use Fabric within my organization and do not have admin rights, I only have access to limited metrics.
I suspected it was capacity-related, but a colleague of mine has already tried creating a bigger Spark pool. Unfortunately, this did not resolve it.
I get the following error right under the code cell:
And under diagnostics, I get the following:
Diagnostic ID: e0d1cb8a-823d-48c8-b878-478e7da536d3
Timestamp: 2025-05-26T13:53:44.169Z
Message: [object CloseEvent]
JSON
{
  "type": "close",
  "timeStamp": 583281.6999999285,
  "code": 1000,
  "reason": "{\"reason\":\"Session error or stopped.\",\"state\":\"session-completed\"}",
  "wasClean": false,
  "target": {
    "url": removed,
    "readyState": 3,
    "protocolsProfile": [
      7,
      4588
    ]
  },
  "currentTarget": {
    "url": removed,
    "readyState": 3,
    "protocolsProfile": [
      7,
      4588
    ]
  },
  "isTrusted": true
}
Additional info: InstanceId: 977dc982-d987-43eb-9104-373959539332
What causes this, and how can it be resolved?
Hi @stomori ,
This looks like a session-level failure that happens before your code even starts running — usually tied to Spark kernel startup or resource allocation issues.
Here’s what might be causing it and what you can try:
Session Timeout or Idle Expiry
If the notebook was idle for a while, the session might have expired silently. Try restarting the notebook kernel and re-running the cell immediately.
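To illustrate the idle-expiry scenario, here is a minimal pure-Python sketch; the 20-minute default is an assumption, as the actual timeout is configured in your workspace's Spark settings:

```python
import time

# Assumed idle timeout; the real value comes from the workspace Spark settings.
IDLE_TIMEOUT_S = 20 * 60

def session_likely_expired(last_activity_ts, now=None):
    """Heuristic: True if the notebook has been idle longer than the assumed timeout."""
    now = time.time() if now is None else now
    return (now - last_activity_ts) > IDLE_TIMEOUT_S

# Example: 25 minutes idle against a 20-minute timeout
print(session_likely_expired(0, now=25 * 60))  # True
```

If the session has quietly expired, the next cell you run fails with a close event like the one above rather than a normal Python error, which is why restarting the kernel and re-running immediately is the first thing to try.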
Spark Pool Resource Limits
Even if your colleague increased the pool size, check whether the capacity itself is being throttled, or whether other concurrent sessions on the same capacity are exhausting the available resources.
Code Cell Content
If the cell has heavy operations (e.g. large joins, wide transformations), try caching or persisting intermediate results, or splitting the work across several smaller cells so each step completes within session limits.
Plugin State: Cleanup
This usually means the session failed during init and Fabric is cleaning up. It could be a transient backend issue — try running the same cell in a new notebook or after a short wait.
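For transient backend failures like this, a simple retry-with-backoff wrapper around the failing step can also help. This is a generic sketch: `run_cell_logic` and the retried exception type are placeholders for your own code.

```python
import time

def with_retries(fn, attempts=3, base_delay_s=5, retry_on=(RuntimeError,)):
    """Call fn(), retrying on the given exceptions with exponential backoff."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except retry_on:
            if attempt == attempts:
                raise  # out of attempts: surface the original error
            time.sleep(base_delay_s * 2 ** (attempt - 1))

# Usage: wrap the logic of the failing cell in a function and retry it.
# result = with_retries(run_cell_logic)
```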
Diagnostics
Since you don’t have admin rights, ask your admin to check the Capacity Metrics app and the Monitoring hub for throttling or failed Spark sessions around the time of the disconnect.
Let me know if you want help reviewing the code in that cell — sometimes a small tweak can avoid triggering these session-level errors.
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.
Hi @stomori
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.
Thank you.
Hi Burak,
I have now tried step 4, running the cell in a new notebook, and this seems to have resolved the issue.
Thanks a lot for your response!