Hi Fabricators!
Our daily run sometimes gets hit by the following error while trying to run mssparkutils.notebook.runMultiple():
Notebook execution failed at Notebook service with http status code - '200', please check the Run logs on Notebook, additional details - 'Error name - InvalidHttpRequestToLivy, Error value - Submission failed due to error content =["requirement failed: Session isn't active."] HTTP status code: 400. Trace ID: 321c0c00-5823-4047-afb2-b9990fea8b923.' :
(PS: there is no additional logging).
Sometimes we run into:
Spark_User_AutoClassification_attempt_Diagnostics: Livy session has failed. Session state: Dead, Error code: Spark_User_AutoClassification_attempt_Diagnostics. Job failed during run time with state=[dead]. Source: User.
Our current setup: a data pipeline invokes a notebook (using the notebook activity), and that notebook runs several other notebooks via runMultiple().
As a workaround, we have modified the data pipeline to run the notebook again if it fails; the second attempt works perfectly.
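For context, a simplified sketch of what the parent notebook does is shown below; the child notebook names, timeouts, and DAG shape are placeholders rather than our actual workload:

# Parent notebook: fan out to child notebooks via runMultiple().
# Child notebook names, timeouts, and dependencies below are placeholders.
from notebookutils import mssparkutils

dag = {
    "activities": [
        {"name": "load_a", "path": "load_a", "timeoutPerCellInSeconds": 600},
        {"name": "load_b", "path": "load_b", "timeoutPerCellInSeconds": 600},
        {
            "name": "transform",
            "path": "transform",
            "timeoutPerCellInSeconds": 900,
            "dependencies": ["load_a", "load_b"],
        },
    ],
    "concurrency": 2,
}

# This is the call that intermittently fails with "Session isn't active."
result = mssparkutils.notebook.runMultiple(dag)
print(result)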
We have checked our notebooks; none of them contain code that creates or stops a Spark session. While googling we found some Spark-related solutions, but none of them seem to apply to MS Fabric.
We are currently on runtime 1.2 and have also tried 1.3, but sadly that didn't fix the issue.
Is anyone else experiencing this issue, or has anyone dealt with a similar situation before?
PS: It is really a bummer as it costs us a lot of capacity.
Hi @Riktastic ,
This error usually means the Spark session was not fully active when mssparkutils.notebook.runMultiple() was triggered. This can be due to idle timeouts, especially when the pipeline runs after a period of inactivity. To reduce failures, you may try adding a small command like spark.range(1) at the start of the parent notebook to initialize the session, and if you are launching several notebooks at once, consider staggering them slightly. Also, reviewing your Spark pool's min/max node settings may help reduce session startup delays.
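For example, a minimal warm-up at the top of the parent notebook could look like the sketch below (just an illustration of the idea above, not an official fix):

# Run a trivial Spark action first so the session is fully active before
# mssparkutils.notebook.runMultiple() is called. "spark" is the SparkSession
# that Fabric notebooks provide by default.
spark.range(1).count()

# Optional sanity check: print the application id of the live session.
print(spark.sparkContext.applicationId)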
Please refer to the links below for more detailed information:
https://learn.microsoft.com/en-us/fabric/data-engineering/microsoft-spark-utilities
https://learn.microsoft.com/en-us/fabric/data-engineering/get-started-api-livy-session
I hope this resolves your query. If so, give us kudos and consider accepting it as the solution.
Regards,
Pallavi.
Hi @Riktastic ,
I wanted to check in on your situation regarding this issue. Have you resolved it? If you have, please consider marking the reply that helped you or sharing your solution; it would be greatly appreciated by others in the community who may have the same question.
Thank you
Hi @Riktastic ,
Following up to check whether you got a chance to review the suggestion above. If it helps, consider accepting it as the solution; that will help other community members with similar problems resolve them faster. Glad to help.
Thank you.
Hi @Riktastic ,
I wanted to check whether you had a chance to review our previous message. Please let me know if everything is sorted or if you need any further assistance. If the suggestion helped, consider accepting it as the solution.
Thank you.