I was able to get the events from Azure Event Hub using Spark Structured Streaming. I am using a Fabric notebook for that. After starting the writeStream, I forgot to stop the streaming query (query.stop()) and killed the session. In the Fabric notebook, when I run spark.streams.active it shows a count of 0, but in the Spark UI the status still shows RUNNING for a few streams. How can I stop those in the Spark UI? I no longer have access to those streams to stop them. Will they stop automatically? If they keep running, will they incur cost for me? Is there any other way to stop them?
I tried restarting the environment, but that didn't work. I don't see an option to reset the capacity. Has anyone had a similar situation?
Hello @Jay-RM
I see from your screenshot above that you have the Spark application IDs of the running apps. You can call the Fabric REST API endpoint below to cancel the running Spark apps:
https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/notebooks/{notebook_id}/livySessions/{livy_id}/applications/{app_id}/cancel
You'll find the notebook_id and workspace_id in the browser URL of your notebook. If you don't have the Livy session ID (livy_id) required above, you can get it via this REST endpoint:
https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/notebooks/{notebook_id}/livySessions
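In case it helps, here's a minimal Python sketch of calling those two endpoints using only the standard library. The URLs are the ones from the reply above; the bearer token and the workspace/notebook/session/app IDs are placeholders you must supply yourself (for example, a token obtained via `az account get-access-token`, scoped to the Fabric API).

```python
# Sketch only -- the endpoint paths come from the reply above; the token
# and all IDs are placeholders, not real values.
import json
import urllib.request

BASE = "https://api.fabric.microsoft.com/v1"


def livy_sessions_url(workspace_id: str, notebook_id: str) -> str:
    """Endpoint that lists Livy sessions for a notebook (to find livy_id)."""
    return f"{BASE}/workspaces/{workspace_id}/notebooks/{notebook_id}/livySessions"


def cancel_url(workspace_id: str, notebook_id: str, livy_id: str, app_id: str) -> str:
    """Endpoint that cancels one running Spark application in a Livy session."""
    return (f"{BASE}/workspaces/{workspace_id}/notebooks/{notebook_id}"
            f"/livySessions/{livy_id}/applications/{app_id}/cancel")


def call(url: str, token: str, method: str = "GET") -> dict:
    """Issue an authenticated request and return the parsed JSON body (if any)."""
    req = urllib.request.Request(
        url,
        method=method,
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        return json.loads(body) if body else {}
```

Usage would look like `call(livy_sessions_url(ws, nb), token)` to find the session, then `call(cancel_url(ws, nb, livy, app), token, method="POST")` for each application that the Spark UI shows as RUNNING.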
Hi @Jay-RM,
Checking in to see if your issue has been resolved. Let us know if you still need any assistance.
Thank you.
Hi @Jay-RM,
Have you had a chance to review the solutions shared by @suparnababu8 and @deborshi_nag? If the issue persists, feel free to reply so we can help further.
Thank you.
Hi @Jay-RM
Restarting the Spark pool should recycle the Spark runtime and kill the orphaned queries. If you have already tried restarting the notebook, try restarting the entire Spark pool. As for billing: if the streams are still running, they do incur cost.
Please let me know if it helps you.
Thank you!!
Did I answer your question? Mark my post as a solution!
Proud to be a Super User!