
DebbieE
Community Champion

This spark job can't be run because you have hit a spark compute or API rate limit

We are still just testing Fabric, and because the trial is over we have moved to the most basic Fabric capacity.

I'm the only one using it.

I run through a notebook, and at the end I have a bit of code:

 

# Stop the Spark session
spark.stop()
 
which stops the job, and it should be the only job running (standard session).
 
I then go across to the next notebook and get 
 
InvalidHttpRequestToLivy: [TooManyRequestsForCapacity] This spark job can't be run because you have hit a spark compute or API rate limit. To run this spark job, cancel an active Spark job through the Monitoring hub, choose a larger capacity SKU, or try again later. HTTP status code: 430.
 
So even though I have cancelled the original job, I can't run the next notebook. If I click on the Monitor icon I can see it's still in progress. So I stop it there, try again, and it works.
 
Why didn't my code stop the job? It looked like it did in the notebook. Surely I shouldn't have to exit it manually?
1 ACCEPTED SOLUTION
frithjof_v
Super User

My impression is that using spark.stop() doesn't work in Fabric notebooks.

 

You could try using mssparkutils.session.stop() instead:

 

https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/microsoft-spark-utilities?pivots=pro...

 

Even though this documentation is for Azure Synapse Analytics, it seems to work in Fabric, at least for the moment.
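A defensive way to end a notebook might look like the sketch below. It assumes the Fabric/Synapse runtime exposes mssparkutils via the notebookutils package, and falls back to plain spark.stop() where that helper is unavailable; the returned label is only there so the last cell shows which path was taken.

```python
def stop_spark_session():
    """Stop the current Spark session, preferring the Fabric/Synapse helper.

    Returns a short label describing which path was taken, so the last
    notebook cell shows what actually happened.
    """
    try:
        # Provided by the Fabric/Synapse notebook runtime (an assumption
        # outside those environments, where this import fails).
        from notebookutils import mssparkutils
        mssparkutils.session.stop()
        return "stopped via mssparkutils.session.stop()"
    except ImportError:
        # Outside Fabric, fall back to the plain Spark API if a `spark`
        # session object exists in scope.
        try:
            spark.stop()
            return "stopped via spark.stop()"
        except NameError:
            return "no active Spark session found"
```

Run locally (with no Spark session and no notebookutils), it simply reports that nothing was stopped.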

 

 

Here is the NotebookUtils documentation for Fabric; it doesn't mention this function.

 

https://learn.microsoft.com/en-us/fabric/data-engineering/notebook-utilities

 

 

You can also click the Stop button in the Notebook user interface.

 

If you want to programmatically run the next notebook after the first notebook has finished, you can use the notebookutils.notebook.run() function in Notebookutils.

 

https://learn.microsoft.com/en-us/fabric/data-engineering/notebook-utilities#reference-a-notebook
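If you chain notebooks this way, one option is to keep the orchestration logic separate from the Fabric call. This is a sketch under that assumption: the notebook names are hypothetical, and inside Fabric you would pass notebookutils.notebook.run as the runner.

```python
def run_notebooks_in_sequence(names, runner, timeout_seconds=600):
    """Run notebooks one after another within a single orchestrator.

    `runner` is expected to behave like notebookutils.notebook.run inside
    Fabric: it takes a notebook name and a timeout and returns the exit
    value, raising on failure (which halts the chain). Any callable with
    that shape works, which also makes this testable outside Fabric.
    """
    results = {}
    for name in names:
        results[name] = runner(name, timeout_seconds)
    return results

# Inside Fabric (hypothetical notebook names):
# run_notebooks_in_sequence(["LoadRaw", "Transform"], notebookutils.notebook.run)
```

Because the second notebook is only submitted after the first returns, the orchestrator avoids asking the capacity for two concurrent sessions.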

8 REPLIES
infomod
Regular Visitor

Any update on this, how to stop a Spark session in MS Fabric notebook?

MartinFM
Helper I

I get this error all the time. Nothing is running in Monitor but I still get the error. I have no idea what I have to stop so that I can be allowed to run my notebooks. 

 

I have the lowest capacity F2. We are also evaluating whether or not Fabric could be our future data platform. 

frithjof_v
Super User

I see this:

notebookutils.session.stop()

in NotebookUtils (former MSSparkUtils) for Fabric - Microsoft Fabric | Microsoft Learn.

It seems to work.

Anonymous
Not applicable

Hi @DebbieE,

Any update on this? For the job queue limitation you can refer to the following document:

Job queueing in Apache Spark for Fabric - Microsoft Fabric | Microsoft Learn

If these processes hit those limitations, you can consider reducing the total number of queued jobs or upgrading the capacity tier.
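On a small SKU, transient 430 TooManyRequestsForCapacity errors can also be smoothed over with a retry-and-backoff wrapper around whatever submits the job. This is a generic sketch, not a Fabric API: `submit` is any zero-argument callable that raises an exception mentioning the capacity error when the SKU is busy.

```python
import time

def run_with_backoff(submit, attempts=5, base_delay=30, sleep=time.sleep):
    """Retry `submit` with exponential backoff on capacity rate-limit errors.

    Errors whose message does not contain "TooManyRequestsForCapacity"
    are re-raised immediately; `sleep` is injectable so the backoff can
    be exercised in tests without actually waiting.
    """
    for attempt in range(attempts):
        try:
            return submit()
        except Exception as exc:
            if "TooManyRequestsForCapacity" not in str(exc) or attempt == attempts - 1:
                raise  # not a capacity error, or out of retries
            sleep(base_delay * 2 ** attempt)  # 30s, 60s, 120s, ... by default
```

Doubling the delay each attempt gives queued jobs on the busy capacity time to drain before the next submission.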

Regards,

Xiaoxin Sheng

Anonymous
Not applicable

Hi @DebbieE,

As the error message mentions, the Spark job can't run because this operation hit the compute or API rate limit.

What type of capacity are you using to host these processes? How did you configure the Spark environment and pool settings? Please share some more detailed information to help us clarify your scenario and troubleshoot.

Reference links:

Configure and manage starter pools in Fabric Spark. - Microsoft Fabric | Microsoft Learn

Compute management in Fabric environments - Microsoft Fabric | Microsoft Learn

Regards,

Xiaoxin Sheng

As above, it's the lowest capacity.

 

However, I have

# Stop the Spark session
spark.stop()

at the end of the notebook, and this should stop the Spark session so that I can go on and run the next notebook. It still isn't working; I have to go across to Monitor to stop it.

That's the issue. Why isn't the spark.stop() code working?
Anonymous
Not applicable

Hi @DebbieE,

You can also refer to the following link to trace the notebook usages:

Notebook contextual monitoring and debugging - Microsoft Fabric | Microsoft Learn

Regards,

Xiaoxin Sheng
