spencer_sa
Frequent Visitor

[TooManyRequestsForCapacity] issue with multiple notebook users on an F128 Fabric instance

We're getting the following issue with 16 concurrent users trying to either "Load into tables" in a Fabric lakehouse or run PySpark/SparkSQL notebooks.

InvalidHttpRequestToLivy: [TooManyRequestsForCapacity] This spark job can't be run because you have hit a spark compute or API rate limit. To run this spark job, cancel an active Spark job through the Monitoring hub, choose a larger capacity SKU, or try again later. HTTP status code: 430 {Learn more} HTTP status code: 430.

Of the 16 users, around 6 are seeing this error, which prevents them from running notebooks/load jobs.  What am I missing?  I've done the following to no avail.

  • We are operating an F128 capacity instance and have a workspace per pair of users (so 8 workspaces in total).

  • I've tried creating new Spark pools (Large) for 3 of the workspaces with issues.

  • I've tried changing the cores available for each workspace (from 4 up to 16).

  • I've asked individuals to close down multiple notebooks in case they had them open.

  • There are never more than 10 items "Progressing" in the Monitoring Hub.

4 REPLIES
12angrymentiger
Advocate III

We fixed the concurrent notebook issue by using the runMultiple method like the example below.

You create a DAG with the order and dependencies when calling the notebooks. They all share the same Spark compute and environment, so the concurrency issue is mitigated in the backend.

mssparkutils.notebook.runMultiple(DAG, {"p1":True, "p2":"spectral", "p3":11})

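For context, the DAG passed to `runMultiple` is a plain Python dictionary describing the notebooks to run ("activities"), their dependencies, and an overall concurrency setting. A minimal sketch of what such a DAG might look like (the notebook names, paths, and parameter values here are illustrative assumptions, not from the thread):

```python
# Sketch of a DAG for mssparkutils.notebook.runMultiple.
# Notebook names/paths below are hypothetical examples.
DAG = {
    "activities": [
        {
            "name": "LoadRawData",           # unique name for this activity
            "path": "LoadRawData",           # notebook to execute
            "timeoutPerCellInSeconds": 120,  # per-cell timeout
            "args": {"p1": True},            # parameters passed to the notebook
        },
        {
            "name": "TransformData",
            "path": "TransformData",
            # Runs only after LoadRawData completes:
            "dependencies": ["LoadRawData"],
        },
    ],
    "timeoutInSeconds": 3600,  # overall timeout for the whole run
    "concurrency": 2,          # max notebooks running at once on the shared compute
}

# Inside a Fabric notebook you would then run the DAG with:
# mssparkutils.notebook.runMultiple(DAG)
```

Because every activity executes on the orchestrating notebook's Spark session, only one session is requested from the capacity regardless of how many notebooks the DAG contains.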
runMultiple allows you to create a Directed Acyclic Graph (DAG) of notebooks to execute in parallel and in a specified order, similar to a pipeline run except in a notebook. The advantages here are that you can:

  • programmatically define the execution order

  • run notebooks in parallel or in sequence

  • define dependencies between notebook runs

  • use compute resources efficiently, since every notebook runs on the compute of the orchestrating notebook

  • Depending on the F SKU and the node size, you may get a TooManyRequestsForCapacity error if you run multiple notebooks at the same time because of concurrency limits. Since runMultiple uses the same compute, it mitigates this error. Advancing Analytics has an excellent blog that goes in-depth on this. My colleague Will Crayger has also researched this topic and summarized his findings here.


Using runMultiple To Orchestrate Notebook Execution in Microsoft Fabric

Thanks for replying.  I have seen this blog post.  The issue we were having wasn't that we had one 'user'/process wanting to execute multiple notebooks, but multiple people each wanting to run notebooks in different workspaces.  We were running a 'learn Fabric' workshop.

v-cboorla-msft
Community Support

Hi @spencer_sa 

 

Thanks for using Microsoft Fabric community.

Apologies for the issue you have been facing.

It's difficult to tell what the reason for this behavior could be.

Please reach out to our support team so they can do a more thorough investigation into why this is happening. Our support engineers will identify the root cause and properly address it.

Please go ahead and raise a support ticket to reach our support team:

https://support.fabric.microsoft.com/support

After creating a support ticket, please share the ticket number here, as it would help us to track the issue for more information.

Thank you.

Thanks for replying.
Case 2403070050004917
