akanane
Frequent Visitor

Resource Limitation Issue with Microsoft Fabric Pipeline

Hello,

 

I am currently working with Microsoft Fabric and I am encountering an issue related to resource limitations when running notebooks in a foreach loop.

 

From the documentation, I understood that job types are classified into two categories: interactive (notebook and Lakehouse-based) and batch (Spark job definitions). Given that we are using a notebook in the pipeline, it is considered an interactive job, and after a couple of iterations we reach the capacity limit: "Response code 430: Unable to submit this request because all the available capacity is currently being used. The suggested solutions are to cancel a currently running job, increase the available capacity, or try again later."

 

On the other hand, with queueing enabled, batch jobs are added to the queue and automatically retried when capacity is freed up. But it seems we cannot trigger batch jobs via a pipeline, only manually or on a schedule.

One workaround is to run the loop sequentially (this doesn't work when invoking a child pipeline that contains a notebook activity), but I was wondering if there is a better way to do this.

 

Any help or guidance would be greatly appreciated.

Thank you,

 

Amnay

1 ACCEPTED SOLUTION
AndyDDC
Most Valuable Professional

Hi @akanane, unfortunately there is no ability to execute notebooks in a high-concurrency session via pipelines. The only ways I've been able to solve your challenge are:

  • Don't run the notebook in a ForEach; instead, pass values into a single notebook run and code the loop in the notebook itself. E.g. in the following I pass in a JSON array of folders and then loop through them sequentially:

(screenshot: notebook code looping sequentially over a JSON array of folders — AndyDDC_0-1698750180448.png)
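The sequential approach in the screenshot can be sketched roughly as below. This is a minimal sketch: the parameter name `folders_param`, the folder paths, and the `process_folder` helper are all illustrative, and in a real Fabric notebook the parameter would be supplied by the pipeline's Notebook activity via a parameter cell rather than hardcoded.

```python
import json

# In Fabric this value would arrive from the pipeline's Notebook activity
# as a notebook parameter (defined in a parameter cell); it is hardcoded
# here purely for illustration.
folders_param = '["raw/sales", "raw/customers", "raw/orders"]'

def process_folder(folder: str) -> str:
    # Placeholder for the real per-folder load/transform logic
    # (e.g. reading files and writing to a Lakehouse table).
    return f"processed {folder}"

# Loop through the folders sequentially inside the single notebook run,
# so the pipeline only ever submits one interactive job.
results = [process_folder(f) for f in json.loads(folders_param)]
```

Because the pipeline now runs one notebook instead of one per folder, only a single interactive job counts against capacity.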

 

  • Or utilise multi-threading in the notebook to run the code in parallel:

(screenshot: notebook code running the per-folder work on multiple threads — AndyDDC_1-1698750292465.png)
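The multi-threaded variant could look like the sketch below, using the standard-library `ThreadPoolExecutor`. Again the parameter name, folder paths, and `process_folder` body are illustrative assumptions, not the exact code from the screenshot.

```python
import json
from concurrent.futures import ThreadPoolExecutor

# Illustrative notebook parameter, as in the sequential sketch.
folders_param = '["raw/sales", "raw/customers", "raw/orders"]'

def process_folder(folder: str) -> str:
    # Placeholder for the real per-folder workload.
    return f"processed {folder}"

# Run the per-folder work on parallel threads inside ONE notebook session,
# so the pipeline still submits only a single interactive job while the
# folders are processed concurrently.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_folder, json.loads(folders_param)))
```

`pool.map` preserves the input order of the folders, which keeps downstream logging and error reporting predictable.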

 

You could also have a read through this post by Lilliam 

Microsoft Fabric changing the game: Exporting data and building the Lakehouse | Microsoft Fabric Blo...

 


6 REPLIES
akanane
Frequent Visitor

Hello again,

 

I appreciate your answers @AndyDDC, @Anonymous, as they represent a valid workaround to achieve the goal. However, loading/transforming is considered a batch job, so according to the documentation it shouldn't be done with notebooks, as they're interactive and not designed for batch jobs.

What should be done in production, though? Will we be able to trigger Spark job definitions from pipelines in the future?

Anonymous
Not applicable

Hi @akanane ,

We can also use notebooks as batch jobs based on our requirements, as they can be used for both interactive and batch operations.

At present we cannot trigger Spark job definitions from pipelines.

We'd appreciate it if you could share this feedback on our feedback channel, where it would be open for the user community to upvote and comment on. This allows our product teams to effectively prioritize your request against our existing feature backlog and gives insight into the potential impact of implementing the suggested feature.

In the meanwhile, I will check with the team whether these activities are already on the internal roadmap.

 

Hope this helps. Please let me know if you have any further queries.


Hi @AndyDDC, I had those two options in mind, but the notebook becomes a "black box" and requires additional development effort for error handling, logging, etc.

Anyway, thank you for your insight! I will definitely checkout the post 🙂

Have a good day 🙂

Anonymous
Not applicable

Hi @akanane ,

Thanks for using Fabric Capacity.

Notebooks are considered interactive jobs, and there is a limit on the number of interactive jobs that can run simultaneously. When you run notebooks in a pipeline ForEach loop, each iteration of the loop is treated as a separate interactive job. If the number of concurrent iterations exceeds the capacity limit, you will receive the error message you described.

There are a few workarounds that you can use:

  • Increase the available capacity. 
  • Run the ForEach loop sequentially. This will reduce the number of interactive jobs running simultaneously, but it will also make the pipeline slower.
    (screenshot: ForEach activity settings with the Sequential option enabled — vgchennamsft_0-1698773561174.png)

Hope this is helpful.

Anonymous
Not applicable

Hi @akanane ,

Glad to know you got some insights. Please continue using the Fabric Community in case of any queries.
