JosueMolina
Helper III

Notebooks stuck in Starting state in Data Pipelines

Has anybody been experiencing an issue where Notebooks get stuck in the Starting state when looking at a Data Pipeline run, but once you open the Spark application for that Notebook run, it shows as either Stopped at 20 minutes or Failed at 2 or 3 minutes?

I believe the Stopped or Failed status comes from too many concurrent notebooks (I was under the impression the notebooks would be queued, but that's a separate issue). My main problem is that the Pipeline still shows those Notebook activities as In Progress (with the cluster showing as Starting), so any Notebook activities that follow will never run. I can work around this by setting a timeout on those activities, but I assume this is a Pipeline bug, since this behavior was not common until recently.
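For anyone wanting the timeout workaround: it can be set per Notebook activity (Settings > Timeout in the pipeline editor, or in the pipeline's JSON definition). A rough sketch of what the activity looks like with a 30-minute timeout and one retry; the `type` name and `policy` fields here follow the ADF-style JSON that Fabric pipelines expose, so treat them as an assumption and verify against your own pipeline's JSON view:

```json
{
  "name": "Run transform notebook",
  "type": "TridentNotebook",
  "policy": {
    "timeout": "0.00:30:00",
    "retry": 1,
    "retryIntervalInSeconds": 120
  },
  "typeProperties": {
    "notebookId": "<notebook-guid>",
    "workspaceId": "<workspace-guid>"
  }
}
```

With this in place, a stuck activity at least fails after 30 minutes instead of blocking the downstream activities indefinitely.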

7 REPLIES
JhonatanD
New Member

I managed to stop mine from getting stuck by deactivating this option:

 

[screenshot: JhonatanD_0-1733834711898.png]

 

Right, but I actually need High Concurrency turned on, as we depend on it to run a set of small notebooks in parallel.
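For context, the parallel fan-out we depend on looks roughly like this. This is a generic Python sketch with the actual notebook call stubbed out (in Fabric it would be something like `notebookutils.notebook.run` inside a high-concurrency session), so treat the names as placeholders:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_notebook(name: str, timeout_s: int = 1200) -> str:
    # Stand-in for the real Fabric notebook call (hypothetical here);
    # stubbed out so the sketch runs anywhere.
    return f"{name}: ok"

def run_all(names):
    """Run all notebooks concurrently and collect their results by name."""
    results = {}
    with ThreadPoolExecutor(max_workers=len(names)) as pool:
        futures = {pool.submit(run_notebook, n): n for n in names}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results

# e.g. run_all(["nb_load", "nb_clean", "nb_aggregate"])
```

Each small notebook shares the same high-concurrency session, which is exactly why turning the option off is not an option for us.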

Anonymous
Not applicable

Hi @JosueMolina ,

Can you please share more detailed information about this issue? It will help us clarify your scenario and test it to troubleshoot.

Regards,

Xiaoxin Sheng

Hi, this usually happens when we have multiple notebooks running (big team with regular batch jobs).

This is an example where you can see the Activity has been in progress for 3 minutes:

[screenshot: JosueMolina_0-1734707351285.png]

but clicking on the Notebook activity will open up this:
 

[screenshot: JosueMolina_1-1734707391481.png]

You can see the Notebook is still in the Starting state. If I leave it, it will stay like this for hours and never execute anything.

If I look for the Notebook snapshot, it shows there is no job, the job failed, or the job was stopped, despite the Pipeline still showing that Notebook as In Progress. This does seem to happen for Notebooks in workspaces with High Concurrency in Pipelines turned on, but I haven't confirmed whether it's exclusive to that or just more common there. Either way, it defeats the purpose of that feature.

Got a perfect example today. Mind you, we get multiple notebooks with this same scenario. 

Stuck In Progress within Pipeline.

[screenshot: JosueMolina_0-1734716024462.png]

Activity Details show it's still Starting 

[screenshot: JosueMolina_1-1734716076649.png]

Spark Monitoring shows the Spark job actually timed out at 20 minutes:

[screenshot: JosueMolina_2-1734716113177.png]


This is all for the same Notebook run within a Pipeline.

What sort of detail?

As mentioned, a Notebook activity will show as In Progress inside a Data Pipeline, but viewing the specific Spark application shows it either failed or was stopped. I believe this happens because too many notebook sessions are running at once, although I understand extra Notebooks are supposed to be queued, not stuck or failed.

I am running this on an F16 capacity as part of our data pipeline.
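In the meantime, the only reliable guard we've found is an explicit deadline around the run. As a generic sketch of the pattern (`get_status` here is a hypothetical callable you would back with whatever monitoring you have available; it is not a Fabric API):

```python
import time

def wait_for_terminal_state(get_status, timeout_s=1200, poll_s=30,
                            clock=time.monotonic, sleep=time.sleep):
    """Poll get_status() until the run leaves 'Starting'/'InProgress',
    or return 'TimedOut' once timeout_s elapses, so downstream work
    is never blocked forever by a run stuck in Starting."""
    deadline = clock() + timeout_s
    while clock() < deadline:
        status = get_status()
        if status not in ("Starting", "InProgress"):
            return status
        sleep(poll_s)
    return "TimedOut"
```

This is essentially what the activity-level timeout does for us today, just made explicit; the pipeline itself should be doing this reconciliation.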

JhonatanD
New Member

My workspaces look like this, too
