Python notebooks should have higher concurrency limits than Spark notebooks

Currently, Python notebooks are subject to the same concurrency limits as Spark notebooks:

Concurrency limits and queueing in Apache Spark for Fabric - Microsoft Fabric | Microsoft Learn


In my experience, this means that on an F2 you cannot run more than four notebook sessions at once via a pipeline. That limit is reasonable for Spark, where the higher node-size floor means four sessions will likely consume 100% of an F2. However, because Python notebooks run with 2 vCores by default, far more of them could run at once on the same capacity. Applying the shared limit to Python notebooks therefore caps how many jobs can run in parallel and increases ETL durations.
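The capacity arithmetic behind this can be sketched as follows. Note the vCore figures are hypothetical placeholders chosen only to illustrate the reasoning, not official F2 sizing; the point is simply that a session needing fewer vCores should admit more concurrent sessions.

```python
def max_sessions(capacity_vcores: int, vcores_per_session: int) -> int:
    """Sessions that would fit in a capacity if only vCore usage mattered."""
    return capacity_vcores // vcores_per_session

# Hypothetical illustration: suppose a small capacity exposes 16 burstable
# vCores. A Spark session with an assumed 8-vCore node-size floor fills it
# after 2 sessions, while 2-vCore Python sessions leave room for 8 -- yet a
# single shared concurrency limit would cap both at the same count.
spark_fit = max_sessions(16, 8)   # sessions that fit with an 8-vCore floor
python_fit = max_sessions(16, 2)  # sessions that fit at 2 vCores each
print(spark_fit, python_fit)
```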


Ideally, the queue limit for Python notebooks would be double the Spark notebook queue, e.g. 8 on an F2. Alternatively, it could be a blended limit, e.g. 2 Spark notebooks plus 4 Python notebooks.

Status: New