Currently, Python notebooks are subject to the same concurrency limits as Spark notebooks:
Concurrency limits and queueing in Apache Spark for Fabric - Microsoft Fabric | Microsoft Learn
From my experience, this means that on an F2 you cannot run more than 4 notebook sessions at once via a pipeline. That limit is fine for Spark, given the higher node-size floor: four Spark sessions will likely consume 100% of an F2. However, because Python notebooks run with 2 vCores by default, far more sessions could fit in the same capacity. The shared limit caps how many jobs can run concurrently and increases ETL durations.
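As a rough back-of-envelope illustration of the gap: the 2-vCore Python default and the queue limit of 4 are from the behaviour described above, while the 4-vCore Spark session footprint is an assumed stand-in for the "higher node-size floor", not a figure from the concurrency doc.

```python
# Rough utilization math behind the point above. The 4-vCore Spark floor is
# an assumption for illustration; the 2-vCore Python default and the queue
# limit of 4 are as described in the post.

QUEUE_LIMIT = 4
SPARK_SESSION_VCORES = 4   # assumed smallest Spark session footprint
PYTHON_SESSION_VCORES = 2  # default for Python notebooks

spark_demand = QUEUE_LIMIT * SPARK_SESSION_VCORES    # 16 vCores at the limit
python_demand = QUEUE_LIMIT * PYTHON_SESSION_VCORES  # 8 vCores at the same limit

# Python notebooks hit the queue limit at half the vCore demand, so jobs
# queue while capacity sits idle:
print(f"Spark at limit: {spark_demand} vCores; Python at limit: {python_demand} vCores")
```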
Ideally, the queue limit for Python notebooks would be double the Spark notebook limit (e.g. 8 on an F2), or a blended limit could apply (e.g. 2 Spark notebooks plus 4 Python notebooks).
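A minimal sketch of what such a blended limit could look like. Everything here is hypothetical, not Fabric's actual admission logic: a Python session is assumed to weigh half a Spark session against the queue, reflecting its 2-vCore default versus a larger Spark footprint.

```python
# Hypothetical blended admission check: Python sessions count as half a
# queue slot. Weights and logic are illustrative assumptions only.

SPARK_WEIGHT = 1.0    # one Spark session consumes one queue slot
PYTHON_WEIGHT = 0.5   # assumed: a 2-vCore Python session counts as half a slot
QUEUE_LIMIT_F2 = 4    # queue limit for an F2 as described above

def can_admit(running_spark: int, running_python: int, new_is_python: bool) -> bool:
    """Admit a new session if the weighted total stays within the queue limit."""
    used = running_spark * SPARK_WEIGHT + running_python * PYTHON_WEIGHT
    new = PYTHON_WEIGHT if new_is_python else SPARK_WEIGHT
    return used + new <= QUEUE_LIMIT_F2

# 8 pure-Python sessions fit (8 * 0.5 = 4), matching the doubled limit above:
print(can_admit(0, 7, new_is_python=True))   # True  -> 8th Python session admitted
# The blended case: 2 Spark + 4 Python exactly fills the limit (2 + 2 = 4):
print(can_admit(2, 3, new_is_python=True))   # True  -> 4th Python session admitted
print(can_admit(2, 4, new_is_python=True))   # False -> anything more is queued
```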