Currently, Python notebooks are subject to the same concurrency limits as Spark notebooks:
Concurrency limits and queueing in Apache Spark for Fabric - Microsoft Fabric | Microsoft Learn
In my experience this means that on an F2 you cannot run more than 4 notebook sessions at once via a pipeline. For Spark this limit is reasonable: the higher node size floor means four sessions will likely consume 100% of an F2. Python notebooks, however, run with 2 vCores by default, so the same capacity could host far more sessions at once. The shared limit caps how many jobs can run concurrently and increases ETL durations.
Ideally the queue limit for Python notebooks would be double the Spark notebook queue (e.g. 8 on an F2), or it could be a blended limit (e.g. 2 Spark notebooks plus 4 Python notebooks).
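The scaling idea above can be sketched as a small calculation: size the queue limit by per-session vCores rather than a flat per-SKU count. The per-session Spark vCore figure below is an illustrative assumption (not a documented Fabric value); the 4-session F2 limit and the 2-vCore Python default come from the post above.

```python
SPARK_QUEUE_LIMIT_F2 = 4    # current flat limit observed on an F2
SPARK_SESSION_VCORES = 4    # assumed vCores a default Spark session reserves
PYTHON_SESSION_VCORES = 2   # default vCores for a Python notebook session

def scaled_queue_limit(spark_limit: int, spark_vcores: int, session_vcores: int) -> int:
    """Scale the queue limit so lighter sessions get proportionally more slots."""
    return spark_limit * spark_vcores // session_vcores

print(scaled_queue_limit(SPARK_QUEUE_LIMIT_F2, SPARK_SESSION_VCORES, PYTHON_SESSION_VCORES))
# → 8, matching the "double the Spark queue" suggestion for an F2
```

Under these assumptions the same rule would also produce sensible blended limits on larger SKUs, since the budget scales with capacity rather than being hard-coded per notebook type.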