Hello,
I am executing a long-running Spark application from a notebook. In the Spark UI I can see that not all available resources are being used, so I would like to repartition the data across the maximum available cores. However, I don't know how to programmatically obtain the maximum available number of executors.
Please advise.
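For context, here is roughly what I have today (a minimal sketch; `spark` is the session predefined in the notebook, and the range DataFrame is just a stand-in for my real data):

```python
# Sketch of the current state: the partition count is hard-coded,
# but I want it to follow the maximum cores available to the session.
df = spark.range(0, 1_000_000)   # stand-in for the real data
num_partitions = 200             # <- hard-coded today; should be the max available cores
df = df.repartition(num_partitions)
print(df.rdd.getNumPartitions())
```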
Hi @sparkie,
To get the Spark settings through the REST API, you can refer to the following document:
Workspace Settings - Get Spark Settings - REST API (Spark) | Microsoft Learn
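As a rough sketch of what calling that endpoint can look like from Python: the workspace id and bearer token below are placeholders you must supply yourself, and the exact URI and response field names should be verified against the linked documentation.

```python
import requests

workspace_id = "<your-workspace-guid>"             # placeholder
token = "<an AAD bearer token scoped for Fabric>"  # placeholder; obtain via your own auth flow

# Endpoint path as described in the "Get Spark Settings" doc linked above.
url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/spark/settings"

resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

settings = resp.json()
# The payload describes the workspace-level Spark/pool configuration
# (default pool, node sizes, autoscale limits, etc.); inspect it to see
# how far your pool can scale. Exact key names are listed in the doc.
print(settings)
```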
If the above is not suitable for your requirement, you can also check the Spark compute and pool settings under 'Workspace settings':
Create custom Apache Spark pools in Fabric - Microsoft Fabric | Microsoft Learn
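If you just want a number to pass to repartition() from inside the running notebook session, one common approach is sketched below. Note that with autoscale or dynamic allocation it reflects the executors currently registered with the driver, not the configured maximum.

```python
# Inside a PySpark notebook, `spark` is already defined.
sc = spark.sparkContext

# Total task slots (cores) across the executors currently registered with the driver.
total_cores = sc.defaultParallelism

# Executor settings, if they are explicitly set on this session.
print("spark.executor.cores:", spark.conf.get("spark.executor.cores", "not set"))
print("spark.executor.instances:", spark.conf.get("spark.executor.instances", "not set"))
print("defaultParallelism:", total_cores)

# Repartition a DataFrame to match the currently available cores
# (`df` is a placeholder for your own DataFrame).
# df = df.repartition(total_cores)
```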
Regards,
Xiaoxin Sheng