Hello,
I am running a long-running Spark application from a notebook. In the Spark UI I can see that not all available resources are being used, so I would like to repartition the data to match the maximum number of available cores. However, I don't know how to programmatically obtain the maximum available number of executors.
Please advise.
Hi @sparkie,
To get the Spark settings through the REST API, you can refer to the following document:
Workspace Settings - Get Spark Settings - REST API (Spark) | Microsoft Learn
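As a rough sketch, a call to that endpoint from a notebook could look like the following. The workspace ID and the way the bearer token is obtained are placeholders you would need to adapt to your own environment:

```python
# Sketch: query the Fabric "Get Spark Settings" REST endpoint for a workspace.
# workspace_id and the bearer token below are placeholders, not working values.
import requests

workspace_id = "<your-workspace-id>"          # placeholder: your workspace GUID
token = "<bearer-token-for-fabric-api>"       # placeholder: acquire a token valid for api.fabric.microsoft.com

url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/spark/settings"
response = requests.get(url, headers={"Authorization": f"Bearer {token}"})
response.raise_for_status()

# The JSON response describes the workspace-level Spark configuration,
# including the pool that determines how many nodes/cores a session can get.
print(response.json())
```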
If the above does not suit your requirement, you can also check the Spark compute and pool settings under 'Workspace settings':
Create custom Apache Spark pools in Fabric - Microsoft Fabric | Microsoft Learn
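If what you actually need is the parallelism of the session that is already running (so you can repartition to it), you can also read it from the SparkContext inside the notebook. This is only a sketch: `spark` is the session Fabric pre-creates in a notebook, `df` is a placeholder for your DataFrame, and the executor count uses an internal accessor (`_jsc`), so treat it as indicative rather than official:

```python
# Sketch: derive the currently allocated parallelism from the running session
# and repartition a DataFrame to it. `df` is a placeholder for your data.
sc = spark.sparkContext

# Total number of task slots (cores) across the currently allocated executors
total_cores = sc.defaultParallelism

# Executor count via the underlying Scala SparkContext; the driver is included
# in getExecutorMemoryStatus, hence the -1.
num_executors = sc._jsc.sc().getExecutorMemoryStatus().size() - 1

print(f"executors: {num_executors}, available cores: {total_cores}")

# Spread the data over all currently available cores
df = df.repartition(total_cores)
```

Note that with autoscaling or dynamic allocation these numbers reflect what is allocated at the moment you call them, not the maximum the pool could grow to.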
Regards,
Xiaoxin Sheng