Hello,
I am executing a long-running Spark application from a notebook. In the Spark UI I can see that not all available resources are being used, so I would like to repartition the data to match the maximum number of available cores. However, I don't know how to programmatically obtain the maximum number of available executors.
Please advise.
Hi @sparkie,
To retrieve the Spark settings via the REST API, you can refer to the following document:
Workspace Settings - Get Spark Settings - REST API (Spark) | Microsoft Learn
If that does not suit your requirement, you can also check the Spark compute and pool settings under 'Workspace settings':
Create custom Apache Spark pools in Fabric - Microsoft Fabric | Microsoft Learn
Regards,
Xiaoxin Sheng