Is there an easy way to track, over time, the number of nodes allocated to Spark jobs/notebooks?
We can obviously report on individual running jobs (Monitor) and on CU usage over time (Capacity Metrics), but one issue we're having is Spark jobs grabbing a large number of nodes and then causing the 'not enough nodes' error for other users. I'd like to monitor node allocations and ideally trigger alerts when we get close to the limit.
I'm already having to go on the hunt for rogue users - I'd like to catch it before it happens.
Hi @spencer_sa,
Perhaps you can try using a Real-Time Intelligence eventstream with an event or alert based on the number of idle Spark nodes:
Set alerts on Fabric workspace item events in Real-Time hub - Microsoft Fabric | Microsoft Learn
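Alongside the eventstream approach, a small polling script can estimate node usage from the Spark sessions currently running and warn before the pool is exhausted. This is a minimal sketch, not a tested solution: the `livySessions` endpoint path, the `maxNumberOfNodes` and `state` field names, and the way you obtain the bearer token are assumptions to verify against your tenant and the Fabric REST API docs.

```python
"""Sketch: poll Fabric's Spark Livy sessions and warn before the node
pool is exhausted. Endpoint path, response field names, and the
capacity figure are assumptions -- verify against your tenant."""
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"


def list_livy_sessions(workspace_id: str, token: str) -> list[dict]:
    # Assumed endpoint: GET /workspaces/{id}/spark/livySessions
    req = urllib.request.Request(
        f"{FABRIC_API}/workspaces/{workspace_id}/spark/livySessions",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("value", [])


def nodes_in_use(sessions: list[dict]) -> int:
    # 'maxNumberOfNodes' per running session is an assumed field name;
    # only sessions still in progress count against the pool.
    return sum(
        s.get("maxNumberOfNodes", 0)
        for s in sessions
        if s.get("state") == "InProgress"
    )


def should_alert(used: int, capacity: int, headroom: float = 0.2) -> bool:
    # True when fewer than `headroom` (default 20%) of nodes remain free.
    return capacity - used < capacity * headroom
```

You could schedule this from a Fabric notebook or a pipeline and push `should_alert` results to Teams or email; the threshold check is pure, so it is easy to test independently of the API call.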
Regards,
Xiaoxin Sheng