Is there a way for an admin to limit the resources a workspace or dataflow can consume? We had a dev workspace on our Premium capacity take down the entire capacity because it consumed too many resources, and then we had to wait an hour and a half for the throttling carryover to clear before functionality came back.
Any way to manage this OTHER than Autoscale?
The only thing I can think of is to limit how much memory the Dataflows workload is allowed to consume in your Premium capacity settings, so that a single dataflow can't eat all the memory.
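For a Premium Gen1 capacity you can set that cap in the Admin portal (Capacity settings > Workloads > Dataflows > Max Memory (%)), or script it with the REST API's "Capacities - Update Workload" endpoint. Below is a minimal sketch of the API route; the capacity GUID and bearer token are placeholders you'd supply yourself, and the caller needs capacity admin rights. Note that Gen2/Fabric capacities manage workload memory automatically, so this setting may not be exposed there.

```python
# Sketch: cap the Dataflows workload memory on a Premium (Gen1) capacity
# using the Power BI REST API "Capacities - Update Workload" endpoint.
import requests

CAPACITY_ID = "your-capacity-guid"      # placeholder
ACCESS_TOKEN = "your-aad-bearer-token"  # placeholder (Capacity.ReadWrite.All scope)

url = (
    "https://api.powerbi.com/v1.0/myorg/capacities/"
    f"{CAPACITY_ID}/Workloads/Dataflows"
)

# maxMemoryPercentageSetByUser caps how much of the capacity's memory
# the Dataflows workload may use (e.g. 30 = at most 30%).
payload = {"state": "Enabled", "maxMemoryPercentageSetByUser": 30}

resp = requests.patch(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
print("Dataflows workload capped at 30% of capacity memory")
```

This limits the workload as a whole rather than an individual workspace or dataflow, but it does stop one runaway refresh from starving everything else on the capacity.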