Is there a way for an admin to limit the resources a workspace or dataflow can consume? We had a dev workspace on our Premium capacity take down the entire capacity because it consumed too many resources, and then we had to wait an hour and a half for the throttling carryover to clear before functionality came back.
Is there any way to manage this OTHER than autoscale?
The only thing I can think of is to limit how much memory the Dataflows workload can consume on your Premium capacity, so that a single dataflow can't eat all the memory.
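As a rough sketch of how that cap could be set programmatically (assuming your capacity still exposes per-workload settings and you have an admin access token; the capacity GUID, token, and the 30% figure below are placeholders, and the "Capacities - Patch Workload" REST endpoint is what I believe applies here):

```python
# Sketch: cap the Dataflows workload's max memory on a Premium capacity
# via the Power BI REST "Capacities - Patch Workload" endpoint.
# Assumptions: an Azure AD access token with capacity admin rights is
# acquired separately (e.g. via MSAL), and CAPACITY_ID is your capacity GUID.

import requests

ACCESS_TOKEN = "<aad-access-token>"   # placeholder
CAPACITY_ID = "<capacity-guid>"       # placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/capacities/"
    f"{CAPACITY_ID}/Workloads/Dataflows"
)

# Limit the Dataflows workload to, say, 30% of capacity memory so a
# runaway dataflow refresh can't exhaust the whole capacity.
payload = {
    "state": "Enabled",
    "maxMemoryPercentageSetByUser": 30,
}

resp = requests.patch(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
print("Dataflows workload updated:", resp.status_code)
```

The same setting, where available, can also be changed in the admin portal under the capacity's workload settings instead of the API.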