Is there a way for an admin to limit the resources a single workspace or dataflow can consume? We had a dev workspace in our Premium capacity take down the entire capacity because it consumed too many resources, and then we had to wait about an hour and a half for the throttling carryover to clear before we got functionality back.
Is there any way to manage this OTHER than autoscale?
The only thing I can think of is to set how much memory dataflows are allowed to consume in your Premium capacity settings, so that a single dataflow can't consume all the memory.
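If you're on a Gen1 Premium capacity, the per-workload memory cap can also be set programmatically via the Power BI Capacities REST API (Patch Workload) instead of the admin portal. Here's a minimal sketch, assuming you have a capacity-admin access token and your capacity ID (both placeholder values below); note that on Gen2/Fabric capacities per-workload memory settings work differently, so check whether this applies to your SKU:

```python
import requests

# Hypothetical placeholders: swap in your capacity ID and an Azure AD access
# token with capacity admin rights.
CAPACITY_ID = "00000000-0000-0000-0000-000000000000"
ACCESS_TOKEN = "<access token>"

url = (
    "https://api.powerbi.com/v1.0/myorg/capacities/"
    f"{CAPACITY_ID}/Workloads/Dataflows"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Cap the Dataflows workload at 30% of the capacity's memory so a single
# runaway dataflow refresh can't starve the rest of the capacity.
payload = {"state": "Enabled", "maxMemoryPercentageSetByUser": 30}

resp = requests.patch(url, headers=headers, json=payload)
resp.raise_for_status()
print("Dataflows workload updated:", resp.status_code)
```

This only caps the Dataflows workload as a whole; it won't limit one specific workspace, but it does stop a single dataflow refresh from eating the entire capacity.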