CGrunberg
Advocate I

Fabric Dataflow Gen2: issues with exceeding capacity units and being throttled

Hi,


My company is currently testing a simple data warehouse setup in Microsoft Fabric, using blob storage for the bronze and silver layers and a Fabric warehouse for the gold layer.

In the initial tests of the setup, we are only working with ~10 relatively small tables. However, even with this low number of tables and little data, we are running into issues with exceeding the available resources of our F4 SKU. When the background % exceeds 100%, subsequent tasks are heavily throttled/smoothed or even blocked. This means I cannot even open the data warehouse or work on anything else in the same capacity while waiting, sometimes for hours, for the throttling to clear.

 

I have installed the Fabric Capacity Metrics app and can see that the activity exceeding our limit is a Dataflow Gen2. In our Power BI Pro setup, we have far more data without ever running into issues.

 

I need to find a way to work around this throttling so I can even test and build the new setup, so I would appreciate help with the following questions:

  • How does a Power BI Pro workspace compare to an F4 SKU?
  • Is there any way to limit how many capacity units a Dataflow Gen2 is allowed to use?
  • Is there any way to increase throttling/smoothing so that it also slows down currently running queries and never/rarely exceeds the 100% cap?
  • Are there any best practices I might be missing? I've read somewhere that it might help if the dataflow stages the transformed data in a data lake, and then a simple Copy activity moves it from the lake into the data warehouse?

 

Thanks in advance.

2 REPLIES
HimanshuS-msft
Microsoft Employee

Hello @CGrunberg 
Thanks for using the Fabric community.

Let me help you with what I know 🙂. There is a document on warehouse performance guidelines that you can use: https://learn.microsoft.com/en-us/fabric/data-warehouse/guidelines-warehouse-performance

You are correct: the Copy activity gives us the maximum throughput.
https://learn.microsoft.com/en-us/fabric/data-warehouse/ingest-data#decide-which-data-ingestion-tool...

  • Use the COPY (Transact-SQL) statement for code-rich data ingestion operations, for the highest data ingestion throughput possible, or when you need to add data ingestion as part of a Transact-SQL logic. For syntax, see COPY INTO (Transact-SQL).
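
To make the recommendation above concrete, here is a minimal sketch of a COPY INTO statement that ingests staged Parquet files from blob storage into a warehouse table. The table name, storage account URL, path, and SAS token are all placeholders, not details from this thread; adjust them to your own bronze/silver staging layout.

```sql
-- Hypothetical example: bulk-load staged Parquet files into a gold-layer table.
-- Replace the table name, storage URL, wildcard path, and SAS secret with your own.
COPY INTO [dbo].[SalesGold]
FROM 'https://<yourstorageaccount>.blob.core.windows.net/silver/sales/*.parquet'
WITH (
    FILE_TYPE = 'PARQUET',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>')
);
```

Staging transformed data in the lake and loading it with COPY INTO (or a pipeline Copy activity) keeps the heavy transformation work out of the warehouse ingestion step, which is the pattern the original question was asking about.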


Thanks,
Himanshu

Hi @HimanshuS-msft, thanks for your reply. It is helpful to know how to optimize the speed of the warehouse. Would you happen to know anything about managing capacity units? No matter how much I optimize the warehouse, it doesn't help much if I keep getting locked out of it for 24 hours 😅
