alfBI
Helper V

Power BI models and DF1 with exaggerated CU consumption

Dear colleagues,

 

After a deep assessment of the usage of our Fabric capacity, we have discovered that a group of users is consuming what looks like an exaggerated amount of resources. We have an F64 SKU capacity, and their models and dataflows (Gen1) are consuming 15% and 5% of our total daily capacity respectively (that is nearly a quarter of all our daily capacity).

 

These models/DF1s are supposed to consume only already existing corporate Gen1 dataflows (not included in the previous figures) and SharePoint sources (mainly Excel).

Does it make any sense to consume this enormous amount of daily CUs just to process data coming from already existing Gen1 dataflows and Excel files on SharePoint?

 

We are planning to move the existing corporate Gen1 dataflows to a Fabric lakehouse.

 

Would this significantly improve the performance of models currently consuming Gen1 dataflows, or should we review in depth what has been done on their side?

 

Thanks

Regards,

3 ACCEPTED SOLUTIONS
ribisht17
Super User

Hi @alfBI 

 

Transitioning your corporate Gen 1 dataflows to a Fabric Lakehouse has the potential to enhance performance, though the extent of improvement will vary based on several key considerations.

 

a. https://community.fabric.microsoft.com/t5/Service/Gen-2-dataflows-extremely-slow-vs-Gen-1-dataflow/m... 

 

b. Differences between Dataflow Gen1 and Dataflow Gen2 - Microsoft Fabric | Microsoft Learn 

 

c. Dataflows Gen 2 vs Gen 1  - iLink Digital 

 

Regards,

Ritesh

Community Champion

Please mark the answer if helpful so that it can help others

Dance-Sing with Data Projects - YouTube


v-nmadadi-msft
Community Support

Hi @alfBI  ,
Thanks for reaching out to the Microsoft Fabric community forum.

To address the high resource consumption, it's important to first audit the specific models and dataflows involved. This includes using the Fabric Capacity Metrics app to identify high-cost refreshes and profiling the transformation steps in Power Query to assess dataset complexity. Alongside this, evaluate the refresh frequency and scheduling, and coordinate refresh timings so that overlapping processes don't pile up and cause capacity spikes.
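To make "coordinate refresh timings" concrete, here is a minimal sketch of detecting overlapping refresh windows once you have exported the schedules (the item names and the minutes-past-midnight representation are hypothetical, not from any Fabric API):

```python
def overlapping_windows(schedules):
    """Find pairs of scheduled refresh windows that overlap.

    `schedules` is a hypothetical list of (name, start_min, end_min)
    tuples, with times expressed as minutes past midnight.
    """
    hits = []
    for i, (a, a0, a1) in enumerate(schedules):
        for b, b0, b1 in schedules[i + 1:]:
            if a0 < b1 and b0 < a1:  # the two intervals intersect
                hits.append((a, b))
    return hits

# Hypothetical example: two items refreshing in overlapping windows
sample = [("SalesModel", 120, 180), ("FinanceDF", 150, 210), ("HRModel", 300, 330)]
print(overlapping_windows(sample))  # [('SalesModel', 'FinanceDF')]
```

Staggering the overlapping pairs it reports is usually enough to flatten the worst capacity spikes.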
Additionally, verify whether the problematic dataflow exceeds the limit of 50 tables imposed by Power Query Online. If more than 50 tables are needed, it is recommended to split the dataflow into multiple dataflows to stay within this limit.
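If you want to check the table count without opening each dataflow, one option is a small script over the dataflow's exported definition. This sketch assumes the CDM model.json layout (tables as entries in a top-level "entities" array); the sample content is hypothetical:

```python
import json

# Power Query Online caps a single dataflow at 50 tables (entities).
MAX_TABLES = 50

def check_entity_count(model_json: str, limit: int = MAX_TABLES):
    """Return (count, over_limit) for a dataflow's exported model.json.

    Assumes the CDM model.json layout, where each table is an entry
    in the top-level "entities" array.
    """
    model = json.loads(model_json)
    count = len(model.get("entities", []))
    return count, count > limit

# Hypothetical dataflow definition with three entities:
sample = json.dumps({"entities": [{"name": "Sales"}, {"name": "Customers"}, {"name": "Dates"}]})
print(check_entity_count(sample))  # (3, False)
```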


If you find this post helpful, please mark it as the accepted solution and consider giving a KUDOS. Feel free to reach out if you need further assistance.
Thank you



rohit1991
Super User

Hi @alfBI ,

It does seem unusual for Power BI models and Gen1 dataflows to consume such a significant portion of your daily capacity—especially if they are only referencing existing corporate Gen1 dataflows and Excel files hosted on SharePoint. While some CU (Capacity Unit) consumption is expected during refresh and transformation processes, usage at the level you describe (15% for models and 5% for DF1) suggests that there may be inefficiencies in how the data is being processed. It's quite possible that the queries or transformations are overly complex, poorly optimized, or unnecessarily duplicating work during refreshes. 

 

Moving your corporate Gen1 dataflows to a Fabric Lakehouse could indeed lead to performance improvements, as Lakehouse provides a more scalable and performance-optimized architecture for large datasets. However, it would still be wise to conduct a thorough review of the models and dataflows in question. Look for issues like repeated joins, excessive calculated columns, use of nested functions in Power Query, or inefficient data types. Optimizing these aspects could reduce CU consumption significantly, regardless of the backend. Transitioning to a Lakehouse is a strategic move, but ensuring best practices in model and dataflow design is equally critical to sustained performance gains.


Did it work? ✔ Give a Kudo • Mark as Solution – help others too!


5 REPLIES
SaiTejaTalasila
Super User

Hi @alfBI ,

 

Download the refresh history (you can analyse the last 3-4 refreshes) and do some analysis on which table is taking more time, the number of rows, etc. This gives more details to identify the root cause of the issue.
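As a starting point for that analysis, here is a minimal sketch that ranks refreshes by duration. The records mimic the shape returned by the Power BI REST API (GET .../datasets/{id}/refreshes); the sample timestamps are hypothetical:

```python
from datetime import datetime

def refresh_durations(history):
    """Return (startTime, seconds) per completed refresh, slowest first.

    Each record carries ISO-8601 `startTime`/`endTime` strings and a
    `status`, as in the Power BI refresh-history payload.
    """
    parse = lambda s: datetime.fromisoformat(s.replace("Z", "+00:00"))
    rows = []
    for r in history:
        if r.get("status") != "Completed" or "endTime" not in r:
            continue  # skip failed or still-running refreshes
        seconds = (parse(r["endTime"]) - parse(r["startTime"])).total_seconds()
        rows.append((r["startTime"], seconds))
    return sorted(rows, key=lambda t: t[1], reverse=True)

# Hypothetical sample of the last few refreshes:
sample = [
    {"startTime": "2025-06-01T02:00:00Z", "endTime": "2025-06-01T02:45:00Z", "status": "Completed"},
    {"startTime": "2025-06-02T02:00:00Z", "endTime": "2025-06-02T02:05:00Z", "status": "Completed"},
    {"startTime": "2025-06-03T02:00:00Z", "status": "Failed"},
]
print(refresh_durations(sample))  # [('2025-06-01T02:00:00Z', 2700.0), ('2025-06-02T02:00:00Z', 300.0)]
```

A 45-minute refresh standing out against 5-minute siblings, as in the sample, is exactly the kind of outlier worth profiling table by table.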

 

 

I hope this helps.

 

Thanks,

Sai Teja 

v-nmadadi-msft
Community Support

Hi @alfBI 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems resolve them faster.

Thank you.

 

