
mjmar025
Helper I

Model Performance Enhancements?

Hi everyone,

 

I am looking to speed up a Power BI report whose model contains around 400k rows. I am pulling in the lowest level of data that my organization captures, but that level of detail is not needed in most of the objects on the report. This is what I am wondering:

 

Will I see a performance boost if I also pull aggregate tables into my model and connect those tables to the charts that don't need row-level detail, even if other objects on the same page do display row-level data?

 

I don't have a great understanding of the sequence of actions Power BI goes through to calculate the values that show up in each object. For example, if multiple objects on a page connect to the same table, does Power BI need to scan the table separately for each object, or only once? If it is the latter, I'd assume that aggregate tables wouldn't help.
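To clarify what I mean by an aggregate table: something like a calculated table that pre-summarises the fact data to the grain the summary objects need, e.g. (table and column names here are just placeholders, not my actual model):

Sales Agg =
SUMMARIZECOLUMNS (
    Facts[Month],          -- hypothetical grouping columns
    Facts[Region],
    "Total Amount", SUM ( Facts[Amount] )
)

I would then relate this table to the same dimensions and point the summary charts at it instead of the detail table.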

 

Thank you.

6 REPLIES

400k rows is a tiny model. It cannot be the number of rows that is the issue. It is more likely the number of columns, the way you have written your formulas, and/or your table structure.

 

How many tables?

How many columns are in your main table?

What is your most complex formula that is slow?



* Matt is an 8-time Microsoft MVP (Power BI) and author of the Power BI book Supercharge Power BI.

Thanks for the reply. I have 25 tables in the model: 4 main fact tables that add up to about 400k rows, several dimension tables, and several complementary tables with targets and such. The fact table with the most rows (200k) has about 24 columns.

 

Most of the relationships between these tables use integer keys.

 

I do have a bunch of calculated measures that reference multiple tables in the model - could this be the problem?

For example:

% of target = DIVIDE ( SUM ( Facts[Amount] ), SUM ( Targets[Amount] ) ) * 100

 

Or maybe it's just the number of tables and I should somehow consolidate them?

 

Thanks again.

It's very hard to say without seeing it. Every relationship has a cost, and the higher the cardinality of the related columns, the higher the cost. So you can improve things by removing high-cardinality relationships. This could be done by consolidating fact tables.

 

Options I think are worth considering include:

Removing columns in fact tables that are not used

Unpivoting columns in fact tables 

Consolidating fact tables together

 

E.g., if you can have one fact table with columns:

Type, Amount
Target, 5
Actual, 4
Other fact, 3
Etc., 8

 

then you can write:

Total Target = CALCULATE ( SUM ( Table[Amount] ), Table[Type] = "Target" )

Total Actual = CALCULATE ( SUM ( Table[Amount] ), Table[Type] = "Actual" )

Actual vs Target = DIVIDE ( [Total Actual], [Total Target] )
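If reshaping the source data in Power Query isn't an option, one rough way to get a consolidated table like the above is a calculated table, e.g. (a sketch only; Targets and Actuals are stand-ins for your own fact tables):

Consolidated Facts =
UNION (
    SELECTCOLUMNS ( Targets, "Type", "Target", "Amount", Targets[Amount] ),  -- label rows from each source
    SELECTCOLUMNS ( Actuals, "Type", "Actual", "Amount", Actuals[Amount] )
)

Doing the same reshaping upstream in Power Query (append the queries and add a Type column) is generally the better place for it, but the shape of the result is the same.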

 



* Matt is an 8-time Microsoft MVP (Power BI) and author of the Power BI book Supercharge Power BI.

My previous response was to your earlier post. Regarding visuals, yes, this can definitely slow things down. But it is not so much the number of visuals as the number of data points displayed. E.g., if you have 200k rows and 24 columns and you are rendering all of those numbers, the rendering will take time. If you instead summarise the numbers and render 10 data points, it will be fast. It is easy to test: just create a new page, add a single card with a single measure, add a slicer, and see how long it takes to slice.
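For that test, any trivial summary measure will do, e.g. (Facts[Amount] is a stand-in for one of your own columns):

Test Total = SUM ( Facts[Amount] )  -- a single cheap aggregation for the card

If the card and slicer respond instantly on the empty page, the time is likely going into rendering the busy page rather than into the model.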



* Matt is an 8-time Microsoft MVP (Power BI) and author of the Power BI book Supercharge Power BI.

As MattAllington already pointed out, 400k rows is a very small model. It shouldn't cause a performance issue. Do you have specific charts/tables that are slow? If so, what measures do you use in those visuals?

I used DAX Studio to trace my model, and the longest query is only 47 milliseconds. This makes me think the cause of the slowdown is having too many visuals on one page, not the model or any measures. Currently I have this on one page:

 

10 cards

3 gauges

2 bar charts

2 pie charts

2 bar charts

2 custom visuals

 

Is it possible that too many visuals are the cause here?

 

Thanks.

 
