Hi everyone,
I am looking to speed up a Power BI report that I have with a model that contains around 400k rows. I am pulling in the lowest level of data that my organization captures, but that level is not needed in most of the objects on the report. This is what I am wondering:
Will I see a performance boost if I also pull aggregate tables into my model and use those tables to connect to the charts that don't need row level detail - even if there are objects that do display row level data on the same page?
I don't have a great understanding of the sequence of actions that Power BI goes through to calculate the values shown in each object. For example, if there are multiple objects on a page that connect to the same table, does Power BI need to scan the table separately for each object, or only once? If it's the latter, I'd assume that aggregate tables wouldn't help.
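To clarify what I mean by an aggregate table, I'm picturing a summarized copy of the fact table, e.g. built as a calculated table roughly like this (the table and column names below are just placeholders, not my real fields):
// Placeholder names - Facts, CategoryKey, DateKey and Amount stand in for my real columns
FactsAggregated =
SUMMARIZECOLUMNS (
    Facts[CategoryKey],
    Facts[DateKey],
    "Total Amount", SUM ( Facts[Amount] )
)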
Thank you.
It’s very hard to say without seeing it. Every relationship has a cost, and the cost rises with the cardinality of the related columns. So you can improve things by removing high-cardinality relationships. This could be done by consolidating fact tables.
Options I think worthy of consideration include
Removing columns in fact tables that are not used
Unpivoting columns in fact tables
Consolidating fact tables together
E.g. if you can have one fact table with two columns, type and amount, containing rows like:
type, amount
target, 5
actual, 4
other fact, 3
etc, 8
then you can write
Total target = CALCULATE ( SUM ( 'table'[amount] ), 'table'[type] = "target" )
Total actual = CALCULATE ( SUM ( 'table'[amount] ), 'table'[type] = "actual" )
DIVIDE ( [Total actual], [Total target] )
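If it helps, one way to build that kind of consolidated table is a calculated table along these lines (just a sketch - Targets and Actuals are stand-ins for your real fact tables, which I haven't seen):
// Targets and Actuals are placeholder table names; adjust to your model
Combined facts =
UNION (
    SELECTCOLUMNS ( Targets, "type", "target", "amount", Targets[Amount] ),
    SELECTCOLUMNS ( Actuals, "type", "actual", "amount", Actuals[Amount] )
)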
400k rows is a tiny model. It cannot be the number of rows that is the issue. It is more likely the number of columns, and/or the way you have written your formulas, and/or your table structure.
How many tables?
How many columns in your main table?
What is your most complex formula that is slow?
Thanks for the reply. I have 25 tables in the model - 4 main fact tables that add up to about 400k rows, several dimension tables, and several complementary tables with targets and such. The fact table with the most rows (200k) has about 24 columns.
The majority of the joins between these tables are done via int keys.
I do have a bunch of calculated measures that reference multiple tables in the model - could this be the problem?
For example:
% of target = DIVIDE ( SUM ( Facts[Amount] ), SUM ( Targets[Amount] ) ) * 100
Or maybe it's just the number of tables and I should somehow consolidate them?
Thanks again.
My previous response was to your earlier post. Regarding visuals, yes - this can definitely slow things down. But it is not so much the number of visuals as the number of data points displayed. E.g. if you have 200k rows and 24 columns and you are rendering all of those numbers, the rendering will take time. If you instead summarise the numbers and render 10 data points, it will be fast. It is easy to test: just create a new page, add a single card with a single measure, add a slicer, and see how long it takes to slice.
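For the test card, any trivial measure will do, e.g. something like this (Facts is just a guess at your table name):
// Facts[Amount] is a placeholder - use any numeric column from your model
Total Amount = SUM ( Facts[Amount] )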
As MattAllington already pointed out, 400k is a very small model. It shouldn't cause a performance issue. Do you have specific charts/tables which are slow? If yes, what measures do you use in those visuals?
I used DAX Studio to trace my model and the longest query is only 47 milliseconds. This is making me think that the cause of the slowdown is having too many visuals on one page, rather than the model or any measures. Currently I have this on one page:
10 Cards
3 gauges
2 bar charts
2 pie charts
2 bar charts
2 custom visuals
Is it possible that too many visuals is the cause here?
Thanks.