Hello community,
Here is some sample data:
Id | TripId | Value | Value2
---|---|---|---
1 | 1 | 2,1 | 5
2 | 1 | 2,1 | 7
3 | 1 | 2,1 | 3
1 | 2 | 3,7 | 2
2 | 2 | 3,7 | 6
3 | 2 | 3,7 | 8
1 | 3 | 1,1 | 4
2 | 3 | 1,1 | 2
3 | 3 | 1,1 | 5
4 | 3 | 1,1 | 8
1 | 4 | 0,7 | 9
2 | 4 | 0,7 | 4
3 | 4 | 0,7 | 32
1 | 5 | 0,9 | 4
2 | 5 | 0,9 | 65
I am trying to sum all of the values in the [Value] column but only once for every TripId. My current measure looks like this:
SUMX(VALUES('Table'[TripId]), CALCULATE(MAX('Table'[Value])))
I have been getting some performance issues with this measure, unfortunately. I've tried doing some basic tests where I just put Year, Month, and the measure in a visual:
Period | Duration (ms) | Rows in selection
---|---|---
Single year | 650 | 5,347,756
Two years | 1,487 | 12,138,754
Three years | Visual has exceeded the available resources | 19,318,853
I understand that iterating over the table is worse for performance than a simple sum, but I have relatively few rows to work with (~33 million in total), so it feels like I must be missing something obvious to be getting such bad performance from such a simple measure.
Are there any tricks I can use, or perhaps a different DAX pattern? My only other idea is adding aggregation tables, but that seems excessive for something so basic and with such small data volumes.
Note that my calendar dimension is filtering this fact table through a date key and has a one-to-many relationship.
If I run a simple EVALUATE on just the measure, grouped by nothing, it completes in 1 ms for all of my data in DAX Studio.
The following query in DAX Studio runs in 2,900 ms:
EVALUATE
CALCULATETABLE (
    ADDCOLUMNS (
        VALUES ( 'Calendar'[Year] ),
        "Value", [Measure]
    )
)
2,900 ms is obviously very slow for such a measure, but I am surprised that the three-year selection cannot run at all, not even locally in Desktop.
Hi @BIanon ,
Based on the information you've provided, the performance bottleneck seems to be related to the way the measure is calculated. Try changing the SUMX to something like CALCULATE ( SUM (), FILTER () ) to see if that improves the speed. Here's a related discussion that will hopefully help you: DAX Performance issue: why is my SUMX() so slow? - Stack Overflow.
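For illustration only, here is a minimal sketch of the CALCULATE ( SUM (), FILTER () ) shape described above. It assumes a hypothetical flag column 'Table'[IsFirstRowOfTrip] that is TRUE on exactly one row per TripId; that column is not in your model, and without something like it this pattern would double-count [Value], so treat it as a shape rather than a drop-in replacement:

// Sketch only: 'Table'[IsFirstRowOfTrip] is a hypothetical column,
// TRUE on exactly one row per TripId, so [Value] is summed once per trip.
Trip Value (sketch) :=
CALCULATE (
    SUM ( 'Table'[Value] ),
    FILTER ( 'Table', 'Table'[IsFirstRowOfTrip] = TRUE () )
)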
In addition, performance also depends on filters and the data model, which is covered in the documentation: Optimization guide for Power BI - Power BI | Microsoft Learn.
Also, although you mentioned that adding an aggregation table seems excessive, for measures that aggregate large amounts of data an aggregation table can dramatically improve performance, so it is not a bad way to go.
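To make that concrete, here is a minimal sketch of a trip-grain aggregation table built as a calculated table. The names TripAgg, [TripValue], and 'Table'[DateKey] are assumptions, it assumes each TripId maps to a single date key, and the new table would need its own relationship to 'Calendar' for the year/month filtering to keep working:

// Sketch of a pre-aggregated table: one row per TripId (and its date key).
TripAgg =
ADDCOLUMNS (
    SUMMARIZE ( 'Table', 'Table'[TripId], 'Table'[DateKey] ),  // 'DateKey' is an assumed column name
    "TripValue", CALCULATE ( MAX ( 'Table'[Value] ) )          // [Value] is constant per trip
)

// The measure then becomes a plain SUM over the much smaller table:
Trip Value (agg) := SUM ( TripAgg[TripValue] )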
If this does not address your scenario, please clarify in a follow-up reply.
Best Regards,
Clara Gong
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
@BIanon , the measure seems fine, but try this:
SUMX (
    SUMMARIZE ( 'Table', 'Table'[TripId], "_1", MAX ( 'Table'[Value] ) ),
    [_1]
)
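A short note on why this can help: SUMMARIZE first materializes one row per TripId, so the outer SUMX only iterates that small result instead of triggering a CALCULATE context transition for every trip. To compare timings, the DAX Studio query from the original post could be rerun with the expression inlined, as sketched below (the CALCULATE wrapper is added here so the year from the row context is still applied as a filter):

EVALUATE
CALCULATETABLE (
    ADDCOLUMNS (
        VALUES ( 'Calendar'[Year] ),
        "Value",
            CALCULATE (
                SUMX (
                    SUMMARIZE ( 'Table', 'Table'[TripId], "_1", MAX ( 'Table'[Value] ) ),
                    [_1]
                )
            )
    )
)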