Hello everyone,
I have a dataset where the value for each Consumer Code is duplicated for every ID in my database. I want a measure that returns the unique Amount value for each Code and each month, so I can sum those values and get the total per code over the dates selected by the user. For the data below, what I expect is a measure such that, when the user selects a date in the slicer (e.g. 20/10/2024), the values returned are:
Code 287: an amount equal to (8000 + 19000) = 27000
Code 465: an amount equal to (12801.18 + 43373.8) = 56174.98
So the results must only cover dates before the one the user selects, and must take the average of the AMOUNT column (which, because the value repeats within a month, equals the single unique value).
The code I'm using right now is something similar to this, but it's not working as expected:
YearToDateAverage =
CALCULATE(
    VAR PeriodAverages =
        ADDCOLUMNS(
            FILTER(
                VALUES(DATA[DATE]),
                DATA[DATE] <= MAX(DATA[DATE])
            ),
            "PeriodAvg",
            CALCULATE(
                AVERAGEX(
                    FILTER(
                        DATA,
                        NOT(ISBLANK(DATA[AMOUNT]))
                    ),
                    DATA[AMOUNT]
                ),
                FILTER(
                    ALL(DATA),
                    DATA[CODE] = SELECTEDVALUE(DATA[CODE]) &&
                    DATA[DATE] = EARLIER(DATA[DATE]) &&
                    YEAR(DATA[DATE]) = YEAR(MAX(DATA[DATE]))
                )
            )
        )
    VAR PeriodCount = COUNTROWS(PeriodAverages)
    VAR OverallAverage = AVERAGEX(PeriodAverages, [PeriodAvg])
    RETURN
        OverallAverage,
    REMOVEFILTERS( -- used to remove filters from the slicers
        DATA[DATE],
        DATA[YEAR],
        DATA[DATE_MONTH]
    )
)
Date | ID | AMOUNT | CODE |
05/09/2024 07:00 | 15446213 | 8000 | 287 |
06/09/2024 07:00 | 2425691 | 8000 | 287 |
06/09/2024 07:00 | 15923576 | 8000 | 287 |
06/09/2024 07:00 | 12443095 | 12801.18 | 465 |
06/09/2024 07:00 | 744922.5 | 12801.18 | 465 |
06/09/2024 07:00 | 9731083 | 12801.18 | 465 |
08/09/2024 07:00 | 3051668 | 12801.18 | 465 |
10/09/2024 07:00 | 2411277 | 12801.18 | 465 |
10/09/2024 07:00 | 8674816 | 12801.18 | 465 |
11/09/2024 07:00 | 2546245 | 8000 | 287 |
13/09/2024 07:00 | 10229042 | 12801.18 | 465 |
13/09/2024 07:00 | 15755668 | 12801.18 | 465 |
13/09/2024 07:00 | 2017874 | 12801.18 | 465 |
13/09/2024 07:00 | 3126478 | 12801.18 | 465 |
13/09/2024 07:00 | 7406693 | 12801.18 | 465 |
13/09/2024 07:00 | 8057878 | 12801.18 | 465 |
13/09/2024 07:00 | 10893955 | 12801.18 | 465 |
13/09/2024 07:00 | 4356376 | 12801.18 | 465 |
13/09/2024 07:00 | 724419 | 12801.18 | 465 |
13/09/2024 07:00 | 1375314 | 12801.18 | 465 |
13/09/2024 07:00 | 14852419 | 12801.18 | 465 |
13/09/2024 07:00 | 12256065 | 12801.18 | 465 |
13/09/2024 07:00 | 11124722 | 12801.18 | 465 |
18/09/2024 07:00 | 4904784 | 12801.18 | 465 |
19/09/2024 07:00 | 2101887 | 12801.18 | 465 |
19/09/2024 07:00 | 562446.2 | 12801.18 | 465 |
20/09/2024 07:00 | 10803079 | 12801.18 | 465 |
20/09/2024 07:00 | 8847007 | 12801.18 | 465 |
21/09/2024 07:00 | 983438.3 | 12801.18 | 465 |
24/09/2024 07:00 | 12598677 | 12801.18 | 465 |
24/09/2024 07:00 | 9603358 | 12801.18 | 465 |
24/09/2024 07:00 | 4641095 | 12801.18 | 465 |
25/09/2024 07:00 | 4010836 | 12801.18 | 465 |
26/09/2024 07:00 | 13825778 | 12801.18 | 465 |
26/09/2024 07:00 | 9984293 | 12801.18 | 465 |
26/09/2024 07:00 | 8226882 | 12801.18 | 465 |
26/09/2024 07:00 | 9005921 | 12801.18 | 465 |
01/10/2024 07:00 | 5589874 | 43373.8 | 465 |
01/10/2024 07:00 | 12890181 | 43373.8 | 465 |
01/10/2024 07:00 | 9040618 | 43373.8 | 465 |
01/10/2024 07:00 | 15628785 | 43373.8 | 465 |
01/10/2024 07:00 | 672593 | 19000 | 287 |
01/10/2024 07:00 | 9060820 | 19000 | 287 |
02/10/2024 07:00 | 13169360 | 19000 | 287 |
04/10/2024 07:00 | 10903322 | 43373.8 | 465 |
17/10/2024 07:00 | 2318524 | 43373.8 | 465 |
17/10/2024 07:00 | 13430471 | 43373.8 | 465 |
18/10/2024 07:00 | 3331089 | 43373.8 | 465 |
22/10/2024 07:00 | 5876561 | 43373.8 | 465 |
22/10/2024 07:00 | 6470726 | 43373.8 | 465 |
22/10/2024 07:00 | 7406241 | 43373.8 | 465 |
22/10/2024 07:00 | 946373.9 | 43373.8 | 465 |
23/10/2024 07:00 | 10169340 | 43373.8 | 465 |
27/10/2024 07:00 | 4730362 | 43373.8 | 465 |
29/10/2024 07:00 | 10436420 | 43373.8 | 465 |
31/10/2024 07:00 | 1555396 | 43373.8 | 465 |
31/10/2024 07:00 | 8163260 | 43373.8 | 465 |
31/10/2024 07:00 | 314187.3 | 43373.8 | 465 |
03/11/2024 07:00 | 15470032 | 20000 | 287 |
04/11/2024 08:00 | 12730697 | 48128.07 | 465 |
08/11/2024 08:00 | 5914064 | 48128.07 | 465 |
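The expected figures can be sanity-checked outside Power BI. Below is a minimal Python sketch (not DAX; the variable names are my own, not part of the model) over a representative subset of the rows above: keep one AMOUNT per (CODE, month) for dates on or before the slicer date, then sum those unique values per code.

```python
# Sanity check of the expected figures using a representative subset of
# the sample rows (dates as YYYY-MM-DD; IDs omitted, they don't affect
# the result). The logic: one unique AMOUNT per (CODE, month) on or
# before the selected date, summed per CODE.
from collections import defaultdict

rows = [  # (date, amount, code)
    ("2024-09-05", 8000.00, 287),
    ("2024-09-06", 8000.00, 287),
    ("2024-09-06", 12801.18, 465),
    ("2024-10-01", 43373.80, 465),
    ("2024-10-01", 19000.00, 287),
    ("2024-10-02", 19000.00, 287),
    ("2024-10-17", 43373.80, 465),
    ("2024-10-22", 43373.80, 465),  # after the slicer date, must be excluded
]

selected_date = "2024-10-20"  # the slicer selection from the question

# Collect one (month, amount) pair per code; the set removes duplicates.
unique_per_month = defaultdict(set)
for date, amount, code in rows:
    if date <= selected_date:                       # ISO dates compare as strings
        unique_per_month[code].add((date[:7], amount))  # key by YYYY-MM

totals = {code: round(sum(a for _, a in pairs), 2)
          for code, pairs in unique_per_month.items()}
print(totals)  # -> {287: 27000.0, 465: 56174.98}
```

This matches the totals requested in the question: 27000 for code 287 and 56174.98 for code 465.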
Maybe you can try to create a new column to get the average first.
Proud to be a Super User!
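The "new column first" idea can be illustrated outside DAX: precompute a helper column that carries the duplicated AMOUNT only on its first occurrence per (CODE, month), so a plain sum over that column gives the unique-amount total per code. A minimal Python sketch (in Power BI this helper would be a calculated column; all names here are illustrative):

```python
# Illustrative sketch of the "new column" approach: keep each (CODE, month)
# amount only on its first occurrence, zero elsewhere, so a plain sum of
# the helper column yields the unique-amount total per code.
rows = [  # (date, amount, code), a representative subset of the sample data
    ("2024-09-05", 8000.00, 287),
    ("2024-09-06", 8000.00, 287),
    ("2024-09-06", 12801.18, 465),
    ("2024-10-01", 19000.00, 287),
    ("2024-10-02", 19000.00, 287),
    ("2024-10-01", 43373.80, 465),
]

seen = set()
helper = []  # the "new column": AMOUNT on first occurrence per (code, month)
for date, amount, code in rows:
    key = (code, date[:7])  # group by code and YYYY-MM month
    helper.append(amount if key not in seen else 0.0)
    seen.add(key)

# Summing the helper column per code now gives the unique totals directly.
total_287 = round(sum(h for (d, a, c), h in zip(rows, helper) if c == 287), 2)
total_465 = round(sum(h for (d, a, c), h in zip(rows, helper) if c == 465), 2)
print(total_287, total_465)  # -> 27000.0 56174.98
```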
Thank you both! I used the approach to get the average first and just sum the values, and it works!
Hi @julsr ,
Try modifying your measure to something like this:
YearToDateAverage =
CALCULATE(
    SUMX(
        SUMMARIZE(
            FILTER(
                DATA,
                DATA[DATE] <= MAX(DATA[DATE])
            ),
            DATA[CODE],
            "UniqueAmount",
            AVERAGEX(
                FILTER(
                    DATA,
                    DATA[CODE] = EARLIER(DATA[CODE]) &&
                    DATA[DATE] <= MAX(DATA[DATE])
                ),
                DATA[AMOUNT]
            )
        ),
        [UniqueAmount]
    ),
    REMOVEFILTERS(
        DATA[DATE],
        DATA[YEAR],
        DATA[DATE_MONTH]
    )
)
Don't forget to replace the table and column names with your own.
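In procedural terms, the measure above produces one value per CODE (the average of AMOUNT over rows on or before the latest visible date) and then sums those per-code values. A plain-Python restatement, assuming a fixed cutoff date and a few illustrative rows (not the full sample, so the total differs from the question's figures):

```python
# Plain-Python restatement of the measure's mechanics: per CODE, average
# AMOUNT over rows up to the cutoff date, then sum the per-code averages.
rows = [  # (date, amount, code) -- illustrative rows only
    ("2024-09-06", 8000.00, 287),
    ("2024-10-01", 19000.00, 287),
    ("2024-09-06", 12801.18, 465),
    ("2024-10-01", 43373.80, 465),
]
cutoff = "2024-10-20"  # stands in for MAX(DATA[DATE]) under the slicer

by_code = {}
for date, amount, code in rows:
    if date <= cutoff:
        by_code.setdefault(code, []).append(amount)

# 287: avg(8000, 19000) = 13500.0; 465: avg(12801.18, 43373.8) = 28087.49
result = round(sum(sum(v) / len(v) for v in by_code.values()), 2)
print(result)  # -> 41587.49
```

Note that on real data, where an amount repeats many times within a month, this per-code average weights months by their row counts, which is why the asker ultimately combined it with the "average first, then sum" column approach.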
Thank you