Hello, all
I am trying to build a report that tracks forecasting error in sales.
My ultimate goal here is to apply the WMAPE/WAPE concept, which is to divide the weighted sum of all absolute errors by the weighted sum of all actual sold quantities, such as:
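In formula terms (roughly, writing wᵢ for the weight from Dim_Ranking, Fᵢ for the forecast quantity and Aᵢ for the actual sold quantity):

WMAPE = Σᵢ ( wᵢ × |Aᵢ − Fᵢ| ) / Σᵢ ( wᵢ × Aᵢ )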
So far, I have managed to structure my data in a star schema, like this:
1.1. Blue is the fact table
1.2. Dim_Red is a dimension table for product portfolio
1.3. Dim_Mercado is a dimension table for geography
1.4. Dim_Ranking is a dimension table for the weights in the WMAPE calculation, specifying each month's weight in the error calculation (e.g., the forecast made in the month right before the actual sale has a weight of 12, and the forecast made 6 months before the sale has a weight of 4)
1.5. Dim_Gabarito is a dimension table that stores the actual sales for the forecasted month
Now, I need to calculate the WMAPE. The thing is: it differs according to the segmentation.
If I create a DAX measure to calculate the absolute error, it will dynamically adapt to the visual according to the granularity I have selected, but it will not aggregate correctly, because it always calculates at the fact table's maximum granularity.
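For illustration only, here is roughly what such a measure could look like (a sketch, assuming an active relationship from the '12MPP' fact table to Dim_Gabarito so that RELATED can fetch the actual quantity):

Erro ABS =
SUMX(
    '12MPP',
    -- ABS is evaluated per fact row, so a visual at a coarser grain
    -- shows the sum of row-level absolute errors, not the absolute
    -- error of the aggregated totals
    ABS('12MPP'[Quantidade] - RELATED(Dim_Gabarito[Quantidade]))
)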
So this is what I did next: I created calculated tables for each of the main segmentations the user can select. Below is the example for Family (sorry, I couldn't get the DAX formatting to work):
Família =
SUMMARIZECOLUMNS(
    '12MPP'[Data Rodada],
    '12MPP'[Data Rodada_YM],
    '12MPP'[Data SAT_YM],
    'Dim_Veículo'[Família],
    "Quantidade", SUM('12MPP'[Quantidade]),
    "Gabarito", SUM(Dim_Gabarito[Quantidade]),
    "Ranking", AVERAGE(Dim_Ranking[Ranking]),
    "Peso", AVERAGE(Dim_Ranking[Peso]),
    "Erro ABS", ABS(SUMX(Dim_Gabarito, Dim_Gabarito[Quantidade]) - SUMX('12MPP', '12MPP'[Quantidade]))
)
From here, I can calculate the WMAPE just fine, using a measure:
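The original measure isn't shown in the post, but over the calculated table above it could look roughly like this (a sketch only; the column names come from the SUMMARIZECOLUMNS definition):

WMAPE Família =
DIVIDE(
    -- weighted sum of absolute errors, using Peso as the weight
    SUMX('Família', 'Família'[Peso] * 'Família'[Erro ABS]),
    -- weighted sum of actual sold quantities (Gabarito)
    SUMX('Família', 'Família'[Peso] * 'Família'[Gabarito])
)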
Hi @MSpedaletti,
If I understand correctly, the issue is that you want to calculate WMAPE. Please try the following methods and check if they can solve your problem:
1. To calculate the WAPE dynamically, create a measure that uses the SUMX and DIVIDE functions.
WAPE =
VAR TotalWeightedError =
    SUMX(
        'FactTable',
        'FactTable'[Error] * RELATED('Dim_Ranking'[Weight])
    )
VAR TotalWeightedActual =
    SUMX(
        'FactTable',
        'FactTable'[Actual] * RELATED('Dim_Ranking'[Weight])
    )
RETURN
    DIVIDE(TotalWeightedError, TotalWeightedActual)
2. For the second question, calculation groups could be used to simplify the DAX measures. Calculation groups let you define a single set of calculations that can be reused across multiple measures.
Create calculation groups in Power BI (preview) - Power BI | Microsoft Learn
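As a rough sketch of that pattern (not from the original reply; the table and column names are assumed from the model described above), a calculation item can wrap SELECTEDMEASURE() to force whatever measure is on the visual to be evaluated at a fixed grain:

-- Calculation item expression (sketch): evaluates the currently selected
-- measure at the Family / forecast-round-month grain and sums the results
SUMX(
    SUMMARIZE('12MPP', 'Dim_Veículo'[Família], '12MPP'[Data Rodada_YM]),
    CALCULATE(SELECTEDMEASURE())
)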
Best Regards,
Wisdom Wu
Hello, Wisdom Wu
Thank you for your reply.
I have tried the measure you suggested, but the issue persists.
The reason is that the error has to be calculated according to the selected granularity. For instance, if I have the following data:
| Family | Item | Forecast | Actual | Absolute error |
|---|---|---|---|---|
| A | Y | 10 | 8 | 2 |
| A | X | 5 | 9 | 4 |
If I apply the suggested measure, this is the result I get when looking at the Family granularity level:
| Family | Forecast | Actual | Absolute error |
|---|---|---|---|
| A | 15 | 17 | 6 |
The measure calculates the error for each Item and then sums the results at the Family level, whereas what I need is to evaluate the error directly at the Family level:
| Family | Forecast | Actual | Absolute error |
|---|---|---|---|
| A | 15 | 17 | 2 |
Is there a way (other than using calculated tables) to get the WAPE measure to work for different granularities?