I have a problem with calculating balanced accuracy in Power BI. I would like to see balanced accuracy for every class, and also overall without splitting by class. I have columns Actuals, Prediction, and CorrectPrediction, and tried to build this as a single measure. Important: the number of classes is dynamic. It could be 3 as well as 5.
From my data, when I create a matrix with Prediction in columns and Actuals in rows, I get this:

| Actual \ Predicted | A | B | C | D | E | Row total |
|---|---|---|---|---|---|---|
| A | 382 | 172 | 17 | 5 | 386 | 962 |
| B | 178 | 292 | 54 | 14 | 959 | 1497 |
| C | 64 | 161 | 64 | 11 | 1696 | 1996 |
| D | 40 | 84 | 41 | 13 | 2198 | 2376 |
| E | 67 | 197 | 58 | 20 | 25974 | 26316 |
| Column total | 731 | 906 | 234 | 63 | 31213 | 33147 |
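For reference, a matrix like the one above can be reproduced with a single row-count measure dropped onto a matrix visual. This is only a sketch; the fact table name 'Table' is an assumption and should be renamed to match the model:

```
-- Counts the rows that land in each Actuals/Prediction cell of the matrix
Confusion Count = COUNTROWS ( 'Table' )
```

Put 'Table'[Actuals] on rows, 'Table'[Prediction] on columns, and this measure in values; the row and column totals then come from the visual's subtotals.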
I should get these calculations, but I am getting completely wrong results. Can you help me with the appropriate DAX?
Solved!
@prusik369 Create Measures for True Positives (TP), False Negatives (FN), False Positives (FP), and True Negatives (TN) for each class:
DAX
TP_A = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "A" && 'Table'[Prediction] = "A")
FN_A = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "A" && 'Table'[Prediction] <> "A")
FP_A = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "A" && 'Table'[Prediction] = "A")
TN_A = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "A" && 'Table'[Prediction] <> "A")
TP_B = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "B" && 'Table'[Prediction] = "B")
FN_B = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "B" && 'Table'[Prediction] <> "B")
FP_B = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "B" && 'Table'[Prediction] = "B")
TN_B = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "B" && 'Table'[Prediction] <> "B")
TP_C = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "C" && 'Table'[Prediction] = "C")
FN_C = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "C" && 'Table'[Prediction] <> "C")
FP_C = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "C" && 'Table'[Prediction] = "C")
TN_C = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "C" && 'Table'[Prediction] <> "C")
TP_D = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "D" && 'Table'[Prediction] = "D")
FN_D = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "D" && 'Table'[Prediction] <> "D")
FP_D = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "D" && 'Table'[Prediction] = "D")
TN_D = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "D" && 'Table'[Prediction] <> "D")
TP_E = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "E" && 'Table'[Prediction] = "E")
FN_E = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "E" && 'Table'[Prediction] <> "E")
FP_E = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "E" && 'Table'[Prediction] = "E")
TN_E = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "E" && 'Table'[Prediction] <> "E")
Create Measures for Sensitivity and Specificity for each class:
DAX
Sensitivity_A = DIVIDE([TP_A], [TP_A] + [FN_A])
Specificity_A = DIVIDE([TN_A], [TN_A] + [FP_A])
Sensitivity_B = DIVIDE([TP_B], [TP_B] + [FN_B])
Specificity_B = DIVIDE([TN_B], [TN_B] + [FP_B])
Sensitivity_C = DIVIDE([TP_C], [TP_C] + [FN_C])
Specificity_C = DIVIDE([TN_C], [TN_C] + [FP_C])
Sensitivity_D = DIVIDE([TP_D], [TP_D] + [FN_D])
Specificity_D = DIVIDE([TN_D], [TN_D] + [FP_D])
Sensitivity_E = DIVIDE([TP_E], [TP_E] + [FN_E])
Specificity_E = DIVIDE([TN_E], [TN_E] + [FP_E])
Create Measures for Balanced Accuracy for each class:
DAX
BalancedAccuracy_A = DIVIDE([Sensitivity_A] + [Specificity_A], 2)
BalancedAccuracy_B = DIVIDE([Sensitivity_B] + [Specificity_B], 2)
BalancedAccuracy_C = DIVIDE([Sensitivity_C] + [Specificity_C], 2)
BalancedAccuracy_D = DIVIDE([Sensitivity_D] + [Specificity_D], 2)
BalancedAccuracy_E = DIVIDE([Sensitivity_E] + [Specificity_E], 2)
Create an Overall Balanced Accuracy Measure:
DAX
OverallBalancedAccuracy = DIVIDE(
[BalancedAccuracy_A] + [BalancedAccuracy_B] + [BalancedAccuracy_C] + [BalancedAccuracy_D] + [BalancedAccuracy_E],
5
)
Proud to be a Super User!
Hey @bhanu_gautam, thanks for the answer, really appreciate it! There is just one point, sorry for not adding it earlier: the number of classes is dynamic. It could be 5, or it could be 3 as well.
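One way to make the accepted approach class-count-agnostic is to iterate whatever classes actually exist in the data instead of hard-coding A through E. The following is a sketch, not the accepted solution, and assumes the same 'Table'[Actuals] and 'Table'[Prediction] columns:

```
BalancedAccuracy Dynamic =
AVERAGEX (
    VALUES ( 'Table'[Actuals] ),    -- one row per class present in the data
    VAR Class = 'Table'[Actuals]
    VAR TP =
        CALCULATE (
            COUNTROWS ( 'Table' ),
            REMOVEFILTERS ( 'Table'[Actuals] ),    -- clear the per-class filter from context transition
            'Table'[Actuals] = Class,
            'Table'[Prediction] = Class
        )
    VAR FN =
        CALCULATE (
            COUNTROWS ( 'Table' ),
            REMOVEFILTERS ( 'Table'[Actuals] ),
            'Table'[Actuals] = Class,
            'Table'[Prediction] <> Class
        )
    VAR FP =
        CALCULATE (
            COUNTROWS ( 'Table' ),
            REMOVEFILTERS ( 'Table'[Actuals] ),
            'Table'[Actuals] <> Class,
            'Table'[Prediction] = Class
        )
    VAR TN =
        CALCULATE (
            COUNTROWS ( 'Table' ),
            REMOVEFILTERS ( 'Table'[Actuals] ),
            'Table'[Actuals] <> Class,
            'Table'[Prediction] <> Class
        )
    VAR Sensitivity = DIVIDE ( TP, TP + FN )
    VAR Specificity = DIVIDE ( TN, TN + FP )
    RETURN
        DIVIDE ( Sensitivity + Specificity, 2 )
)
```

Placed in a card it returns the overall balanced accuracy averaged over however many classes exist; placed against 'Table'[Actuals] on the rows of a table visual it returns the per-class value. One caveat: a class that appears only in Prediction and never in Actuals will not be iterated by VALUES ( 'Table'[Actuals] ).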