prusik369
New Member

Balanced accuracy for multi-class

I have a problem calculating balanced accuracy in Power BI. I would like to see balanced accuracy for every class, and also overall without splitting by class. I have columns Actuals, Prediction, and CorrectPrediction, and I tried to build a single measure. Important: the number of classes is dynamic; it could be 3 as well as 5.

From my data, when I create a matrix with Prediction in columns and Actuals in rows, I get this:

Rows: Actuals (TRUE); columns: Prediction.

|                  | A   | B   | C   | D  | E     | sum from row |
|------------------|-----|-----|-----|----|-------|--------------|
| **A**            | 382 | 172 | 17  | 5  | 386   | 962          |
| **B**            | 178 | 292 | 54  | 14 | 959   | 1497         |
| **C**            | 64  | 161 | 64  | 11 | 1696  | 1996         |
| **D**            | 40  | 84  | 41  | 13 | 2198  | 2376         |
| **E**            | 67  | 197 | 58  | 20 | 25974 | 26316        |
| sum from column  | 731 | 906 | 234 | 63 | 31213 |              |

I should get these calculations:

prusik369_0-1747901066420.png

but I am getting completely wrong results.
Can you help me with the appropriate DAX?
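As a sanity check on the expected numbers, the per-class and overall (macro-averaged) balanced accuracy implied by the matrix above can be computed outside Power BI. A minimal Python sketch using the counts from the table, treating each class one-vs-rest (TP/FN/FP/TN):

```python
# Confusion matrix from the post: rows = actual class, columns = predicted class (order A..E).
conf = {
    "A": [382, 172, 17, 5, 386],
    "B": [178, 292, 54, 14, 959],
    "C": [64, 161, 64, 11, 1696],
    "D": [40, 84, 41, 13, 2198],
    "E": [67, 197, 58, 20, 25974],
}
classes = list(conf)
total = sum(sum(row) for row in conf.values())

def balanced_accuracy(cls):
    i = classes.index(cls)
    tp = conf[cls][i]                                  # actual cls, predicted cls
    fn = sum(conf[cls]) - tp                           # actual cls, predicted something else
    fp = sum(conf[c][i] for c in classes if c != cls)  # predicted cls, actual something else
    tn = total - tp - fn - fp                          # everything else
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return (sensitivity + specificity) / 2

per_class = {c: balanced_accuracy(c) for c in classes}
overall = sum(per_class.values()) / len(per_class)     # macro average over the classes
```

This mirrors the one-vs-rest definition the accepted solution below implements in DAX, so the DAX measures can be validated against these values.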

1 ACCEPTED SOLUTION
bhanu_gautam
Super User

@prusik369 Create Measures for True Positives (TP), False Negatives (FN), False Positives (FP), and True Negatives (TN) for each class:

DAX
TP_A = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "A" && 'Table'[Prediction] = "A")
FN_A = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "A" && 'Table'[Prediction] <> "A")
FP_A = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "A" && 'Table'[Prediction] = "A")
TN_A = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "A" && 'Table'[Prediction] <> "A")

TP_B = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "B" && 'Table'[Prediction] = "B")
FN_B = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "B" && 'Table'[Prediction] <> "B")
FP_B = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "B" && 'Table'[Prediction] = "B")
TN_B = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "B" && 'Table'[Prediction] <> "B")

TP_C = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "C" && 'Table'[Prediction] = "C")
FN_C = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "C" && 'Table'[Prediction] <> "C")
FP_C = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "C" && 'Table'[Prediction] = "C")
TN_C = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "C" && 'Table'[Prediction] <> "C")

TP_D = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "D" && 'Table'[Prediction] = "D")
FN_D = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "D" && 'Table'[Prediction] <> "D")
FP_D = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "D" && 'Table'[Prediction] = "D")
TN_D = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "D" && 'Table'[Prediction] <> "D")

TP_E = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "E" && 'Table'[Prediction] = "E")
FN_E = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] = "E" && 'Table'[Prediction] <> "E")
FP_E = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "E" && 'Table'[Prediction] = "E")
TN_E = CALCULATE(COUNTROWS('Table'), 'Table'[Actuals] <> "E" && 'Table'[Prediction] <> "E")

 

Create Measures for Sensitivity and Specificity for each class:

DAX
Sensitivity_A = DIVIDE([TP_A], [TP_A] + [FN_A])
Specificity_A = DIVIDE([TN_A], [TN_A] + [FP_A])

Sensitivity_B = DIVIDE([TP_B], [TP_B] + [FN_B])
Specificity_B = DIVIDE([TN_B], [TN_B] + [FP_B])

Sensitivity_C = DIVIDE([TP_C], [TP_C] + [FN_C])
Specificity_C = DIVIDE([TN_C], [TN_C] + [FP_C])

Sensitivity_D = DIVIDE([TP_D], [TP_D] + [FN_D])
Specificity_D = DIVIDE([TN_D], [TN_D] + [FP_D])

Sensitivity_E = DIVIDE([TP_E], [TP_E] + [FN_E])
Specificity_E = DIVIDE([TN_E], [TN_E] + [FP_E])

 

Create Measures for Balanced Accuracy for each class:

DAX
BalancedAccuracy_A = DIVIDE([Sensitivity_A] + [Specificity_A], 2)
BalancedAccuracy_B = DIVIDE([Sensitivity_B] + [Specificity_B], 2)
BalancedAccuracy_C = DIVIDE([Sensitivity_C] + [Specificity_C], 2)
BalancedAccuracy_D = DIVIDE([Sensitivity_D] + [Specificity_D], 2)
BalancedAccuracy_E = DIVIDE([Sensitivity_E] + [Specificity_E], 2)

 

Create an Overall Balanced Accuracy Measure:

DAX
OverallBalancedAccuracy = DIVIDE(
[BalancedAccuracy_A] + [BalancedAccuracy_B] + [BalancedAccuracy_C] + [BalancedAccuracy_D] + [BalancedAccuracy_E],
5
)




Did I answer your question? Mark my post as a solution! And Kudos are appreciated

Proud to be a Super User!





2 REPLIES 2
Hey @bhanu_gautam, thanks for the answer, really appreciate it! There is just one point, sorry for not adding it earlier: the number of classes is dynamic. It could be 5, or it could be 3 as well.
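Since the class list is dynamic, the per-class measures above can be collapsed into a single measure that iterates the distinct classes, so nothing is hard-coded to 5. A sketch under the same 'Table'[Actuals]/'Table'[Prediction] assumption (the measure name is illustrative; the CALCULATE filter arguments on 'Table'[Actuals] deliberately override the class filter introduced by the iteration):

```dax
Balanced Accuracy (macro) =
AVERAGEX (
    VALUES ( 'Table'[Actuals] ),            -- one row per class currently in scope
    VAR Cls = 'Table'[Actuals]
    VAR TP = CALCULATE ( COUNTROWS ( 'Table' ), 'Table'[Actuals] = Cls,  'Table'[Prediction] = Cls )
    VAR FN = CALCULATE ( COUNTROWS ( 'Table' ), 'Table'[Actuals] = Cls,  'Table'[Prediction] <> Cls )
    VAR FP = CALCULATE ( COUNTROWS ( 'Table' ), 'Table'[Actuals] <> Cls, 'Table'[Prediction] = Cls )
    VAR TN = CALCULATE ( COUNTROWS ( 'Table' ), 'Table'[Actuals] <> Cls, 'Table'[Prediction] <> Cls )
    RETURN DIVIDE ( TP, TP + FN ) / 2       -- sensitivity / 2
         + DIVIDE ( TN, TN + FP ) / 2       -- specificity / 2
)
```

On a card visual, VALUES returns every class, so the measure gives the overall macro average; with 'Table'[Actuals] on the rows of a table visual, VALUES returns a single class per row and the same measure shows per-class balanced accuracy. This is a sketch, not tested against your model, so verify it against the hand-computed values for your matrix.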
