Confusion Matrix
A confusion matrix is a table that summarizes the performance of a classification model by comparing its predicted labels against the actual labels. Consider a classifier that labels each sample as either 'dog' or 'cat':
Actual values = ['dog', 'cat', 'dog', 'cat', 'dog', 'dog', 'cat', 'dog', 'cat', 'dog']
Predicted values = ['dog', 'dog', 'dog', 'cat', 'dog', 'dog', 'cat', 'cat', 'cat', 'cat']
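The matrix itself can be computed directly from these two lists. A minimal sketch, assuming scikit-learn is available (the variable names actual and predicted are illustrative):

from sklearn.metrics import confusion_matrix

actual = ['dog', 'cat', 'dog', 'cat', 'dog', 'dog', 'cat', 'dog', 'cat', 'dog']
predicted = ['dog', 'dog', 'dog', 'cat', 'dog', 'dog', 'cat', 'cat', 'cat', 'cat']

# Rows correspond to actual classes, columns to predicted classes,
# in the order given by labels=['cat', 'dog'].
cm = confusion_matrix(actual, predicted, labels=['cat', 'dog'])
print(cm)
# [[3 1]    actual 'cat': 3 predicted 'cat', 1 predicted 'dog'
#  [2 4]]   actual 'dog': 2 predicted 'cat', 4 predicted 'dog'

Treating 'cat' as the positive class (as in the worked example below), these cells are TP = 3, FN = 1, FP = 2, TN = 4.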
Classification Accuracy:
Classification Accuracy is the proportion of all predictions that are correct. It is given by the relation:
Accuracy = (TP + TN) / (TP + TN + FP + FN)
Precision:
Precision is defined as the ratio of correctly classified positive instances to the total number of instances predicted as positive. In other words, out of all the predicted positives, how many are actually positive. Precision should be high.
Precision = TP / (TP + FP)
F-score or F1-score:
It is difficult to compare two models when one has higher Precision and the other has higher Recall, so to make them comparable we use the F-score, the Harmonic Mean of Precision and Recall. Compared to the Arithmetic Mean, the Harmonic Mean punishes extreme values more: with Precision = 1.0 and Recall = 0.1, for example, the arithmetic mean is 0.55 but the harmonic mean is only about 0.18. F-score should be high.
F-score = (2 * Precision * Recall) / (Precision + Recall)
Specificity:
Specificity is the proportion of actual negatives that are correctly identified as negative.
Specificity = TN / (TN + FP)
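These definitions translate directly into code. A minimal sketch, assuming the four counts have already been read off the confusion matrix (the function name classification_metrics is illustrative):

def classification_metrics(tp, tn, fp, fn):
    # Accuracy: fraction of all predictions that are correct.
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    # Precision: of everything predicted positive, how much is truly positive.
    precision = tp / (tp + fp)
    # Recall: of everything actually positive, how much was found.
    recall = tp / (tp + fn)
    # F-score: harmonic mean of precision and recall.
    f_score = 2 * precision * recall / (precision + recall)
    # Specificity: of everything actually negative, how much was correctly rejected.
    specificity = tn / (tn + fp)
    return accuracy, precision, recall, f_score, specificity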
Example to interpret the confusion matrix:
Let's calculate the metrics using the cat and dog example above, treating 'cat' as the positive class. Counting from the two lists gives TP = 3, TN = 4, FP = 2, FN = 1.
Classification Accuracy:
Accuracy = (TP + TN) / (TP + TN + FP + FN) = (3+4)/(3+4+2+1) = 0.70
Precision: Precision tells us, out of all the samples predicted as positive, how many actually are positive.
Precision = TP / (TP + FP) = 3/(3+2) = 0.60
Recall: Recall gives us an idea of how often the model predicts yes when the answer is actually yes.
Recall = TP / (TP + FN) = 3/(3+1) = 0.75
F-score:
F-score = (2*Recall*Precision)/(Recall+Precision) = (2*0.75*0.60)/(0.75+0.60) = 0.67
Specificity:
Specificity = TN / (TN + FP) = 4/(4+2) = 0.67
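These hand-calculated values can be cross-checked in code. A minimal sketch, assuming scikit-learn is installed; pos_label='cat' marks 'cat' as the positive class, matching the calculation above:

from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

actual = ['dog', 'cat', 'dog', 'cat', 'dog', 'dog', 'cat', 'dog', 'cat', 'dog']
predicted = ['dog', 'dog', 'dog', 'cat', 'dog', 'dog', 'cat', 'cat', 'cat', 'cat']

print(accuracy_score(actual, predicted))                    # 0.70
print(precision_score(actual, predicted, pos_label='cat'))  # 0.60
print(recall_score(actual, predicted, pos_label='cat'))     # 0.75
print(f1_score(actual, predicted, pos_label='cat'))         # ~0.67
# Specificity of the 'cat' class equals the recall of the 'dog' class:
print(recall_score(actual, predicted, pos_label='dog'))     # ~0.67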