The confusion matrix is an indicator that evaluates the performance of a classification model. It tabulates the actual values against the predicted values and classifies each prediction as true or false. The confusion matrix for binary classification is as follows.
If there are more than two classes, the matrix can be extended in a similar way. The image below is an example of a confusion matrix for a case with three classes.
A confusion matrix is composed of four components.
True Positive (TP): predicted positive, and the prediction is true.
True Negative (TN): predicted negative, and the prediction is true.
False Positive (FP): predicted positive, and the prediction is false.
False Negative (FN): predicted negative, and the prediction is false.
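As an illustration of the four components above, they can be counted directly from a pair of label lists. This is a minimal Python sketch; the function name and the example labels are hypothetical:

```python
def confusion_counts(actual, predicted, positive=1):
    """Count TP, TN, FP, FN for binary classification labels."""
    tp = tn = fp = fn = 0
    for a, p in zip(actual, predicted):
        if p == positive:
            if a == positive:
                tp += 1  # predicted positive, and it is true
            else:
                fp += 1  # predicted positive, and it is false
        else:
            if a == positive:
                fn += 1  # predicted negative, and it is false
            else:
                tn += 1  # predicted negative, and it is true
    return tp, tn, fp, fn

# Hypothetical labels for illustration
actual    = [1, 0, 1, 1, 0, 0]
predicted = [1, 0, 0, 1, 1, 0]
print(confusion_counts(actual, predicted))  # (2, 2, 1, 1)
```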
Various model evaluation indices can be derived from the confusion matrix.
1. Accuracy: the ratio of correctly predicted observations to the total observations: (TP + TN) / (TP + TN + FP + FN)
2. Recall: the ratio of correctly predicted positive observations to all observations in the actual positive class: TP / (TP + FN)
3. Precision: the ratio of correctly predicted positive observations to the total predicted positive observations: TP / (TP + FP)
4. F1-Score: the harmonic mean of Precision and Recall: 2 × (Precision × Recall) / (Precision + Recall)
The above table is an example of the actual and predicted values. In this case, TP = 6, TN = 3, FP = 1, and FN = 2, so Accuracy = (6 + 3) / 12 = 0.75.
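Plugging the counts from this example (TP = 6, TN = 3, FP = 1, FN = 2) into the four indices gives the following. This is a minimal sketch; the variable names are illustrative:

```python
tp, tn, fp, fn = 6, 3, 1, 2  # counts taken from the example table

accuracy  = (tp + tn) / (tp + tn + fp + fn)                # 9 / 12 = 0.75
recall    = tp / (tp + fn)                                 # 6 / 8  = 0.75
precision = tp / (tp + fp)                                 # 6 / 7  ≈ 0.857
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean ≈ 0.8

print(accuracy, recall, precision, f1)
```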
[Click 'details - Confusion Matrix' to see the confusion matrix table of the selected model.]
[Click 'details - Precise Analysis' to see more model evaluation indices.]