Performance Metrics
There are several evaluation metrics in machine learning, and the right choice depends on the type of data and on the requirements of the problem. Some of the most common metrics are as follows (a short sketch of how to compute them appears after this list):
- Confusion matrix
- Precision
- Recall
- Accuracy
- F1 score
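As a quick illustration, here is a minimal sketch of how these metrics can be computed with scikit-learn. The y_true and y_pred arrays below are made-up binary labels used purely for illustration; they do not come from the text.

```python
# Minimal sketch, assuming scikit-learn is installed.
from sklearn.metrics import precision_score, recall_score, accuracy_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # actual labels (hypothetical)
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # model predictions (hypothetical)

print("Precision:", precision_score(y_true, y_pred))  # TP / (TP + FP)
print("Recall:   ", recall_score(y_true, y_pred))     # TP / (TP + FN)
print("Accuracy: ", accuracy_score(y_true, y_pred))   # correct predictions / total
print("F1 score: ", f1_score(y_true, y_pred))         # harmonic mean of precision and recall
```

Each of these metrics is built from the entries of the confusion matrix, which is covered next.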
Confusion Matrix
A confusion matrix is a table that is used to describe the performance of a classification model on test data for which the actual values are known. To understand this better, look at the following figure, showing predicted and actual values:

Figure 1.54: Predicted versus actual values
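Before walking through the individual cells, here is a minimal sketch, assuming scikit-learn, of how such a matrix can be produced and its four cells read off. The label arrays are hypothetical, chosen only to make the example runnable.

```python
# Minimal sketch, assuming scikit-learn; the labels are made up for illustration.
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # actual values (hypothetical)
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # predicted values (hypothetical)

cm = confusion_matrix(y_true, y_pred)
print(cm)                            # rows = actual class, columns = predicted class

# For binary labels, the four cells can be unpacked directly.
tn, fp, fn, tp = cm.ravel()
print("TP:", tp, "TN:", tn, "FP:", fp, "FN:", fn)
```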
Let's examine the concept of a confusion matrix and its four components, TP, TN, FP, and FN, in detail. Assume you are building a model that predicts pregnancy:
- TP (True Positive): The sex is female and she is actually pregnant, and your model also predicted True.
- FP (False Positive): The sex is male and your model predicted True, which cannot happen. This is a type of error called a Type 1 error.