True positive (TP): a positive sample correctly predicted as positive by the model;
True negative (TN): a negative sample correctly predicted as negative by the model;
False positive (FP): a negative sample incorrectly predicted as positive by the model;
False negative (FN): a positive sample incorrectly predicted as negative by the model.
True positive rate (TPR), also called sensitivity:
TPR = TP/(TP + FN)
Number of positive samples correctly predicted as positive / actual number of positive samples
True negative rate (TNR), also called specificity:
TNR = TN/(TN + FP)
Number of negative samples correctly predicted as negative / actual number of negative samples
False positive rate (FPR):
FPR = FP/(FP + TN)
Number of negative samples incorrectly predicted as positive / actual number of negative samples
False negative rate (FNR):
FNR = FN/(TP + FN)
Number of positive samples incorrectly predicted as negative / actual number of positive samples
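The four rates above can be sketched as a small function; this is a minimal illustration (the function name and the toy label lists are assumptions, not from the source), assuming binary labels where 1 is positive and 0 is negative:

```python
def confusion_rates(y_true, y_pred):
    """Compute TPR, TNR, FPR, FNR from binary labels (1 = positive, 0 = negative)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "TPR": tp / (tp + fn),  # sensitivity
        "TNR": tn / (tn + fp),  # specificity
        "FPR": fp / (fp + tn),
        "FNR": fn / (tp + fn),
    }

# Toy example: 4 actual positives, 4 actual negatives
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
rates = confusion_rates(y_true, y_pred)
```

Note that FPR = 1 - TNR and FNR = 1 - TPR, since each pair shares the same denominator.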
Recall and precision:
- Relevant documents retrieved by the system (A)
- Irrelevant documents retrieved by the system (B)
- Relevant documents not retrieved by the system (C)
- Irrelevant documents not retrieved by the system (D)
Intuitively, a good retrieval system should retrieve as many relevant documents, and as few irrelevant documents, as possible.
Recall and precision are the most important metrics for measuring the performance of an information retrieval system.
Recall R: the number of relevant documents retrieved as the numerator, and the total number of relevant documents as the denominator, i.e. R = A/(A + C).
Precision P: the number of relevant documents retrieved as the numerator, and the total number of retrieved documents as the denominator, i.e. P = A/(A + B).
|               | Relevant | Irrelevant |
|---------------|----------|------------|
| Retrieved     | A        | B          |
| Not retrieved | C        | D          |
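Given the counts A, B, and C from the table, recall and precision can be computed directly; this is a minimal sketch (the function name and the example counts are illustrative assumptions):

```python
def recall_precision(a, b, c):
    """Recall and precision from the contingency table:
    a = relevant retrieved, b = irrelevant retrieved, c = relevant not retrieved.
    """
    recall = a / (a + c)     # fraction of all relevant documents that were retrieved
    precision = a / (a + b)  # fraction of retrieved documents that are relevant
    return recall, precision

# Illustrative counts: 40 relevant docs retrieved, 10 irrelevant retrieved,
# 20 relevant docs missed by the system
r, p = recall_precision(40, 10, 20)
```

Note that D (irrelevant documents not retrieved) does not appear in either formula; it only completes the table.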