Confusion Matrix

                              Predicted class
                              Positive (P)            Negative (N)
True class   Positive (P)     True Positive (TP)      False Negative (FN)
             Negative (N)     False Positive (FP)     True Negative (TN)
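
As a concrete illustration (not part of the original notes), the sketch below counts the four cells of this matrix for binary labels in plain Python; the function confusion_counts and the example label lists are hypothetical names made up for this example.

```python
# Minimal sketch: count TP / FP / FN / TN for binary labels (1 = positive, 0 = negative).
def confusion_counts(y_true, y_pred):
    tp = fp = fn = tn = 0
    for t, p in zip(y_true, y_pred):
        if t == 1 and p == 1:
            tp += 1   # actually positive, predicted positive
        elif t == 0 and p == 1:
            fp += 1   # actually negative, predicted positive
        elif t == 1 and p == 0:
            fn += 1   # actually positive, predicted negative
        else:
            tn += 1   # actually negative, predicted negative
    return tp, fp, fn, tn

# Hypothetical example data.
y_true = [1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]
print(confusion_counts(y_true, y_pred))  # -> (3, 1, 1, 3)
```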

Terms

  • FPR (False Positive Rate)
    • FPR = FP / (FP + TN)
  • TPR (True Positive Rate)
    • TPR = TP / (TP + FN)
  • Precision
    • Precision = TP / (TP + FP)
    • The fraction of samples predicted as positive that are actually positive
    • The denominator is the number of detected objects
  • Recall
    • Recall = TP / (TP + FN)
    • The fraction of actually positive samples that are predicted as positive
    • The denominator is the number of objects that should be detected
  • Accuracy
    • Accuracy = (TP + TN) / (TP + TN + FP + FN)
  • Specificity
    • Specificity = TN / (FP + TN)
  • F1-Score
    • F1-Score = 2 * (Precision * Recall) / (Precision + Recall)
    • See the metric computation sketch after this list
  • ROC curve (Receiver Operating Characteristic curve)
    • x axis is FPR
    • y axis is TPR
  • AUC (Area Under the Curve)
    • area under the ROC curve
    • value ranges from 0 to 1
  • Precision-Recall curve (PR curve)
    • x axis is Recall
    • y axis is Precision
  • AP (Average Precision)
    • area under the Precision-Recall curve
    • value ranges from 0 to 1
    • the exact calculation depends on the benchmark (Pascal VOC, MS COCO, etc.)
  • mAP (mean Average Precision)
    • mean of the per-class AP values
    • value ranges from 0 to 1
    • See the curve/AP sketch after this list
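
All of the threshold metrics above follow directly from the four confusion-matrix counts. A minimal sketch, assuming the counts from the earlier confusion_counts example (names and values are illustrative only):

```python
def metrics(tp, fp, fn, tn):
    # Formulas from the list above, with guards against empty denominators.
    recall = tp / (tp + fn) if (tp + fn) else 0.0             # TPR = Recall
    fpr = fp / (fp + tn) if (fp + tn) else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    specificity = tn / (fp + tn) if (fp + tn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return {"TPR/Recall": recall, "FPR": fpr, "Precision": precision,
            "Accuracy": accuracy, "Specificity": specificity, "F1": f1}

print(metrics(3, 1, 1, 3))
# {'TPR/Recall': 0.75, 'FPR': 0.25, 'Precision': 0.75,
#  'Accuracy': 0.75, 'Specificity': 0.75, 'F1': 0.75}
```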
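
For the curve-based metrics, one common option (an assumption, not something these notes specify) is scikit-learn's roc_curve, roc_auc_score, precision_recall_curve, and average_precision_score; y_true and y_score below are hypothetical labels and prediction scores.

```python
from sklearn.metrics import (roc_curve, roc_auc_score,
                             precision_recall_curve, average_precision_score)

# Hypothetical ground-truth labels and classifier scores.
y_true  = [1, 1, 0, 0, 1, 0, 1, 0]
y_score = [0.9, 0.4, 0.2, 0.6, 0.8, 0.1, 0.7, 0.3]

# ROC curve: FPR on the x axis, TPR on the y axis; AUC is the area under it.
fpr, tpr, _ = roc_curve(y_true, y_score)
print("AUC:", roc_auc_score(y_true, y_score))

# Precision-Recall curve: Recall on the x axis, Precision on the y axis.
precision, recall, _ = precision_recall_curve(y_true, y_score)
print("AP :", average_precision_score(y_true, y_score))

# mAP is the mean of the per-class APs, e.g. sum(per_class_ap) / len(per_class_ap);
# the exact AP interpolation differs between benchmarks such as Pascal VOC and MS COCO.
```

Note that average_precision_score uses a step-wise (non-interpolated) AP, which differs from the interpolated AP used by the Pascal VOC and MS COCO evaluation code.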

NOTE:

  • TPR = Recall

Axis

  • Row oriented (both denominators are rows of the matrix, i.e. true classes)
    • FPR & TPR
  • TP oriented (TP is the numerator in both)
    • Precision & Recall

References:

  • http://ibisforest.org/index.php?F%E5%80%A4
  • https://www.randpy.tokyo/entry/roc_auc
  • https://github.com/AlexeyAB/darknet#when-should-i-stop-training
  • https://qiita.com/FukuharaYohei/items/be89a99c53586fa4e2e4