
sklearn: ROC AUC and Precision-Recall


objective

Measure the quality of probabilistic scores.

minimal code

from sklearn.metrics import roc_auc_score, average_precision_score
y_true = [0, 0, 1, 1]            # binary ground-truth labels
y_score = [0.1, 0.4, 0.35, 0.8]  # predicted scores for the positive class
print(roc_auc_score(y_true, y_score))            # 0.75
print(average_precision_score(y_true, y_score))  # ≈ 0.83 (PR-AUC)
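To see the point-by-point curves behind these single-number summaries, sklearn also exposes `roc_curve` and `precision_recall_curve`; a minimal sketch on the same toy data:

```python
# Sketch: the curves that ROC-AUC and PR-AUC integrate over,
# computed on the same toy data as above.
from sklearn.metrics import roc_curve, precision_recall_curve

y_true = [0, 0, 1, 1]
y_score = [0.1, 0.4, 0.35, 0.8]

fpr, tpr, roc_thr = roc_curve(y_true, y_score)
prec, rec, pr_thr = precision_recall_curve(y_true, y_score)

print(list(zip(fpr, tpr)))   # (FPR, TPR) points tracing the ROC curve
print(list(zip(rec, prec)))  # (recall, precision) points of the PR curve
```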

usage

from sklearn.metrics import classification_report
y = [0, 1, 1, 1]       # ground truth
y_pred = [0, 0, 1, 1]  # hard label predictions
print(classification_report(y, y_pred))  # per-class precision, recall, F1, support
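Unlike the AUC metrics above, `classification_report` needs hard labels, so probabilistic scores must be thresholded first; a sketch assuming the conventional 0.5 cutoff (a common default, not something this function imposes):

```python
# Sketch: turning scores into hard labels before classification_report.
# The 0.5 threshold is a convention assumed here for illustration.
from sklearn.metrics import classification_report

y_true = [0, 0, 1, 1]
y_score = [0.1, 0.4, 0.35, 0.8]
y_pred = [int(s >= 0.5) for s in y_score]  # [0, 0, 0, 1]

# zero_division=0 silences warnings for classes with no predicted samples
print(classification_report(y_true, y_pred, zero_division=0))
```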

useful variant(s)

from sklearn.metrics import confusion_matrix
# rows = true classes, columns = predicted classes
print(confusion_matrix([0, 1], [0, 1]))  # [[1 0]
                                         #  [0 1]]

notes

  • Prefer PR-AUC when the classes are highly imbalanced.
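A sketch illustrating the note: with a large negative class, a handful of negatives outscoring the positives barely dents ROC-AUC but collapses PR-AUC (synthetic data for illustration):

```python
from sklearn.metrics import roc_auc_score, average_precision_score

# 100 negatives spread over [0, 1), 2 positives scored high but below a
# few negatives: the imbalance hides those false positives from ROC-AUC,
# while PR-AUC is dominated by them.
y_true = [0] * 100 + [1, 1]
y_score = [i / 100 for i in range(100)] + [0.955, 0.965]

print(round(roc_auc_score(y_true, y_score), 3))            # 0.965
print(round(average_precision_score(y_true, y_score), 3))  # 0.292
```

Both positives rank in the top six, so ROC-AUC stays near 1; but the four negatives above them drag precision down, which PR-AUC reflects directly.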