8.16.1.10. sklearn.metrics.precision_recall_curve

sklearn.metrics.precision_recall_curve(y_true, probas_pred)

Compute precision-recall pairs for different probability thresholds.

Note: this implementation is restricted to the binary classification task.

The precision is the ratio tp / (tp + fp) where tp is the number of true positives and fp the number of false positives. The precision is intuitively the ability of the classifier not to label as positive a sample that is negative.

The recall is the ratio tp / (tp + fn) where tp is the number of true positives and fn the number of false negatives. The recall is intuitively the ability of the classifier to find all the positive samples.
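As an illustration, here is a minimal sketch of both ratios computed from made-up hard predictions (the arrays and counts below are hypothetical and not part of the original documentation):

import numpy as np

# Hypothetical binary ground truth and hard 0/1 predictions.
y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 1, 0, 1, 0])

tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives: 3
fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives: 1
fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives: 1

precision = tp / float(tp + fp)  # 3 / (3 + 1) = 0.75
recall = tp / float(tp + fn)     # 3 / (3 + 1) = 0.75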

Parameters:

y_true : array, shape = [n_samples]
    True targets of binary classification in {-1, 1} or {0, 1}.

probas_pred : array, shape = [n_samples]
    Estimated probabilities.

Returns:

precision : array, shape = [n]
    Precision values.

recall : array, shape = [n]
    Recall values.

thresholds : array, shape = [n]
    Thresholds on probas_pred used to compute precision and recall.
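A minimal usage sketch (the labels and probabilities below are made up for illustration; the exact lengths and ordering of the returned arrays can differ between scikit-learn versions):

import numpy as np
from sklearn.metrics import precision_recall_curve

# Hypothetical ground truth and estimated probabilities of the positive class.
y_true = np.array([0, 0, 1, 1])
probas_pred = np.array([0.1, 0.4, 0.35, 0.8])

precision, recall, thresholds = precision_recall_curve(y_true, probas_pred)
# Each threshold t yields one precision/recall pair obtained by labelling
# samples with probas_pred at or above t as positive; plotting recall against
# precision over all thresholds traces the precision-recall curve.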