This documentation is for scikit-learn version 0.11-git.

8.17.1.13. sklearn.metrics.hinge_loss

sklearn.metrics.hinge_loss(y_true, pred_decision, pos_label=1, neg_label=-1)

Cumulated hinge loss (non-regularized).

Assuming the labels in y_true are encoded with +1 and -1, a prediction mistake means that the margin = y_true * pred_decision is negative (the signs disagree), so the per-sample hinge loss max(0, 1 - margin) is greater than 1 for every mistake. The cumulated hinge loss is therefore an upper bound on the number of mistakes made by the classifier.
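
As a quick illustration of this bound (a hand-rolled sketch with made-up numbers, not a call to the function itself): the per-sample hinge loss max(0, 1 - margin) exceeds 1 exactly on the mistaken samples, so its sum can never be smaller than the mistake count.

    import numpy as np

    # Made-up labels (+1/-1) and decision values; the last two samples are mistakes.
    y_true = np.array([1, -1, 1, -1])
    pred_decision = np.array([0.8, -1.5, -0.3, 0.4])

    margin = y_true * pred_decision          # negative exactly on the mistakes
    losses = np.maximum(0, 1 - margin)       # per-sample hinge loss
    print(losses.sum())                      # cumulated hinge loss: 2.9
    print(int((margin < 0).sum()))           # number of mistakes: 2 <= 2.9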

Parameters:

y_true : array, shape = [n_samples]

True target (integers)

pred_decision : array, shape = [n_samples] or [n_samples, n_classes]

Predicted decisions, as output by decision_function (floats)
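
A minimal usage sketch, assuming a binary classifier such as svm.LinearSVC whose decision_function supplies pred_decision (the toy data below is illustrative only, not taken from this page):

    from sklearn import svm
    from sklearn.metrics import hinge_loss

    # Tiny made-up training set with -1/+1 labels.
    X = [[0.0], [1.0]]
    y = [-1, 1]

    est = svm.LinearSVC()
    est.fit(X, y)

    # Decision values for three new samples, scored against their true labels.
    pred_decision = est.decision_function([[-2.0], [3.0], [0.5]])
    print(hinge_loss([-1, 1, 1], pred_decision))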