8.16.1.7. sklearn.metrics.f1_score
sklearn.metrics.f1_score(y_true, y_pred, pos_label=1)
Compute the F1 score
The F1 score can be interpreted as a weighted average of the precision and recall, where the F1 score reaches its best value at 1 and its worst value at 0. The relative contribution of precision and recall to the F1 score is equal.
F1 = 2 * (precision * recall) / (precision + recall)
See: http://en.wikipedia.org/wiki/F1_score
In the multi-class case, this is the weighted average of the F1 scores of each class.
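For intuition, the following minimal sketch (not the library's implementation) computes the F1 score directly from true positive, false positive and false negative counts:

    def f1_from_counts(tp, fp, fn):
        # precision = TP / (TP + FP); recall = TP / (TP + FN)
        precision = tp / float(tp + fp)
        recall = tp / float(tp + fn)
        # F1 is the harmonic mean of precision and recall
        return 2 * precision * recall / (precision + recall)

    f1_from_counts(3, 1, 1)  # precision = recall = 0.75, so F1 = 0.75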
Parameters : y_true : array, shape = [n_samples]
true targets
y_pred : array, shape = [n_samples]
predicted targets
pos_label : int
In the binary classification case, gives the label of the positive class (default is 1). Everything other than pos_label is considered to belong to the negative class. Not used in the multiclass case.
Returns : f1_score : float
F1 score of the positive class in binary classification, or the weighted average of the F1 scores of each class for the multiclass task.
Notes
References: http://en.wikipedia.org/wiki/F1_score
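Examples
A brief usage sketch in the binary case, assuming the call signature above; with three true positives, one false positive and one false negative, precision and recall are both 0.75, so the F1 score of the positive class is 0.75:

>>> from sklearn.metrics import f1_score
>>> y_true = [1, 1, 1, 1, 0, 0]
>>> y_pred = [1, 1, 1, 0, 1, 0]
>>> f1_score(y_true, y_pred)
0.75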