8.2.1. sklearn.covariance.EmpiricalCovariance

class sklearn.covariance.EmpiricalCovariance(store_precision=True, assume_centered=False)

Maximum likelihood covariance estimator

Parameters :

store_precision : bool

Specifies if the estimated precision is stored.

assume_centered : bool

If True, data are not centered before computation. Useful when working with data whose mean is almost, but not exactly, zero. If False (default), data are centered before computation.

Attributes

covariance_ : 2D ndarray, shape (n_features, n_features)

Estimated covariance matrix.

precision_ : 2D ndarray, shape (n_features, n_features)

Estimated pseudo-inverse matrix (stored only if store_precision is True).

Methods

error_norm(comp_cov[, norm, scaling, squared]) Computes the mean squared error between two covariance estimators.
fit(X) Fits the maximum likelihood covariance estimator to the training data.
mahalanobis(observations) Computes the Mahalanobis distances of given observations.
score(X_test[, assume_centered]) Computes the log-likelihood of a Gaussian data set with self.covariance_ as an estimator of its covariance matrix.
set_params(**params) Set the parameters of the estimator.
__init__(store_precision=True, assume_centered=False)

Parameters :

store_precision : bool

Specifies if the estimated precision is stored.

assume_centered : bool

If True, data are not centered before computation. Useful when working with data whose mean is almost, but not exactly, zero. If False, data are centered before computation.
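
A minimal usage sketch for these constructor parameters; the data and parameter values below are illustrative, not part of this reference:

import numpy as np
from sklearn.covariance import EmpiricalCovariance

rng = np.random.RandomState(0)
true_cov = np.array([[1.0, 0.3],
                     [0.3, 2.0]])
# Illustrative data: 500 samples drawn from a 2-feature Gaussian.
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=true_cov, size=500)

# Keep the precision matrix and let the estimator center the data itself.
est = EmpiricalCovariance(store_precision=True, assume_centered=False)
est.fit(X)

print(est.covariance_)  # estimated covariance matrix
print(est.precision_)   # pseudo-inverse, available because store_precision=True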

error_norm(comp_cov, norm='frobenius', scaling=True, squared=True)

Computes the mean squared error between two covariance estimators (in the sense of the Frobenius norm).

Parameters :

comp_cov : array-like, shape = [n_features, n_features]

The covariance to compare with.

norm : str

The type of norm used to compute the error. Available error types:

- 'frobenius' (default): sqrt(tr(A^t.A))
- 'spectral': sqrt(max(eigenvalues(A^t.A)))

where A is the error (comp_cov - self.covariance_).

scaling : bool

If True (default), the squared error norm is divided by n_features. If False, the squared error norm is not rescaled.

squared : bool

Whether to compute the squared error norm or the error norm. If True (default), the squared error norm is returned. If False, the error norm is returned.

Returns :

The mean squared error (in the sense of the Frobenius norm) between the `self` and `comp_cov` covariance estimators.
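
A short sketch of comparing a fitted estimate against a known covariance matrix; the data and the true_cov matrix are illustrative:

import numpy as np
from sklearn.covariance import EmpiricalCovariance

rng = np.random.RandomState(0)
true_cov = np.array([[1.0, 0.3],
                     [0.3, 2.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=true_cov, size=500)

est = EmpiricalCovariance().fit(X)

# Default: squared Frobenius-norm error, scaled by n_features.
mse = est.error_norm(true_cov)
# Spectral-norm error, unscaled and not squared.
spec = est.error_norm(true_cov, norm='spectral', scaling=False, squared=False)
print(mse, spec)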

fit(X)

Fits the maximum likelihood covariance estimator to the given training data and parameters.

Parameters :

X : array-like, shape = [n_samples, n_features]

Training data, where n_samples is the number of samples and n_features is the number of features.

Returns :

self : object

Returns self.
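
Since fit returns the estimator itself, fitting can be chained onto construction; a minimal sketch with made-up data:

import numpy as np
from sklearn.covariance import EmpiricalCovariance

# Illustrative training data: 200 samples, 3 features.
X = np.random.RandomState(1).randn(200, 3)

# fit() returns self, so construction and fitting can be chained.
est = EmpiricalCovariance().fit(X)
print(est.covariance_.shape)  # (3, 3): one row and column per feature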

mahalanobis(observations)

Computes the Mahalanobis distances of given observations.

The provided observations are assumed to be centered. One may want to center them using a location estimate first.

Parameters :

observations : array-like, shape = [n_observations, n_features]

The observations, the Mahalanobis distances of which we compute.

Returns :

mahalanobis_distance : array, shape = [n_observations,]

Mahalanobis distances of the observations.
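
A minimal sketch following the centering note above; the data are illustrative, and note that newer scikit-learn releases center observations internally using the estimator's own location estimate, so the manual centering step here reflects the 0.11-era convention documented on this page:

import numpy as np
from sklearn.covariance import EmpiricalCovariance

rng = np.random.RandomState(0)
cov = np.array([[1.0, 0.3],
                [0.3, 2.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=300)

est = EmpiricalCovariance().fit(X)

# New observations whose Mahalanobis distances we want.
new_obs = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=5)

# Per the note above, center the observations with a location estimate first
# (here the training mean, which is close to zero for this data).
centered = new_obs - X.mean(axis=0)

dist = est.mahalanobis(centered)
print(dist)  # one distance per observation, shape (5,)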

score(X_test, assume_centered=False)

Computes the log-likelihood of a Gaussian data set with self.covariance_ as an estimator of its covariance matrix.

Parameters :

X_test : array-like, shape = [n_samples, n_features]

Test data of which we compute the likelihood, where n_samples is the number of samples and n_features is the number of features.

Returns :

res : float

The likelihood of the data set with self.covariance_ as an estimator of its covariance matrix.
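
A brief sketch of scoring held-out data with illustrative samples; only X_test is passed here, since assume_centered already defaults to False in this signature and later releases may not accept the extra keyword:

import numpy as np
from sklearn.covariance import EmpiricalCovariance

rng = np.random.RandomState(0)
cov = np.array([[1.0, 0.3],
                [0.3, 2.0]])
X_train = rng.multivariate_normal([0.0, 0.0], cov, size=500)
X_test = rng.multivariate_normal([0.0, 0.0], cov, size=100)

est = EmpiricalCovariance().fit(X_train)

# Log-likelihood of the held-out samples under the fitted Gaussian model.
print(est.score(X_test))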

set_params(**params)

Set the parameters of the estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter> so that it's possible to update each component of a nested object.

Returns :

self
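
A brief sketch of both forms; the Pipeline and StandardScaler components are illustrative and assume a recent scikit-learn where they are available:

from sklearn.covariance import EmpiricalCovariance
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Simple estimator: parameters are set directly by name.
est = EmpiricalCovariance()
est.set_params(store_precision=False, assume_centered=True)

# Nested object: the <component>__<parameter> syntax addresses a step's parameter.
pipe = Pipeline([('scale', StandardScaler()), ('cov', EmpiricalCovariance())])
pipe.set_params(cov__store_precision=False)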