class sklearn.mixture.GaussianMixture(n_components=1, covariance_type='full', tol=0.001, reg_covar=1e-06, max_iter=100, n_init=1, init_params='kmeans', weights_init=None, means_init=None, precisions_init=None, random_state=None, warm_start=False, verbose=0, verbose_interval=10) [source]
Gaussian Mixture.
Representation of a Gaussian mixture model probability distribution. This class allows estimation of the parameters of a Gaussian mixture distribution.
Read more in the User Guide.
New in version 0.18.
| Parameters: | |
|---|---|
| n_components : int, defaults to 1 | The number of mixture components. |
| covariance_type : {'full', 'tied', 'diag', 'spherical'}, defaults to 'full' | String describing the type of covariance parameters to use: 'full' (each component has its own general covariance matrix), 'tied' (all components share the same general covariance matrix), 'diag' (each component has its own diagonal covariance matrix), or 'spherical' (each component has its own single variance). |
| tol : float, defaults to 1e-3 | The convergence threshold. EM iterations stop when the lower bound average gain is below this threshold. |
| reg_covar : float, defaults to 1e-6 | Non-negative regularization added to the diagonal of covariance, ensuring that the covariance matrices are all positive. |
| max_iter : int, defaults to 100 | The number of EM iterations to perform. |
| n_init : int, defaults to 1 | The number of initializations to perform. The best results are kept. |
| init_params : {'kmeans', 'random'}, defaults to 'kmeans' | The method used to initialize the weights, the means and the precisions. |
| weights_init : array-like, shape (n_components,), optional | The user-provided initial weights. If None, weights are initialized using the init_params method. |
| means_init : array-like, shape (n_components, n_features), optional | The user-provided initial means. If None, means are initialized using the init_params method. |
| precisions_init : array-like, optional | The user-provided initial precisions (inverse of the covariance matrices). If None, precisions are initialized using the init_params method. The shape depends on covariance_type. |
| random_state : int, RandomState instance or None, optional (default=None) | Controls the random seed used to initialize the parameters. |
| warm_start : bool, defaults to False | If True, the solution of the last fitting is used as initialization for the next call of fit(). This can speed up convergence when fit is called several times on similar problems. |
| verbose : int, defaults to 0 | Enable verbose output. If 1, prints the current initialization and each iteration step. If greater than 1, also prints the log probability and the time needed for each step. |
| verbose_interval : int, defaults to 10 | Number of iterations done between prints. |

| Attributes: | |
|---|---|
| weights_ : array-like, shape (n_components,) | The weights of each mixture component. |
| means_ : array-like, shape (n_components, n_features) | The mean of each mixture component. |
| covariances_ : array-like | The covariance of each mixture component. The shape depends on covariance_type. |
| precisions_ : array-like | The precision matrices of each component (the inverse of the covariance matrices). The shape depends on covariance_type. |
| precisions_cholesky_ : array-like | The Cholesky decomposition of the precision matrices of each mixture component. The shape depends on covariance_type. |
| converged_ : bool | True when convergence was reached in fit(), False otherwise. |
| n_iter_ : int | Number of EM steps used by the best fit to reach convergence. |
| lower_bound_ : float | Lower bound value on the log-likelihood (of the training data with respect to the model) of the best fit of EM. |
See also
BayesianGaussianMixture
| Method | Description |
|---|---|
| aic(X) | Akaike information criterion for the current model on the input X. |
| bic(X) | Bayesian information criterion for the current model on the input X. |
| fit(X[, y]) | Estimate model parameters with the EM algorithm. |
| fit_predict(X[, y]) | Estimate model parameters using X and predict the labels for X. |
| get_params([deep]) | Get parameters for this estimator. |
| predict(X) | Predict the labels for the data samples in X using the trained model. |
| predict_proba(X) | Predict posterior probability of each component given the data. |
| sample([n_samples]) | Generate random samples from the fitted Gaussian distribution. |
| score(X[, y]) | Compute the per-sample average log-likelihood of the given data X. |
| score_samples(X) | Compute the weighted log probabilities for each sample. |
| set_params(**params) | Set the parameters of this estimator. |
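A minimal usage sketch. The two-blob synthetic data, the random seed and the choice of n_components=2 are illustrative assumptions, not part of this reference:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Two well-separated blobs of synthetic 2-D data (illustrative only).
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(200, 2) + [5, 5], rng.randn(200, 2)])

gm = GaussianMixture(n_components=2, covariance_type='full', random_state=0)
gm.fit(X)

print(gm.means_)       # estimated component means, shape (2, 2)
print(gm.weights_)     # mixing weights, sum to 1
print(gm.converged_)   # True if EM converged within max_iter
```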
__init__(n_components=1, covariance_type='full', tol=0.001, reg_covar=1e-06, max_iter=100, n_init=1, init_params='kmeans', weights_init=None, means_init=None, precisions_init=None, random_state=None, warm_start=False, verbose=0, verbose_interval=10) [source]
aic(X) [source]
Akaike information criterion for the current model on the input X.
| Parameters: | X : array of shape (n_samples, n_dimensions) |
|---|---|
| Returns: | aic : float. The lower the better. |
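A sketch comparing AIC across candidate component counts; the synthetic data and the range of k are illustrative assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(150, 2), rng.randn(150, 2) + [4, 0]])

for k in (1, 2, 3):
    gm = GaussianMixture(n_components=k, random_state=0).fit(X)
    print(k, gm.aic(X))   # lower AIC indicates a better trade-off of fit vs. complexity
```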
bic(X) [source]
Bayesian information criterion for the current model on the input X.
| Parameters: | X : array of shape (n_samples, n_dimensions) |
|---|---|
| Returns: | bic : float. The lower the better. |
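A common pattern is to pick the number of components by the lowest BIC. A minimal sketch, assuming synthetic data and an illustrative candidate range:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(150, 2), rng.randn(150, 2) + [4, 0]])

# Fit one model per candidate component count and keep the one with the lowest BIC.
candidates = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 6)]
best = min(candidates, key=lambda gm: gm.bic(X))
print(best.n_components, best.bic(X))
```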
fit(X, y=None) [source]
Estimate model parameters with the EM algorithm.
The method fits the model n_init times and keeps the parameters with which the model has the largest likelihood or lower bound. Within each trial, the method alternates between the E-step and the M-step for at most max_iter iterations, stopping once the change in likelihood or lower bound is less than tol; if convergence is not reached, a ConvergenceWarning is raised. If warm_start is True, n_init is ignored and a single initialization is performed upon the first call; consecutive calls resume training where it left off.

| Parameters: | X : array-like, shape (n_samples, n_features). List of n_features-dimensional data points. Each row corresponds to a single data point. |
|---|---|
| Returns: | self |
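A sketch of the n_init and warm_start behaviour described above; the data, seed and max_iter value are illustrative assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(200, 2), rng.randn(200, 2) + [6, 6]])

# n_init=5: run EM from 5 initializations and keep the best lower bound.
gm = GaussianMixture(n_components=2, n_init=5, random_state=0).fit(X)
print(gm.n_iter_, gm.lower_bound_)

# With warm_start=True, a second fit() resumes from the previous parameters.
gm_ws = GaussianMixture(n_components=2, warm_start=True, max_iter=10, random_state=0)
gm_ws.fit(X)
gm_ws.fit(X)   # continues from where the first call stopped
print(gm_ws.converged_)
```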
fit_predict(X, y=None) [source]
Estimate model parameters using X and predict the labels for X.
The method fits the model n_init times and keeps the parameters with which the model has the largest likelihood or lower bound. Within each trial, the method alternates between the E-step and the M-step for at most max_iter iterations, stopping once the change in likelihood or lower bound is less than tol; if convergence is not reached, a ConvergenceWarning is raised. After fitting, it predicts the most probable label for the input data points.
New in version 0.20.

| Parameters: | X : array-like, shape (n_samples, n_features). List of n_features-dimensional data points. Each row corresponds to a single data point. |
|---|---|
| Returns: | labels : array, shape (n_samples,). Component labels. |
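A minimal sketch of fit_predict on synthetic data (the data and seed are assumptions):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(100, 2), rng.randn(100, 2) + [5, 0]])

labels = GaussianMixture(n_components=2, random_state=0).fit_predict(X)
print(labels.shape)        # (200,), one label per sample
print(np.unique(labels))   # component indices, e.g. [0 1]
```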
get_params(deep=True) [source]
Get parameters for this estimator.
| Parameters: | deep : boolean, optional. If True, will return the parameters for this estimator and contained subobjects that are estimators. |
|---|---|
| Returns: | params : mapping of string to any. Parameter names mapped to their values. |
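A short sketch; the constructor arguments are illustrative:

```python
from sklearn.mixture import GaussianMixture

gm = GaussianMixture(n_components=3, covariance_type='diag')
params = gm.get_params()
print(params['n_components'], params['covariance_type'])   # 3 diag
```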
predict(X) [source]
Predict the labels for the data samples in X using the trained model.

| Parameters: | X : array-like, shape (n_samples, n_features). List of n_features-dimensional data points. Each row corresponds to a single data point. |
|---|---|
| Returns: | labels : array, shape (n_samples,). Component labels. |
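A minimal sketch of hard assignment of new points with predict; the training data and the query points are illustrative assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
X_train = np.vstack([rng.randn(100, 2), rng.randn(100, 2) + [5, 5]])
X_new = np.array([[0.0, 0.0], [5.0, 5.0]])

gm = GaussianMixture(n_components=2, random_state=0).fit(X_train)
print(gm.predict(X_new))   # hard component assignment for each new point
```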
predict_proba(X) [source]
Predict posterior probability of each component given the data.
| Parameters: | X : array-like, shape (n_samples, n_features). List of n_features-dimensional data points. Each row corresponds to a single data point. |
|---|---|
| Returns: | resp : array, shape (n_samples, n_components). Posterior probability of each Gaussian component given each sample in X. |
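A sketch showing that each row of the returned array is a probability distribution over components; the 1-D synthetic data and the query point are assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(100, 1), rng.randn(100, 1) + 4])

gm = GaussianMixture(n_components=2, random_state=0).fit(X)
resp = gm.predict_proba([[2.0]])   # a point roughly between the two components
print(resp.shape)                  # (1, 2)
print(resp.sum(axis=1))            # each row sums to 1
```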
sample(n_samples=1) [source]
Generate random samples from the fitted Gaussian distribution.
| Parameters: | n_samples : int, optional. Number of samples to generate. Defaults to 1. |
|---|---|
| Returns: | X : array, shape (n_samples, n_features). Randomly generated samples. y : array, shape (n_samples,). Component labels of the generated samples. |
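A minimal sketch of drawing samples from a fitted model (the training data is an assumption; sample requires a fitted model):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(100, 2), rng.randn(100, 2) + [4, 4]])

gm = GaussianMixture(n_components=2, random_state=0).fit(X)
X_new, y_new = gm.sample(n_samples=5)   # the model must be fitted first
print(X_new.shape, y_new)               # (5, 2) and the generating component of each sample
```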
score(X, y=None) [source]
Compute the per-sample average log-likelihood of the given data X.
| Parameters: | X : array-like, shape (n_samples, n_dimensions). List of n_features-dimensional data points. Each row corresponds to a single data point. |
|---|---|
| Returns: | log_likelihood : float. Log-likelihood of the Gaussian mixture given X. |
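Since score(X) is the per-sample average log-likelihood, it equals the mean of score_samples(X). A sketch on assumed synthetic data:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
X = rng.randn(200, 2)

gm = GaussianMixture(n_components=1, random_state=0).fit(X)
# score(X) is the mean of the per-sample log-likelihoods from score_samples(X).
print(np.isclose(gm.score(X), gm.score_samples(X).mean()))   # True
```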
score_samples(X) [source]
Compute the weighted log probabilities for each sample.
| Parameters: | X : array-like, shape (n_samples, n_features). List of n_features-dimensional data points. Each row corresponds to a single data point. |
|---|---|
| Returns: | log_prob : array, shape (n_samples,). Log probabilities of each data point in X. |
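One illustrative use of the per-sample log probabilities is flagging unlikely points; the data and the 1% threshold below are assumptions, not a library default:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
X = rng.randn(500, 2)

gm = GaussianMixture(n_components=1, random_state=0).fit(X)
log_prob = gm.score_samples(X)            # shape (500,), one value per sample
threshold = np.percentile(log_prob, 1)    # flag the 1% least likely points
print((log_prob < threshold).sum())
```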
set_params(**params) [source]
Set the parameters of this estimator.
The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.
| Returns: | self |
|---|---|
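A sketch of the nested <component>__<parameter> form mentioned above, using a Pipeline and StandardScaler purely for illustration:

```python
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.mixture import GaussianMixture

pipe = Pipeline([('scale', StandardScaler()), ('gmm', GaussianMixture(n_components=2))])

# Nested parameters use the <component>__<parameter> form.
pipe.set_params(gmm__n_components=3)
print(pipe.named_steps['gmm'].n_components)   # 3
```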
© 2007–2018 The scikit-learn developers
Licensed under the 3-clause BSD License.
http://scikit-learn.org/stable/modules/generated/sklearn.mixture.GaussianMixture.html