Publication | Closed Access
Unsupervised learning of finite mixture models
Citations: 2.1K
References: 51
Year: 2002
Topics: EM Algorithm, Mixture Distribution, Engineering, Machine Learning, Mixture Models, Data Science, Data Mining, Knowledge Discovery, Mixture of Experts, Unsupervised Machine Learning, Statistical Inference, Finite Mixture Models, Statistical Learning Theory, Unsupervised Learning, Statistics
The authors propose an unsupervised algorithm that learns finite mixture models from multivariate data while simultaneously selecting the number of components. The method integrates estimation and model selection, requires no careful initialization, avoids singular convergence, and can be applied to any parametric mixture with an EM routine, as demonstrated on Gaussian mixtures. Experiments confirm the algorithm’s strong performance.
This paper proposes an unsupervised algorithm for learning a finite mixture model from multivariate data. The adjective "unsupervised" is justified by two properties of the algorithm: 1) it is capable of selecting the number of components and 2) unlike the standard expectation-maximization (EM) algorithm, it does not require careful initialization. The proposed method also avoids another drawback of EM for mixture fitting: the possibility of convergence toward a singular estimate at the boundary of the parameter space. The novelty of our approach is that we do not use a model selection criterion to choose one among a set of preestimated candidate models; instead, we seamlessly integrate estimation and model selection in a single algorithm. Our technique can be applied to any type of parametric mixture model for which it is possible to write an EM algorithm; in this paper, we illustrate it with experiments involving Gaussian mixtures. These experiments testify to the good performance of our approach.
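The idea of folding component selection into the EM iterations themselves can be illustrated with a minimal sketch. The code below fits a 1-D Gaussian mixture by EM, starting from more components than needed and annihilating components whose mixing weight collapses. Note the hedges: the pruning rule used here (a fixed weight threshold `min_weight`) is a simplified stand-in for the MML-based annihilation criterion the paper actually derives, and the variance floor is only a crude guard against the singular estimates the authors handle more principledly; the function name and all parameters are illustrative, not from the paper.

```python
import math
import random


def em_gmm_1d(data, k_init=6, iters=200, min_weight=0.05, seed=0):
    """EM for a 1-D Gaussian mixture that prunes weak components.

    Illustrative sketch only: starts with k_init components and drops
    any whose mixing weight falls below min_weight -- a simplified
    stand-in for the paper's MML-based annihilation step.
    """
    rng = random.Random(seed)
    n = len(data)
    mean_all = sum(data) / n
    var_all = sum((x - mean_all) ** 2 for x in data) / n
    # Initialize: means at random data points, shared variance, equal weights.
    means = [rng.choice(data) for _ in range(k_init)]
    variances = [var_all] * k_init
    weights = [1.0 / k_init] * k_init

    def pdf(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(iters):
        # E-step: responsibilities resp[i][j] = P(component j | x_i).
        resp = []
        for x in data:
            p = [w * pdf(x, m, v) for w, m, v in zip(weights, means, variances)]
            s = sum(p) or 1e-300
            resp.append([pj / s for pj in p])
        # M-step: reestimate weights, means, and variances.
        for j in range(len(means)):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / n
            if nj > 1e-12:
                means[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
                variances[j] = max(
                    sum(r[j] * (x - means[j]) ** 2 for r, x in zip(resp, data)) / nj,
                    1e-6,  # variance floor: crude guard against singularities
                )
        # Annihilation step: remove negligible components, renormalize weights.
        keep = [j for j, w in enumerate(weights) if w >= min_weight]
        if len(keep) < len(weights):
            means = [means[j] for j in keep]
            variances = [variances[j] for j in keep]
            weights = [weights[j] for j in keep]
            s = sum(weights)
            weights = [w / s for w in weights]
    return weights, means, variances
```

On data drawn from two well-separated Gaussians, a run starting from six components typically ends with far fewer, without any outer loop over candidate model sizes; that single-pass behavior is the property the abstract emphasizes, though the real algorithm decides annihilation via its minimum-message-length objective rather than a hand-set threshold.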