Publication | Closed Access
Guaranteed Bounds on the Kullback–Leibler Divergence of Univariate Mixtures
50 Citations · 28 References · 2016
Statistical Signal Processing · Mixture Distribution · Engineering · Machine Learning · Data Science · Mixture Models · Entropy · Univariate Mixtures · Mixture Analysis · Gaussian Mixture Models · Density Estimation · Mixture of Experts · Speech Processing · Statistical Inference · Probability Theory · KL Divergence · Signal Processing · Statistics
The Kullback–Leibler (KL) divergence between two mixture models is a fundamental primitive in many signal processing tasks. Since the KL divergence of mixtures does not admit a closed-form formula, in practice it is either estimated using costly Monte-Carlo stochastic integration or approximated. We present a fast, generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, and the KL divergence of univariate mixtures. We illustrate the versatility of the method by reporting experiments on approximating the KL divergence between Gaussian mixture models.
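As context for the abstract, the sketch below shows the costly Monte-Carlo baseline the paper contrasts with its closed-form bounds: estimating KL(p || q) between two univariate Gaussian mixtures by sampling from p. This is not the paper's bounding method, and the mixture parameters are hypothetical, chosen only for illustration.

```python
# Monte-Carlo estimate of KL(p || q) for univariate Gaussian mixtures.
# This is the stochastic-integration baseline mentioned in the abstract,
# NOT the paper's algorithmic closed-form bounds.
import numpy as np
from scipy.stats import norm

def gmm_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture evaluated at points x."""
    x = np.atleast_1d(x)[:, None]                      # shape (n, 1)
    return np.sum(weights * norm.pdf(x, loc=means, scale=stds), axis=1)

def gmm_sample(n, weights, means, stds, rng):
    """Draw n i.i.d. samples from a univariate Gaussian mixture."""
    comps = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comps], stds[comps])

def kl_monte_carlo(p, q, n=100_000, seed=0):
    """Estimate KL(p || q) = E_p[log p(X) - log q(X)] by sampling from p."""
    rng = np.random.default_rng(seed)
    x = gmm_sample(n, *p, rng)
    return np.mean(np.log(gmm_pdf(x, *p)) - np.log(gmm_pdf(x, *q)))

# Hypothetical two-component mixtures (weights, means, stds), for demo only.
p = (np.array([0.6, 0.4]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
q = (np.array([0.5, 0.5]), np.array([0.0, 2.5]), np.array([0.7, 0.8]))
print(f"MC estimate of KL(p || q): {kl_monte_carlo(p, q):.4f}")
```

The estimate converges slowly (error shrinks as 1/sqrt(n)), which is what motivates deterministic lower and upper bounds of the kind the paper constructs.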