Concepedia

Publication | Closed Access

Guaranteed Bounds on the Kullback–Leibler Divergence of Univariate Mixtures

Citations: 50
References: 28
Year: 2016

Abstract

The Kullback–Leibler (KL) divergence between two mixture models is a fundamental primitive in many signal processing tasks. Since the KL divergence of mixtures does not admit a closed-form formula, in practice it is either estimated using costly Monte Carlo stochastic integration or approximated. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, and the KL divergence of univariate mixtures. We illustrate this versatile method by reporting on our experiments approximating the KL divergence between Gaussian mixture models.
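
For context, the quantity being bounded is KL(p:q) = E_{x~p}[log p(x) - log q(x)], which for mixtures is usually estimated by the Monte Carlo stochastic integration mentioned in the abstract. The sketch below is an illustrative Monte Carlo estimator for two univariate Gaussian mixtures, not the paper's method; all function and parameter names are hypothetical.

import numpy as np
from scipy.stats import norm

def gmm_pdf(x, w, mu, sigma):
    # Density of a univariate Gaussian mixture with weights w, means mu, std devs sigma.
    return np.sum(w[:, None] * norm.pdf(x[None, :], loc=mu[:, None], scale=sigma[:, None]), axis=0)

def gmm_sample(n, w, mu, sigma, rng):
    # Draw n samples: pick a mixture component per draw, then sample that Gaussian.
    comps = rng.choice(len(w), size=n, p=w)
    return rng.normal(mu[comps], sigma[comps])

def kl_monte_carlo(n, p_params, q_params, seed=0):
    # Monte Carlo estimate of KL(p:q) = E_p[log p(x) - log q(x)], sampling from p.
    rng = np.random.default_rng(seed)
    x = gmm_sample(n, *p_params, rng)
    return float(np.mean(np.log(gmm_pdf(x, *p_params)) - np.log(gmm_pdf(x, *q_params))))

# Two toy univariate Gaussian mixture models (hypothetical parameters).
p = (np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([0.5, 0.5]))
q = (np.array([0.3, 0.7]), np.array([0.0, 2.0]), np.array([1.0, 0.5]))
print(kl_monte_carlo(100_000, p, q))

Such stochastic estimates fluctuate with the sample size; the paper's contribution is to replace them with deterministic, algorithmically constructed closed-form lower and upper bounds.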
