Publication | Open Access
Bounds for Kullback-Leibler divergence
Citations: 19
References: 3
Year: 2016
Entropy, conditional entropy and mutual information for discrete-valued random variables play important roles in information theory. The purpose of this paper is to present new bounds for the relative entropy $D(p\|q)$ of two probability distributions and then to apply them to simple entropy and mutual information. The relative entropy upper bound obtained is a refinement of a bound previously presented in the literature.
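The abstract concerns bounds on relative entropy; as context, the quantity $D(p\|q)$ itself can be computed directly for discrete distributions. Below is a minimal illustrative sketch (not the paper's method) that evaluates $D(p\|q)$ in bits, together with the elementary bound $D(p\|q) \le \log_2 \max_i (p_i/q_i)$, which follows from $D(p\|q) = \mathbb{E}_p[\log_2(p/q)]$; the distributions `p` and `q` are example values, not from the paper.

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p||q) in bits for discrete distributions p, q.

    Assumes p and q share a common support and q[i] > 0 wherever
    p[i] > 0; otherwise D(p||q) is infinite. Terms with p[i] == 0
    contribute 0 by the convention 0 * log 0 = 0.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example distributions (illustrative only).
p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]

d = kl_divergence(p, q)

# Elementary upper bound: D(p||q) <= log2(max_i p_i / q_i),
# since D is an expectation of log2(p_i/q_i) under p.
bound = math.log2(max(pi / qi for pi, qi in zip(p, q) if pi > 0))

assert 0 <= d <= bound  # D(p||q) >= 0 by Gibbs' inequality
```

The nonnegativity $D(p\|q) \ge 0$ (Gibbs' inequality) and the crude max-ratio upper bound frame the kind of refinement the paper pursues: tighter upper and lower bounds on $D(p\|q)$ than such elementary estimates.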