Publication | Open Access
$I$-Divergence Geometry of Probability Distributions and Minimization Problems
Citations: 1.6K | References: 16 | Year: 1975
Topics: Mathematical Programming, Engineering, Minimization Problems, Semidefinite Programming, Pattern Recognition, Stochastic Geometry, Combinatorial Optimization, Computational Geometry, Approximation Theory, Low-rank Approximation, Minimizing PD, Dirichlet Form, Inverse Problems, Probability Theory, Computer Science, Conic Optimization, Lagrange Multiplier Technique, Natural Generalization, Convex Optimization, Semi-definite Optimization
Kullback’s $I$-divergence plays the role of a squared Euclidean distance, revealing geometric properties of probability distributions. Avoiding the Lagrange multiplier technique in favor of a direct analytic approach, the paper views the minimum discrimination information problem as the projection of a probability distribution onto a convex set of distributions, establishes existence theorems for and characterizations of the minimizer, generalizes known iterative algorithms with a more broadly valid convergence proof, and, as corollaries, extends known results on the existence of probability distributions or nonnegative matrices of a given form.
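For reference, the central quantities can be written out explicitly (finite discrete notation is assumed here for simplicity; the paper works with general probability measures): the $I$-divergence of a distribution $P = (p_i)$ from $Q = (q_i)$ is

$$D(P \,\|\, Q) = \sum_i p_i \log \frac{p_i}{q_i},$$

and the minimum discrimination information problem is the $I$-projection of $Q$ onto a convex set $E$ of distributions,

$$P^* = \arg\min_{P \in E} D(P \,\|\, Q).$$

The Euclidean analogy rests on the paper's "Pythagorean" inequality: for every $P \in E$,

$$D(P \,\|\, Q) \ge D(P \,\|\, P^*) + D(P^* \,\|\, Q),$$

the counterpart of the obtuse-angle property of Euclidean projection onto a convex set.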
Some geometric properties of probability distributions (PD's) are established, Kullback's $I$-divergence playing the role of squared Euclidean distance. The minimum discrimination information problem is viewed as that of projecting a PD onto a convex set of PD's, and useful existence theorems for and characterizations of the minimizing PD are arrived at. A natural generalization of known iterative algorithms converging to the minimizing PD in special situations is given; even for those special cases, our convergence proof is more generally valid than those previously published. As corollaries of independent interest, generalizations of known results on the existence of PD's or nonnegative matrices of a certain form are obtained. The Lagrange multiplier technique is not used.
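The best-known special case of these iterative algorithms is classical iterative proportional fitting, which fits a nonnegative matrix to prescribed row and column sums by alternating $I$-projections onto the two (linear, hence convex) marginal-constraint sets. Below is a minimal sketch, assuming a strictly positive starting matrix and consistent marginals; the function name and tolerance are illustrative, not from the paper:

```python
import numpy as np

def iterative_proportional_fitting(q, row_sums, col_sums,
                                   tol=1e-10, max_iter=1000):
    """Alternately rescale rows and columns of a positive matrix q
    until its marginals match row_sums and col_sums.  Each half-step
    is the I-projection onto the set of matrices with the prescribed
    row (respectively column) marginals."""
    p = np.array(q, dtype=float)
    for _ in range(max_iter):
        p *= (row_sums / p.sum(axis=1))[:, np.newaxis]  # match row sums
        p *= col_sums / p.sum(axis=0)                   # match column sums
        if np.allclose(p.sum(axis=1), row_sums, atol=tol):
            break  # both marginals now agree to within tol
    return p

# Example: project the uniform 2x2 table onto marginals (0.7, 0.3), (0.6, 0.4)
q = np.full((2, 2), 0.25)
p_star = iterative_proportional_fitting(
    q, np.array([0.7, 0.3]), np.array([0.6, 0.4]))
print(p_star)                                   # the I-projection of q
print(p_star.sum(axis=1), p_star.sum(axis=0))   # recovered marginals
```

In this language, the paper's contribution is a convergence proof for such alternating $I$-projections that remains valid for more general convex constraint sets than the marginal case.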