Publication | Closed Access
Propagation Algorithms for Variational Bayesian Learning
Citations: 288 | References: 8 | Year: 2000 | Venue: Unknown
Variational approximations are becoming a widespread tool for Bayesian learning of graphical models. We provide some theoretical results for the variational updates in a very general family of conjugate-exponential graphical models. We show how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian learning. Applying these results to the Bayesian analysis of linear-Gaussian state-space models we obtain a learning procedure that exploits the Kalman smoothing propagation, while integrating over all model parameters. We demonstrate how this can be used to infer the hidden state dimensionality of the state-space model in a variety of synthetic problems and one real high-dimensional data set.
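The inference step the abstract refers to is, for linear-Gaussian state-space models, a forward-backward (Kalman filtering and Rauch-Tung-Striebel smoothing) pass. Below is a minimal illustrative sketch of that propagation step only, not the paper's full variational Bayesian procedure (which additionally replaces the fixed parameters `A`, `C`, `Q`, `R` with expectations under their variational posteriors). All function and variable names are hypothetical.

```python
import numpy as np

def kalman_smoother(y, A, C, Q, R, mu0, V0):
    """RTS smoother for the linear-Gaussian state-space model
        x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)
        y_t = C x_t + v_t,      v_t ~ N(0, R)
    Returns smoothed means E[x_t | y_1..T] and covariances Cov[x_t | y_1..T].
    """
    T, k = len(y), A.shape[0]
    mu_p = np.zeros((T, k)); V_p = np.zeros((T, k, k))  # predicted
    mu_f = np.zeros((T, k)); V_f = np.zeros((T, k, k))  # filtered

    # forward pass: Kalman filter
    for t in range(T):
        if t == 0:
            mu_p[t], V_p[t] = mu0, V0
        else:
            mu_p[t] = A @ mu_f[t - 1]
            V_p[t] = A @ V_f[t - 1] @ A.T + Q
        S = C @ V_p[t] @ C.T + R                 # innovation covariance
        K = V_p[t] @ C.T @ np.linalg.inv(S)      # Kalman gain
        mu_f[t] = mu_p[t] + K @ (y[t] - C @ mu_p[t])
        V_f[t] = V_p[t] - K @ C @ V_p[t]

    # backward pass: Rauch-Tung-Striebel smoothing
    mu_s, V_s = mu_f.copy(), V_f.copy()
    for t in range(T - 2, -1, -1):
        J = V_f[t] @ A.T @ np.linalg.inv(V_p[t + 1])  # smoother gain
        mu_s[t] = mu_f[t] + J @ (mu_s[t + 1] - mu_p[t + 1])
        V_s[t] = V_f[t] + J @ (V_s[t + 1] - V_p[t + 1]) @ J.T
    return mu_s, V_s
```

In the variational Bayesian setting sketched in the abstract, this same propagation is run inside an EM-like loop: the smoothed moments feed conjugate updates of the parameter posteriors, and averaging over those posteriors is what allows the hidden state dimensionality to be inferred automatically.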