Publication | Closed Access
Computationally Efficient Sparse Bayesian Learning via Belief Propagation
Year: 2010 · Citations: 67 · References: 33
Keywords: Sparse Representation, Engineering, Machine Learning, Data Science, Uncertainty Quantification, Pattern Recognition, Compressive Sensing, Signal Reconstruction, Computational Complexity, Bayesian Network, Statistical Inference, Computer Science, Inverse Problems, Sparse Bayesian Learning, Signal Processing, Belief Propagation, Bayesian Inference, Bayesian Hierarchical Modeling
We present a belief propagation (BP)-based sparse Bayesian learning (SBL) algorithm, referred to as BP-SBL, to recover sparse transform coefficients in large-scale compressed sensing problems. BP-SBL is based on a widely used hierarchical Bayesian model, which is converted into a factor graph so that BP can be applied to achieve computational efficiency. We prove that the messages in BP are Gaussian probability density functions; therefore, each message update reduces to updating a mean and a variance. The computational complexity of BP-SBL is proportional to the number of transform coefficients, allowing the algorithm to handle large-scale compressed sensing problems efficiently. Numerical examples are provided to demonstrate the effectiveness of BP-SBL.
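The abstract's key computational point, that all BP messages remain Gaussian, so each update only propagates a mean and a variance, can be illustrated with a minimal sketch. This is a generic Gaussian message product at a variable node, not the paper's actual BP-SBL update rule; the function name is hypothetical.

```python
import numpy as np

def combine_gaussian_messages(means, variances):
    """Product of Gaussian messages at a variable node:
    precisions (inverse variances) add, and the combined mean
    is the precision-weighted average of the incoming means.
    Only (mean, variance) pairs are ever stored or passed."""
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    var = 1.0 / precisions.sum()
    mean = var * (precisions * means).sum()
    return mean, var

# Two incoming messages N(1.0, 1.0) and N(3.0, 1.0):
m, v = combine_gaussian_messages([1.0, 3.0], [1.0, 1.0])
print(m, v)  # 2.0 0.5
```

Because each message is summarized by two scalars, the per-iteration cost scales with the number of edges in the factor graph, which is consistent with the abstract's claim of complexity proportional to the number of transform coefficients.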