Computationally Efficient Sparse Bayesian Learning via Generalized Approximate Message Passing

From MaRDI portal
Publication:6258275

arXiv: 1501.04762 · MaRDI QID: Q6258275

Fuwei Li, Huiping Duan, Hongbin Li, Jun Fang, Zhi Chen

Publication date: 20 January 2015

Abstract: The sparse Bayesian learning (also referred to as Bayesian compressed sensing) algorithm is one of the most popular approaches for sparse signal recovery and has demonstrated superior performance in a series of experiments. Nevertheless, its computational complexity grows rapidly with the signal dimension, since a matrix inversion is required at each iteration, which hinders its application to many practical problems even with moderately large data sets. To address this issue, in this paper we propose a computationally efficient sparse Bayesian learning method based on the generalized approximate message passing (GAMP) technique. Specifically, the algorithm is developed within an expectation-maximization (EM) framework, using GAMP to efficiently compute an approximation of the posterior distribution of the hidden variables. The hyperparameters associated with the hierarchical Gaussian prior are learned by iteratively maximizing the Q-function, which is computed from the posterior approximation obtained via GAMP. Numerical results are provided to illustrate the computational efficiency and effectiveness of the proposed algorithm.
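The EM-GAMP scheme described in the abstract can be sketched as follows: an inner GAMP loop approximates the posterior mean and variance of each coefficient under a zero-mean Gaussian prior with per-coefficient precisions, and an outer EM step re-estimates those precisions from the posterior moments. This is a minimal illustrative sketch, not the authors' reference implementation (see the linked repository for that); the function name, dimensions, damping factor, and iteration counts are assumptions chosen for the demo.

```python
import numpy as np

def gamp_sbl(A, y, sigma2, em_iters=15, gamp_iters=10, damp=0.9):
    """Sketch of sparse Bayesian learning via EM-GAMP.

    Model: y = A x + n, n ~ N(0, sigma2 I), prior x_i ~ N(0, 1/alpha_i).
    The precisions alpha_i are learned by EM; GAMP supplies the posterior
    mean x_hat and variance tau_x needed for the Q-function maximization.
    """
    m, n = A.shape
    A2 = A ** 2                      # entrywise squares, used for variance passing
    x_hat, tau_x = np.zeros(n), np.ones(n)
    s, alpha = np.zeros(m), np.ones(n)
    for _ in range(em_iters):
        # E-step (approximate): inner GAMP loop, warm-started across EM iterations
        for _ in range(gamp_iters):
            # output (measurement) side
            tau_p = A2 @ tau_x
            p = A @ x_hat - tau_p * s
            s = damp * ((y - p) / (tau_p + sigma2)) + (1 - damp) * s  # mild damping
            tau_s = 1.0 / (tau_p + sigma2)
            # input (signal) side
            tau_r = 1.0 / (A2.T @ tau_s)
            r = x_hat + tau_r * (A.T @ s)
            # MMSE denoiser for the Gaussian prior N(0, 1/alpha_i)
            gain = 1.0 / (1.0 + alpha * tau_r)
            x_hat = gain * r
            tau_x = gain * tau_r
        # M-step: maximize the Q-function -> alpha_i = 1 / E[x_i^2 | y]
        alpha = 1.0 / np.maximum(x_hat ** 2 + tau_x, 1e-12)
    return x_hat, tau_x

# Demo on synthetic data (dimensions, sparsity, and noise level are arbitrary choices)
rng = np.random.default_rng(0)
m, n, k = 120, 256, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
sigma2 = 1e-4
y = A @ x0 + np.sqrt(sigma2) * rng.standard_normal(m)
x_est, _ = gamp_sbl(A, y, sigma2)
nmse = np.sum((x_est - x0) ** 2) / np.sum(x0 ** 2)
```

Note that the per-iteration cost is dominated by matrix-vector products with A and its entrywise square, avoiding the matrix inversion of standard SBL.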

Has companion code repository: https://github.com/livey/GAMP_SBL

