SAVE - Space alternating variational estimation for sparse Bayesian learning

Kurisummoottil Thomas, Christo; Slock, Dirk TM
DSW 2018, IEEE Data Science Workshop, June 4-6, 2018, Lausanne, Switzerland

In this paper, we address the fundamental problem of sparse signal recovery in a Bayesian framework. The computational complexity associated with Sparse Bayesian Learning (SBL) renders it infeasible even for moderately large problem sizes. To address this issue, we propose a fast version of SBL using Variational Bayesian (VB) inference. VB allows one to obtain analytical approximations to the posterior distributions of interest even when exact inference of these distributions is intractable. We propose a novel fast algorithm called space alternating variational estimation (SAVE), which is a version of VB(-SBL) pushed to the scalar level. Just as SAGE (space-alternating generalized expectation maximization) improves on EM, the component-wise approach of SAVE compared to SBL renders it less likely to get stuck in bad local optima, and its inherent damping (more cautious progression) also typically leads to faster convergence of the non-convex optimization process. Simulation results show that the proposed algorithm converges faster and achieves a lower MSE than other state-of-the-art fast SBL methods.
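The abstract describes component-wise (scalar-level) mean-field VB updates for SBL. As an illustration only, a minimal NumPy sketch of such a scheme is given below for the standard model y = Ax + v with Gaussian priors x_i ~ N(0, 1/alpha_i), noise precision gamma, and Gamma hyperpriors; the hyperparameter names (a, b, c, d), their default values, and the update order are generic VB-SBL conventions assumed here, not necessarily the authors' exact formulation.

```python
import numpy as np

def save_sbl(A, y, n_iter=200, a=1e-6, b=1e-6, c=1e-6, d=1e-6):
    """Illustrative scalar-level mean-field VB sketch for SBL.

    Model (assumed, generic VB-SBL): y = A x + v, x_i ~ N(0, 1/alpha_i),
    v ~ N(0, (1/gamma) I), Gamma(a, b) hyperpriors on alpha_i and
    Gamma(c, d) on gamma. Not the paper's exact algorithm.
    """
    M, N = A.shape
    col_norm2 = np.sum(A ** 2, axis=0)          # ||a_i||^2 per column
    mu = np.zeros(N)                            # posterior means of x
    sig2 = np.ones(N)                           # posterior variances of x
    alpha = np.ones(N)                          # precision estimates for x_i
    gamma = 1.0                                 # noise precision estimate
    for _ in range(n_iter):
        r = y - A @ mu                          # residual, kept in sync below
        for i in range(N):                      # space-alternating sweep
            r += A[:, i] * mu[i]                # remove i-th contribution
            sig2[i] = 1.0 / (gamma * col_norm2[i] + alpha[i])
            mu[i] = gamma * sig2[i] * (A[:, i] @ r)
            r -= A[:, i] * mu[i]                # restore with updated mean
        # Gamma-posterior mean updates for the hyperparameters
        alpha = (a + 0.5) / (b + 0.5 * (mu ** 2 + sig2))
        err2 = np.sum(r ** 2) + col_norm2 @ sig2  # E||y - A x||^2
        gamma = (c + 0.5 * M) / (d + 0.5 * err2)
    return mu, alpha, gamma
```

Updating one scalar at a time with an incrementally maintained residual avoids the matrix inversion of standard SBL, which is the source of the complexity reduction the abstract refers to.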

Type:
Conference
City:
Lausanne
Date:
2018-06-04
Department:
Communication Systems
Eurecom Ref:
5543
Copyright:
© 2018 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PERMALINK : https://www.eurecom.fr/publication/5543