

Poster

Generalized Approximate Survey Propagation for High-Dimensional Estimation

Carlo Lucibello · Luca Saglietti · Yue Lu

Pacific Ballroom #160

Keywords: [ Sparsity and Compressed Sensing ] [ Non-convex Optimization ] [ Information Theory and Estimation ] [ Bayesian Methods ] [ Approximate Inference ]


Abstract:

In Generalized Linear Estimation (GLE) problems, we seek to estimate a signal that is observed through a linear transform followed by a component-wise, possibly nonlinear and noisy, channel. In the Bayesian optimal setting, Generalized Approximate Message Passing (GAMP) is known to achieve optimal performance for GLE. However, its performance can significantly deteriorate whenever there is a mismatch between the assumed and the true generative model, a situation frequently encountered in practice. In this paper, we propose a new algorithm, named Generalized Approximate Survey Propagation (GASP), for solving GLE in the presence of prior or model misspecifications. As a prototypical example, we consider the phase retrieval problem, where we show that GASP outperforms the corresponding GAMP, reducing the reconstruction threshold and, for certain choices of its parameters, approaching Bayesian optimal performance. Furthermore, we present a set of state evolution equations that can precisely characterize the performance of GASP in the high-dimensional limit.
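
For concreteness, the GLE observation model described above can be sketched as follows; the symbols $x^\star$, $A$, $p_{\mathrm{out}}$, $M$, and $N$ are illustrative notation, not taken from this page. An unknown signal $x^\star \in \mathbb{R}^N$ is passed through a known sensing matrix $A \in \mathbb{R}^{M \times N}$ and then through a component-wise (possibly nonlinear, noisy) channel:

$$ z = A x^\star, \qquad y_\mu \sim p_{\mathrm{out}}(y_\mu \mid z_\mu), \quad \mu = 1, \dots, M. $$

In the phase retrieval example considered in the paper, the channel discards sign/phase information, e.g. $y_\mu = |z_\mu|$ in the noiseless case, so the estimator must recover both the signal and the missing phases.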
