

Poster

An Iterative Min-Min Optimization Method for Sparse Bayesian Learning

Yasen Wang · Junlin Li · Zuogong Yue · Ye Yuan


Abstract:

As a well-known machine learning algorithm, sparse Bayesian learning (SBL) can find sparse representations in linear probabilistic models by imposing a sparsity-promoting prior on the model coefficients. However, classical SBL algorithms lack essential theoretical guarantees of global convergence. To address this issue, we propose an iterative Min-Min optimization method to minimize the marginal likelihood function (MLF) of SBL. At each iteration, the method analytically optimizes the hyperparameters governing both the prior and the noise level by re-expressing the MLF with auxiliary functions. In particular, we prove that the method globally converges to a local minimum or saddle point of the MLF. Backed by these rigorous theoretical guarantees, the proposed SBL algorithm outperforms classical ones at finding sparse representations in simulated and real-world examples, ranging from system identification to sparse signal recovery and kernel regression.
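To make the objects in the abstract concrete, below is a minimal sketch of a classical EM-style SBL update for the linear model y = Phi w + noise. This is the kind of baseline the paper improves upon, not the authors' Min-Min method (whose auxiliary-function updates are not reproduced here), and the initializations and toy data are illustrative assumptions.

```python
import numpy as np

def sbl_em(Phi, y, n_iters=200, tol=1e-8):
    """Classical EM-style sparse Bayesian learning (a standard baseline,
    not the paper's Min-Min method).  Model: y = Phi @ w + noise, with
    prior w_i ~ N(0, gamma_i).  Both the prior hyperparameters gamma and
    the noise variance sigma2 are updated in closed form each iteration."""
    N, M = Phi.shape
    gamma = np.ones(M)                # prior variances; near-zero => pruned atoms
    sigma2 = 0.1 * np.var(y) + 1e-12  # noise variance (illustrative init)
    for _ in range(n_iters):
        # Posterior of w under the current hyperparameters
        Gamma = np.diag(gamma)
        Sigma_y_inv = np.linalg.inv(sigma2 * np.eye(N) + Phi @ Gamma @ Phi.T)
        mu = Gamma @ Phi.T @ Sigma_y_inv @ y                       # posterior mean
        Sigma = Gamma - Gamma @ Phi.T @ Sigma_y_inv @ Phi @ Gamma  # posterior covariance
        # Closed-form EM updates of the hyperparameters
        gamma_new = mu**2 + np.diag(Sigma)
        resid = y - Phi @ mu
        sigma2 = (resid @ resid
                  + sigma2 * np.sum(1.0 - np.diag(Sigma) / np.maximum(gamma, 1e-12))) / N
        if np.max(np.abs(gamma_new - gamma)) < tol:
            gamma = gamma_new
            break
        gamma = gamma_new
    return mu, gamma, sigma2

# Toy usage: recover a 3-sparse coefficient vector from noisy measurements.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((60, 100))
w_true = np.zeros(100); w_true[[5, 37, 80]] = [1.5, -2.0, 1.0]
y = Phi @ w_true + 0.01 * rng.standard_normal(60)
mu, gamma, sigma2 = sbl_em(Phi, y)
print(np.argsort(gamma)[-3:])  # indices of the largest prior variances
```

These per-iteration updates have no global convergence guarantee, which is precisely the gap the proposed Min-Min method addresses.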
