

Poster

Accelerating Convergence in Bayesian Few-Shot Classification

Tianjun Ke · Haoqun Cao · Feng Zhou


Abstract:

Bayesian few-shot classification has been a focal point in the field of few-shot learning. This paper integrates mirror descent-based variational inference into Gaussian process-based few-shot classification, addressing the challenge of non-conjugate inference. By leveraging non-Euclidean geometry, mirror descent achieves accelerated convergence by following the steepest descent direction along the corresponding manifold. It also exhibits parameterization invariance with respect to the variational distribution. Experimental results demonstrate competitive classification accuracy, improved uncertainty quantification, and faster convergence compared to baseline models. Additionally, we investigate the impact of hyperparameters and model components.
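The mirror descent idea the abstract alludes to can be illustrated in a much simpler setting than variational GP inference. The sketch below (a toy example, not the authors' method) runs mirror descent on the probability simplex with the negative-entropy mirror map, i.e. the exponentiated-gradient update: the gradient step is taken in the dual (log) space, and mapping back through a softmax keeps every iterate on the simplex. The objective, step size, and target distribution are all hypothetical choices for illustration.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, lr=0.5, steps=200):
    """Mirror descent on the probability simplex with the
    negative-entropy mirror map (exponentiated gradient).
    Dual-space step: y = log(x) - lr * grad(x); the softmax
    of y is the next primal iterate, so iterates stay on the
    simplex without any explicit projection."""
    x = x0.copy()
    for _ in range(steps):
        y = np.log(x) - lr * grad(x)
        y -= y.max()              # subtract max for numerical stability
        x = np.exp(y)
        x /= x.sum()              # normalize back onto the simplex
    return x

# Hypothetical toy objective: 0.5 * ||x - target||^2 for a target
# distribution already on the simplex, so the minimizer is `target`.
target = np.array([0.5, 0.3, 0.2])
grad = lambda x: x - target       # gradient of the quadratic objective
x0 = np.ones(3) / 3               # start from the uniform distribution
x_star = mirror_descent_simplex(grad, x0)
```

The same geometric principle underlies the paper's setting: for exponential-family variational distributions, taking the gradient step in the mirror (natural-parameter) space yields updates that respect the manifold of distributions, which is what makes the method invariant to how the variational distribution is parameterized.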
