

Poster

Nonparametric Bayesian Deep Networks with Local Competition

Konstantinos Panousis · Sotirios Chatzis · Sergios Theodoridis

Pacific Ballroom #82

Keywords: [ Bayesian Nonparametrics ] [ Bayesian Methods ] [ Bayesian Deep Learning ]


Abstract:

The aim of this work is to enable inference of deep networks that retain high accuracy at the least possible model complexity, with the latter deduced from the data during inference. To this end, we revisit deep networks that comprise competing linear units, as opposed to nonlinear units that do not entail any form of (local) competition. In this context, our main technical innovation consists in an inferential setup that leverages solid arguments from Bayesian nonparametrics. We infer both the needed set of connections (or locally competing sets of units) and the floating-point precision required for storing the network parameters. Specifically, we introduce auxiliary discrete latent variables representing which initial network components are actually needed for modeling the data at hand, and perform Bayesian inference over them by imposing appropriate stick-breaking priors. As we experimentally show using benchmark datasets, our approach yields networks with a smaller computational footprint than the state of the art, with no compromises in predictive accuracy.
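To make the two ingredients named in the abstract concrete, here is a minimal NumPy sketch of (i) a stick-breaking prior over component-utility indicators and (ii) a local winner-take-all (LWTA) block of competing linear units. This is an illustrative toy, not the authors' inference procedure: the sizes (n_blocks, U, d_in), the concentration alpha, and the simple hard-argmax winner selection are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_probs(alpha, K, rng):
    """Sample retention probabilities pi_1..pi_K via stick-breaking:
    v_k ~ Beta(1, alpha), pi_k = v_k * prod_{j<k} (1 - v_j)."""
    v = rng.beta(1.0, alpha, size=K)
    pi = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    return pi

def lwta_block(x, W, b):
    """Local winner-take-all: within each block of U competing linear units,
    only the unit with the largest activation passes; the rest are zeroed.
    x: (d_in,), W: (n_blocks, U, d_in), b: (n_blocks, U)."""
    h = np.einsum('kud,d->ku', W, x) + b           # linear activations per block
    winners = h.argmax(axis=1)                      # winning unit in each block
    out = np.zeros_like(h)
    idx = np.arange(h.shape[0])
    out[idx, winners] = h[idx, winners]
    return out

# Toy forward pass: sample which blocks are retained under the stick-breaking
# prior, then apply the LWTA nonlinearity only to the retained blocks.
d_in, n_blocks, U, alpha = 8, 4, 2, 2.0            # illustrative sizes
W = rng.normal(size=(n_blocks, U, d_in))
b = np.zeros((n_blocks, U))
x = rng.normal(size=d_in)

pi = stick_breaking_probs(alpha, n_blocks, rng)     # pi_k from the stick-breaking prior
z = rng.random(n_blocks) < pi                       # utility indicators z_k ~ Bernoulli(pi_k)
y = lwta_block(x, W, b) * z[:, None]                # prune blocks deemed unnecessary
print(pi, z, y.shape)
```

In the paper's setup the indicators and winner assignments are treated as latent variables and inferred variationally rather than sampled once as above; the sketch only shows the generative roles the two constructs play.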
