

Poster

Improper Gaussian process regression and improper kernels

Luca Ambrogioni


Abstract:

The behavior of GP regression depends on the choice of covariance function. Stationary covariance functions are often preferred in machine learning applications. However, (non-periodic) stationary covariance functions are always mean reverting and can therefore exhibit pathological behavior when applied to data that does not relax to a fixed global mean value. In this paper, we show that it is possible to use improper GP priors with infinite variance to define processes that are stationary but not mean reverting. To this aim, we introduce the use of improper kernels that can only be defined in this limit regime. The resulting posterior distributions can be computed analytically, and the computation involves only a simple correction of the usual formulas. The main contribution of the paper is the introduction of a large family of smooth non-reverting improper covariance functions that closely resemble the kernels commonly used in the GP literature (e.g. squared exponential and Matérn class). By analyzing both synthetic and real data, we demonstrate that these improper kernels solve some known pathologies of mean reverting GP regression while retaining most of the favorable properties of ordinary smooth stationary kernels.
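The mean-reversion pathology the abstract refers to is a standard property of GP regression with a stationary kernel: far from the training data, the posterior mean decays back to the prior mean (typically zero), regardless of any trend in the data. The minimal NumPy sketch below illustrates this with an ordinary squared-exponential kernel on trending data; it is a toy demonstration of the pathology, not an implementation of the paper's improper kernels, and the data and hyperparameters are made up for illustration.

```python
import numpy as np

def rbf(x1, x2, ell=1.0, sigma2=1.0):
    """Squared-exponential (RBF) kernel, a standard stationary covariance."""
    d = x1[:, None] - x2[None, :]
    return sigma2 * np.exp(-0.5 * (d / ell) ** 2)

# Toy trending data that does not relax to a fixed global mean.
X = np.array([0.0, 1.0, 2.0])
y = np.array([5.0, 5.5, 6.0])

# Standard GP posterior mean: k(x*, X) @ K^{-1} y (small jitter for stability).
K = rbf(X, X) + 1e-6 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

def posterior_mean(xs):
    return rbf(xs, X) @ alpha

near = posterior_mean(np.array([2.1]))[0]   # close to the data: follows the trend
far = posterior_mean(np.array([50.0]))[0]   # far from the data: reverts to prior mean 0

print(near, far)  # near stays around the data level; far collapses to ~0
```

The prediction far from the data ignores the upward trend entirely and reverts to the prior mean of zero; a non-reverting (improper) kernel, as proposed in the paper, is designed to avoid exactly this behavior.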
