

Poster

Non-separable Non-stationary random fields

Kangrui Wang · Oliver Hamelijnck · Theodoros Damoulas · Mark Steel

Keywords: [ General Machine Learning Techniques ] [ Kernel Methods ] [ Gaussian Processes ] [ Bayesian Nonparametrics ]


Abstract:

We describe a framework for constructing nonstationary nonseparable random fields based on an infinite mixture of convolved stochastic processes. When the mixing process is stationary but the convolution function is nonstationary, we arrive at nonseparable kernels with constant nonseparability that are available in closed form. When the mixing is nonstationary and the convolution function is stationary, we arrive at nonseparable random fields that have varying nonseparability and better preserve local structure. These fields have natural interpretations through the spectral representation of stochastic differential equations (SDEs) and are demonstrated on a range of synthetic benchmarks and spatio-temporal applications in geostatistics and machine learning. We show how a single Gaussian process (GP) with these random fields can computationally and statistically outperform both separable and existing nonstationary nonseparable approaches such as treed GPs and deep GP constructions.
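To make the convolution route to nonstationarity concrete, the sketch below builds a closed-form nonstationary kernel by convolving white noise with Gaussian smoothing kernels whose width varies with location (the classical Gibbs kernel). This is a minimal, generic illustration of the "nonstationary convolution function" idea, not the paper's exact construction; the lengthscale function `ls` is a hypothetical choice for demonstration.

```python
import numpy as np

def gibbs_kernel(x1, x2, lengthscale_fn):
    # Closed-form nonstationary kernel obtained by convolving white noise
    # with Gaussian smoothing kernels of input-dependent width l(x).
    # k(x, x') = sqrt(2 l(x) l(x') / (l(x)^2 + l(x')^2))
    #            * exp(-(x - x')^2 / (l(x)^2 + l(x')^2))
    l1 = lengthscale_fn(x1)[:, None]          # shape (n1, 1)
    l2 = lengthscale_fn(x2)[None, :]          # shape (1, n2)
    sq = l1**2 + l2**2
    prefactor = np.sqrt(2.0 * l1 * l2 / sq)
    d2 = (x1[:, None] - x2[None, :])**2
    return prefactor * np.exp(-d2 / sq)

# Hypothetical input-dependent lengthscale: short near the origin,
# longer away from it, so local structure varies across the input space.
ls = lambda x: 0.3 + 0.7 * np.abs(np.tanh(x))

x = np.linspace(-3.0, 3.0, 80)
K = gibbs_kernel(x, x, ls)

# A valid covariance: symmetric, unit diagonal, positive definite (with jitter).
assert np.allclose(K, K.T)
assert np.allclose(np.diag(K), 1.0)
jitter = 1e-8 * np.eye(len(x))
assert np.linalg.eigvalsh(K + jitter).min() > 0

# Draw one sample path from the resulting nonstationary GP.
rng = np.random.default_rng(0)
sample = rng.multivariate_normal(np.zeros(len(x)), K + jitter)
```

In a spatio-temporal setting, making the lengthscale a joint function of space and time (rather than one factor per dimension) is one way such a kernel stops factorising into a product over inputs, i.e. becomes nonseparable.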
