

Poster

High-Dimensional Kernel Methods under Covariate Shift: Data-Dependent Implicit Regularization

Yihang Chen · Fanghui Liu · Taiji Suzuki · Volkan Cevher


Abstract:

This paper analyzes the role of importance re-weighting in a high-capacity model under covariate shift, namely kernel ridge regression in high dimensions. Via a bias-variance decomposition, we theoretically demonstrate that the re-weighting strategy reduces the variance. For the bias, one part (the re-weighting bias) can be reduced, while the other part (the intrinsic bias) cannot, as it stems from the covariate shift itself. Specifically, both bias and variance are characterized by the spectral decay of a data-dependent regularized kernel: the original kernel matrix combined with an additional re-weighting matrix. We further interpret the re-weighting strategy in three ways, which together allow for a better understanding: as a nonlinear transformation of the data, as a distribution corrector, and as a data-dependent regularization. In addition, our analysis provides asymptotic expansions of kernel functions/vectors under covariate shift, which are of independent interest.
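To make the setup concrete, below is a minimal sketch of importance-weighted kernel ridge regression in dual form. The Gaussian kernel, the hyperparameters `lam` and `gamma`, the helper names (`rbf_kernel`, `weighted_krr_fit`, `weighted_krr_predict`), and the toy weights are illustrative assumptions, not taken from the paper; the product `W @ K` in the linear solve corresponds to the re-weighted kernel matrix whose spectral decay the abstract refers to.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def weighted_krr_fit(X, y, w, lam=0.1, gamma=1.0):
    """Importance-weighted KRR: minimize sum_i w_i (f(x_i) - y_i)^2 + n*lam*||f||_H^2.

    By the representer theorem f(.) = sum_j alpha_j k(x_j, .), and the dual
    coefficients solve (W K + n*lam*I) alpha = W y with W = diag(w).
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    W = np.diag(w)
    return np.linalg.solve(W @ K + n * lam * np.eye(n), W @ y)

def weighted_krr_predict(alpha, X_train, X_test, gamma=1.0):
    """Evaluate the fitted function at test points."""
    return rbf_kernel(X_test, X_train, gamma) @ alpha

# Toy usage: in practice w_i would estimate the density ratio
# p_test(x_i) / p_train(x_i); here the weights are set arbitrarily.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
w = np.exp(0.5 * X[:, 0])            # stand-in for a density ratio
alpha = weighted_krr_fit(X, y, w)
y_hat = weighted_krr_predict(alpha, X, rng.normal(size=(10, 5)))
```

Setting all weights to one recovers standard kernel ridge regression, so the sketch also shows how re-weighting enters only through the diagonal matrix W multiplying the kernel.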
