

Poster

Agnostic Sample Compression Schemes for Regression

Idan Attias · Steve Hanneke · Aryeh Kontorovich · Menachem Sadigurschi


Abstract: We obtain the first positive results for bounded sample compression in the agnostic regression setting with the $\ell_p$ loss, where $p\in [1,\infty]$. We construct a generic \emph{approximate} sample compression scheme for real-valued function classes exhibiting exponential size in the fat-shattering dimension but independent of the sample size. Notably, for linear regression, an \emph{approximate} compression of size linear in the dimension is constructed. Moreover, for the $\ell_1$ and $\ell_\infty$ losses, we can even exhibit an efficient \emph{exact} sample compression scheme of size linear in the dimension. We further show that for every other $\ell_p$ loss, $p\in (1,\infty)$, there does not exist an exact agnostic compression scheme of bounded size. This refines and generalizes a negative result of \citet*{david2016supervised} for the $\ell_2$ loss. We close by posing general open questions: for agnostic regression with the $\ell_1$ loss, does every function class admit an exact compression scheme of size equal to its pseudo-dimension? For the $\ell_2$ loss, does every function class admit an approximate compression scheme of size polynomial in the fat-shattering dimension? These questions generalize Warmuth's classic sample compression conjecture for realizable-case classification \citep{warmuth2003compressing}.
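For context, the following is a minimal sketch of how these notions are commonly formalized; the symbols $\kappa$, $\rho$, $k$, $\varepsilon$, and $\mathcal F$ are conventional notation assumed for illustration and need not match the paper's exact definitions. For a sample $S = ((x_1,y_1),\dots,(x_n,y_n))$ and a hypothesis $f$, the empirical $\ell_p$ risk is typically taken to be
\[
  L_p(f;S) \;=\; \Big(\tfrac{1}{n}\sum_{i=1}^{n} |f(x_i)-y_i|^p\Big)^{1/p},
  \qquad
  L_\infty(f;S) \;=\; \max_{1\le i\le n} |f(x_i)-y_i|.
\]
A sample compression scheme of size $k$ consists of a selection map $\kappa$, which retains at most $k$ labeled examples of $S$ (possibly with bounded side information), and a reconstruction map $\rho$, so that the output hypothesis depends only on the retained subsample:
\[
  \hat f \;=\; \rho\big(\kappa(S)\big), \qquad |\kappa(S)| \le k .
\]
In this sketch, the scheme is \emph{exact} in the agnostic setting if $L_p(\hat f;S) \le \inf_{f\in\mathcal F} L_p(f;S)$ for every sample $S$, and \emph{approximate} with parameter $\varepsilon$ if
\[
  L_p(\hat f;S) \;\le\; \inf_{f\in\mathcal F} L_p(f;S) + \varepsilon .
\]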
