

Poster

Adaptive proximal gradient methods are universal without approximation

Konstantinos Oikonomidis · Emanuel Laude · Puya Latafat · Andreas Themelis · Panagiotis Patrinos


Abstract: We show that adaptive proximal gradient methods for convex problems are not restricted to traditional Lipschitzian assumptions. Our analysis of a class of linesearch-free methods reveals that they still work under local Hölder gradient continuity, covering in particular continuously differentiable semi-algebraic functions. To mitigate the lack of local Lipschitz continuity, popular approaches revolve around $\varepsilon$-oracles and/or linesearch procedures. In contrast, we exploit plain Hölder inequalities not entailing any approximation, all while retaining the linesearch-free nature of adaptive schemes. Furthermore, we prove full sequence convergence without prior knowledge of local Hölder constants or of the order of Hölder continuity. In numerical experiments, we present comparisons to baseline methods on diverse machine learning tasks covering both the locally and the globally Hölder setting.
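To make the abstract's setting concrete, the sketch below shows a generic linesearch-free adaptive proximal gradient iteration in Python. The stepsize is grown or shrunk from observed gradient differences (a Malitsky–Mishchenko-style local curvature estimate), so no Lipschitz or Hölder constant is supplied and no backtracking linesearch is run. This is an illustrative simplification under assumed oracles `grad_f` and `prox_g`, not the exact update rule analyzed in the paper.

```python
import numpy as np


def adaptive_proximal_gradient(grad_f, prox_g, x0, gamma0=1e-6, iters=500):
    """Linesearch-free adaptive proximal gradient sketch (illustrative only).

    grad_f(x)        : gradient of the smooth term f
    prox_g(v, gamma) : proximal operator of the nonsmooth term g with stepsize gamma
    """
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad_f(x_prev)
    gamma_prev = gamma0
    # First step with the (small) initial stepsize.
    x = prox_g(x_prev - gamma_prev * g_prev, gamma_prev)
    theta = 0.0  # ratio of consecutive stepsizes, controls how fast gamma may grow

    for _ in range(iters):
        g = grad_f(x)
        dx = np.linalg.norm(x - x_prev)
        dg = np.linalg.norm(g - g_prev)
        if dx > 0 and dg > 0:
            # Local inverse-curvature estimate replaces any global constant;
            # growth is capped by sqrt(1 + theta) as in adaptive schemes.
            gamma = min(np.sqrt(1.0 + theta) * gamma_prev, dx / (2.0 * dg))
        else:
            gamma = np.sqrt(1.0 + theta) * gamma_prev
        theta = gamma / gamma_prev
        x_prev, g_prev, gamma_prev = x, g, gamma
        x = prox_g(x - gamma * g, gamma)
    return x


if __name__ == "__main__":
    # Hypothetical usage on a small lasso problem: f(x) = 0.5*||Ax - b||^2, g = lam*||x||_1.
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((50, 20)), rng.standard_normal(50), 0.1
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda v, gamma: np.sign(v) * np.maximum(np.abs(v) - lam * gamma, 0.0)
    x_hat = adaptive_proximal_gradient(grad_f, prox_g, np.zeros(20))
    print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + lam * np.abs(x_hat).sum())
```

The stepsize rule uses only quantities computed along the iterates, which is what lets such methods adapt to local Hölder (rather than global Lipschitz) behavior of the gradient without $\varepsilon$-oracles or linesearch.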
