

Poster

Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems

Filip Hanzely · Dmitry Kovalev · Peter Richtarik

Keywords: [ Optimization - Convex ] [ Supervised Learning ] [ Large Scale Learning and Big Data ] [ Convex Optimization ]


Abstract:

We propose an accelerated version of stochastic variance reduced coordinate descent -- ASVRCD. Like other variance reduced coordinate descent methods such as SEGA or SVRCD, our method can deal with problems that include a non-separable and non-smooth regularizer, while accessing only a random block of partial derivatives in each iteration. However, ASVRCD incorporates Nesterov's momentum, which offers favorable iteration complexity guarantees over both SEGA and SVRCD. As a by-product of our theory, we show that a variant of Katyusha (Allen-Zhu, 2017) is a special case of ASVRCD, recovering the optimal oracle complexity for the finite sum objective.
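To make the setting concrete, below is a minimal, non-accelerated sketch of the building block the abstract refers to: a SEGA-style variance reduced coordinate descent that queries only one partial derivative per iteration and handles a non-smooth regularizer through a proximal step. This is an illustrative toy (quadratic loss, L1 regularizer, step size and problem data chosen for the example), not the paper's ASVRCD algorithm, which additionally layers Nesterov's momentum on top of this estimator.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (the non-smooth regularizer).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def sega(partial_grad, x0, step, lam, n_iters, rng):
    """SEGA-style variance reduced coordinate descent (no acceleration):
    each iteration accesses a single random partial derivative."""
    x = x0.copy()
    n = x.size
    h = np.zeros(n)                       # control variate: running gradient estimate
    for _ in range(n_iters):
        i = rng.integers(n)               # sample one coordinate
        gi = partial_grad(x, i)           # i-th partial derivative of the smooth part
        g = h.copy()
        g[i] += n * (gi - h[i])           # unbiased estimator: E[g] = grad f(x)
        h[i] = gi                         # update the control variate
        x = soft_threshold(x - step * g, step * lam)  # proximal gradient step
    return x

if __name__ == "__main__":
    # Toy problem: min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    lam = 0.1
    L = np.linalg.norm(A, 2) ** 2                      # smoothness constant of the loss
    partial_grad = lambda x, i: A[:, i] @ (A @ x - b)  # derivative w.r.t. x_i
    x = sega(partial_grad, np.zeros(20), step=1.0 / (20 * L), lam=lam,
             n_iters=20000, rng=rng)
```

The step size here is deliberately conservative (roughly 1/(n L)) because the one-coordinate estimator has higher variance than the full gradient; the accelerated method described in the abstract is designed precisely to improve the iteration complexity of schemes like this one.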
