

Poster

A Doubly-Recursive Stochastic Compositional Gradient Descent Method for Federated Multi-Level Compositional Optimization

Hongchang Gao


Abstract:

Federated compositional optimization has been actively studied in recent years. However, existing methods mainly focus on the two-level compositional optimization problem and cannot be directly applied to its multi-level counterpart. Moreover, the convergence rate of the existing federated two-level compositional optimization algorithm fails to achieve linear speedup with respect to the number of workers under the heterogeneous setting. After identifying the reason for this failure, we develop a novel federated stochastic multi-level compositional optimization algorithm built on a novel Jacobian-vector product estimator, which mitigates the heterogeneity and communication-efficiency issues simultaneously. We then prove theoretically that our algorithm achieves a level-independent, linear-speedup convergence rate for nonconvex problems. To the best of our knowledge, this is the first federated learning algorithm to achieve this favorable convergence rate for multi-level compositional problems. Experimental results further confirm the efficacy of our algorithm.
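As a rough illustration of the problem structure only (not the paper's federated algorithm), the sketch below minimizes a toy three-level compositional objective F(x) = f3(f2(f1(x))) on a single worker: moving-average recursions track the noisy inner function values, and the stochastic gradient is assembled by chaining Jacobian-vector (here, vector-Jacobian) products through the levels. The functions f1, f2, f3, the noise model, and the step sizes eta and beta are all illustrative assumptions; the paper's doubly-recursive estimator and its heterogeneity-robust federated variant are more involved.

```python
import jax
import jax.numpy as jnp

# Toy three-level objective F(x) = f3(f2(f1(x))); each level is made
# noisy via a PRNG key to mimic stochastic (expectation-valued) levels.
# All functions and constants here are illustrative assumptions.
def f1(x, key):
    return jnp.tanh(x) + 0.01 * jax.random.normal(key, x.shape)

def f2(y, key):
    return y ** 2 + 0.01 * jax.random.normal(key, y.shape)

def f3(z, key):
    return jnp.sum(z) + 0.01 * jax.random.normal(key)

def step(x, u1, u2, g, key, eta=0.05, beta=0.9):
    """One update: moving averages track the inner values y1 = f1(x) and
    y2 = f2(y1), and the gradient estimate is built by chaining
    vector-Jacobian products from the outermost level inward."""
    k1, k2, k3 = jax.random.split(key, 3)
    u1 = (1.0 - beta) * u1 + beta * f1(x, k1)   # estimate of f1(x)
    u2 = (1.0 - beta) * u2 + beta * f2(u1, k2)  # estimate of f2(f1(x))
    v = jax.grad(lambda z: f3(z, k3))(u2)       # gradient of outer level
    _, vjp2 = jax.vjp(lambda y: f2(y, k2), u1)
    (v,) = vjp2(v)                              # J2(u1)^T v
    _, vjp1 = jax.vjp(lambda t: f1(t, k1), x)
    (v,) = vjp1(v)                              # J1(x)^T v
    g = (1.0 - beta) * g + beta * v             # recursive gradient estimate
    return x - eta * g, u1, u2, g

key = jax.random.PRNGKey(0)
key, ka, kb = jax.random.split(key, 3)
x = jnp.ones(4)
u1, u2, g = f1(x, ka), f2(f1(x, ka), kb), jnp.zeros(4)
for _ in range(200):
    key, sub = jax.random.split(key)
    x, u1, u2, g = step(x, u1, u2, g, sub)
```

In the federated setting, each worker would maintain its own recursive estimates and periodically communicate, which is where the paper's Jacobian-vector product estimator is designed to control the error introduced by data heterogeneity across workers.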
