

Poster

Accelerating Heterogeneous Federated Learning with Closed-form Classifiers

Eros Fanì · Raffaello Camoriano · Barbara Caputo · Marco Ciccone


Abstract:

Federated Learning (FL) methods often struggle in highly statistically heterogeneous settings, resulting in client drift due to biased local solutions. This phenomenon is particularly pronounced in the final classification layer, negatively impacting convergence speed and accuracy. To tackle this issue, we introduce Federated Recursive Ridge Regression (Fed3R). Our method fits a ridge regression classifier in closed form on features from a pre-trained backbone. Fed3R is immune to statistical heterogeneity and invariant to the sampling order of the clients, making it particularly effective in extreme cross-device scenarios. Furthermore, it is fast and cheap in terms of communication and computation costs, requiring up to two orders of magnitude fewer resources than competing methods. Finally, we propose to use the Fed3R classifier parameters as an initialization for the softmax classifier and subsequently fine-tune the model with another FL algorithm (Fed3R with Fine-Tuning, Fed3R-FT). Our findings also indicate that keeping the classifier fixed helps stabilize training and learn more discriminative features in extreme cross-device settings.
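The closed-form construction behind a federated ridge regression classifier can be illustrated with a minimal sketch, which is not the authors' implementation: here the feature dimension `d`, the regularization strength `lam`, and the random arrays standing in for frozen pre-trained features are all placeholder assumptions. Each client sends only its local sufficient statistics, and the server sums them and solves a single linear system, which is why the result does not depend on the order in which clients are sampled.

```python
import numpy as np

def client_statistics(features, labels, num_classes):
    """Local sufficient statistics for ridge regression (sketch, not Fed3R's code).

    features: (n, d) array of frozen pre-trained backbone features.
    labels:   (n,) integer class labels.
    Returns A_k = Phi_k^T Phi_k and B_k = Phi_k^T Y_k, with Y_k one-hot labels.
    """
    one_hot = np.eye(num_classes)[labels]   # (n, C)
    A_k = features.T @ features             # (d, d)
    B_k = features.T @ one_hot              # (d, C)
    return A_k, B_k

def server_aggregate(stats, d, lam=1e-3):
    """Sum client statistics and solve the ridge regression in closed form.

    The aggregation is a plain sum, so the solution is invariant to the
    client sampling order and unaffected by label skew across clients.
    """
    A = sum(A_k for A_k, _ in stats) + lam * np.eye(d)
    B = sum(B_k for _, B_k in stats)
    return np.linalg.solve(A, B)            # (d, C) classifier weights

# Toy usage: random "features" play the role of a frozen backbone's output.
rng = np.random.default_rng(0)
d, C = 64, 10
clients = [(rng.normal(size=(20, d)), rng.integers(0, C, size=20)) for _ in range(5)]
stats = [client_statistics(f, y, C) for f, y in clients]
W = server_aggregate(stats, d)
print(W.shape)  # (64, 10)
```

In this sketch, each client communicates only O(d^2 + dC) numbers regardless of its dataset size, which is consistent with the low communication cost the abstract describes; the resulting weights could then initialize a softmax classifier for subsequent fine-tuning.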
