

Poster

FedBAT: Communication-efficient Federated Learning via Learnable Binarization

Shiwei Li · Wenchao Xu · Haozhao Wang · Xing Tang · Yining Qi · Shijie Xu · Weihong Luo · Yuhua Li · Xiuqiang He · Ruixuan Li


Abstract:

Federated learning is a promising distributed machine learning paradigm that can effectively protect data privacy. However, it may incur significant communication overhead, potentially impairing training efficiency. To address this challenge, numerous studies propose binarizing model updates, reducing communication volume by a factor of up to 32. Nonetheless, traditional methods usually binarize model updates in a post-training manner, resulting in significant approximation errors and a consequent degradation in model accuracy. To this end, we propose Federated Binarization-Aware Training (FedBAT), a novel framework that learns binary model updates directly during local training, thus inherently reducing approximation errors. FedBAT incorporates an innovative binarization operator, along with meticulously designed derivatives to facilitate efficient learning. In addition, we establish theoretical guarantees for the convergence of FedBAT. Extensive experiments on four popular datasets show that FedBAT significantly accelerates convergence and exceeds the accuracy of baseline binarization methods by up to 9%, in some cases even surpassing FedAvg.
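To make the core idea concrete, below is a minimal sketch of binarization-aware training of a model update. It assumes a straight-through estimator (STE) for the backward pass and a mean-absolute-value scale; these choices, along with the names BinarizeSTE and delta, are illustrative assumptions, not FedBAT's actual learnable operator or derivatives, which are defined in the paper.

import torch

class BinarizeSTE(torch.autograd.Function):
    @staticmethod
    def forward(ctx, delta):
        # 1 bit per entry plus one float scale: sign(delta) * mean(|delta|).
        alpha = delta.abs().mean()
        return alpha * torch.sign(delta)

    @staticmethod
    def backward(ctx, grad_out):
        # STE surrogate (assumption): treat binarization as the identity
        # in the backward pass so gradients reach the real-valued update.
        return grad_out

# Toy usage: the client learns a real-valued update `delta` so that the
# loss of (global weights + binarize(delta)) is small, then transmits
# only the signs of delta plus a single scalar scale.
torch.manual_seed(0)
w_global = torch.randn(10)                   # frozen global weights
delta = torch.zeros(10, requires_grad=True)  # learnable local update
target = torch.randn(10)                     # stand-in local objective
opt = torch.optim.SGD([delta], lr=0.1)

for _ in range(100):
    opt.zero_grad()
    w_local = w_global + BinarizeSTE.apply(delta)
    loss = ((w_local - target) ** 2).mean()  # loss is computed through
    loss.backward()                          # the *binarized* update
    opt.step()

print(loss.item())

Because the loss is always evaluated through the binarized update, the approximation error is part of the training objective itself, in contrast to post-training binarization, which quantizes an update that was optimized without any knowledge of the 1-bit constraint.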
