

Poster

Federated Neuro-Symbolic Learning

Pengwei Xing · Songtao Lu · Han Yu


Abstract:

Neuro-symbolic learning (NSL) uses neural networks to model complex symbolic rule patterns as latent variable distributions, which reduces the rule search space and generates unseen rules to improve downstream task performance. Centralized NSL requires directly acquiring data from downstream tasks, which is not feasible under federated learning (FL). To address this limitation, we shift the focus from this one-to-one interactive neuro-symbolic paradigm to a one-to-many, privacy-preserving Federated Neuro-Symbolic Learning framework (FedNSL), which uses latent variables as the FL communication medium. Built on our novel reformulation of NSL theory, FedNSL identifies and addresses rule distribution heterogeneity through a simple and effective Kullback-Leibler (KL) divergence constraint on rule distributions that is applicable under the FL setting. It further adapts variational expectation maximization (V-EM), with theoretical justification, to reduce the rule search space across domains. To the best of our knowledge, this is the first time distribution-coupled bilevel optimization has been applied to federated learning. Extensive experiments on both synthetic and real-world data demonstrate significant advantages of FedNSL over five state-of-the-art methods: it outperforms the best baseline by 17% and 29% in terms of unbalanced average training accuracy and unseen average testing accuracy, respectively.
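To make the KL-based heterogeneity constraint concrete, the following is a minimal sketch (not the authors' implementation) of the core idea: each client holds a categorical distribution over a shared rule vocabulary, the server aggregates these into a global rule distribution, and each client's local objective is penalized by its KL divergence from that aggregate. All distributions and names here are hypothetical illustrations.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions, smoothed for numerical safety."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical per-client categorical distributions over a shared rule vocabulary.
client_rule_dists = [
    np.array([0.70, 0.20, 0.10]),
    np.array([0.10, 0.60, 0.30]),
    np.array([0.30, 0.30, 0.40]),
]

# Server-side aggregate: the mean rule distribution across clients,
# serving as the communicated latent-variable summary.
global_dist = np.mean(client_rule_dists, axis=0)

# KL penalty each client would add to its local loss to curb
# rule-distribution heterogeneity across the federation.
penalties = [kl_divergence(p, global_dist) for p in client_rule_dists]
```

Communicating only the latent rule distributions (rather than raw data) is what keeps this constraint compatible with the privacy-preserving FL setting described above.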
