

Poster

Disentangled Graph Self-supervised Learning under Distribution Shifts

Haoyang Li · Xin Wang · Zeyang Zhang · Haibo Chen · Ziwei Zhang · Wenwu Zhu


Abstract:

Graph out-of-distribution (OOD) generalization, which aims to generalize graph neural networks (GNNs) under distribution shifts between training and testing environments, has gained increasing significance recently. However, the existing literature relies heavily on sufficient task-dependent graph labels, which are often scarce or even unavailable, limiting its applicability in real-world scenarios. In this paper, we study the self-supervised graph OOD generalization problem, i.e., learning GNNs that achieve relatively stable performance under distribution shifts without graph labels. This problem remains largely unexplored in the literature, with the critical challenge that invariant and variant information are highly entangled in graphs. To address it, we propose an OOD-generalized disentangled graph contrastive learning model (OOD-GCL), which learns disentangled graph-level representations with self-supervision and can handle distribution shifts between training and testing graph data. Specifically, we first design a disentangled graph encoder that maps each input graph into a factorized graph representation. We then propose a tailored disentangled invariant self-supervised learning module that maximizes the predictive ability of the representations while ensuring that the representations from channels other than a specific one are invariant to the corresponding latent factor, thereby excluding that factor's information from those channels and achieving disentanglement. We provide comprehensive theoretical analyses showing that our model can learn disentangled graph representations and achieve OOD generalization. Extensive experiments on real-world datasets demonstrate the superiority of our model over state-of-the-art baselines under distribution shifts for graph classification tasks.
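The abstract describes a factorized (channel-wise) graph encoder paired with a channel-wise self-supervised objective. Below is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' implementation: the encoder is simplified to K parallel MLP heads over a mean-pooled graph readout, and the objective is a standard NT-Xent-style contrastive loss applied independently per channel. All names and hyperparameters (FactorizedGraphEncoder, num_channels, tau) are illustrative assumptions.

    # Sketch only: K parallel heads stand in for the paper's disentangled GNN encoder,
    # and the per-channel contrastive loss stands in for its tailored SSL module.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FactorizedGraphEncoder(nn.Module):
        """Maps a graph's node features into K channel (factor) embeddings."""
        def __init__(self, in_dim: int, hid_dim: int, num_channels: int):
            super().__init__()
            self.heads = nn.ModuleList(
                nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                              nn.Linear(hid_dim, hid_dim))
                for _ in range(num_channels)
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: [num_nodes, in_dim]; mean pooling stands in for a GNN readout.
            g = x.mean(dim=0, keepdim=True)                              # [1, in_dim]
            return torch.stack([h(g).squeeze(0) for h in self.heads])    # [K, hid_dim]

    def channelwise_nt_xent(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
        """NT-Xent-style contrastive loss computed independently per channel.
        z1, z2: [batch, K, dim] embeddings of two augmented views of the same graphs."""
        B, K, _ = z1.shape
        loss = 0.0
        for k in range(K):
            a = F.normalize(z1[:, k], dim=-1)
            b = F.normalize(z2[:, k], dim=-1)
            logits = a @ b.t() / tau                          # [B, B] similarity matrix
            labels = torch.arange(B, device=z1.device)        # matching views are positives
            loss = loss + F.cross_entropy(logits, labels)
        return loss / K

In the full method as summarized above, the readout would come from a graph neural network and an additional invariance term would discourage channels other than a given one from carrying that factor's information; the sketch shows only the factorized representation and the per-channel contrastive term.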
