

Poster

Distribution Alignment Optimization through Neural Collapse for Long-tailed Classification

Jintong Gao · He Zhao · Dandan Guo · Hongyuan Zha


Abstract:

A well-trained deep neural network on a balanced dataset usually exhibits the Neural Collapse (NC) phenomenon. However, a model trained on long-tailed datasets can hardly achieve NC, which is partially responsible for its deteriorated performance on test data. Recent works enforce models trained on long-tailed datasets to satisfy NC so that they may achieve better performance. This work aims to induce the NC phenomenon in imbalanced learning from the perspective of distribution matching. By enforcing the distribution of last-layer representations to align with the ideal distribution of the simplex Equiangular Tight Frame (ETF) structure, we develop a Distribution Alignment Optimization (DisA) loss. Since our plug-and-play method can be combined with most existing long-tailed methods, we further instantiate it for both the fixed-classifier and the learnable-classifier cases. Extensive experiments show that DisA is effective in many cases, providing a promising solution to the class-imbalance problem.
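To make the idea concrete, the sketch below builds a simplex ETF target (K unit vectors with pairwise cosine similarity -1/(K-1)) and penalizes last-layer features that drift from their class vertex. This is a minimal, hypothetical illustration, not the paper's implementation: the function names `simplex_etf` and `disa_like_loss` are ours, and the per-sample cosine penalty is an assumed stand-in for DisA's distribution-level alignment, which may use a different divergence.

```python
import torch
import torch.nn.functional as F

def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Build a simplex Equiangular Tight Frame: num_classes unit vectors
    in feat_dim dimensions with pairwise cosine -1/(num_classes - 1)."""
    assert feat_dim >= num_classes - 1
    # Random partial orthonormal basis U with U^T U = I_K.
    u, _ = torch.linalg.qr(torch.randn(feat_dim, num_classes))
    k = num_classes
    m = u @ (torch.eye(k) - torch.ones(k, k) / k)
    m = m * (k / (k - 1)) ** 0.5
    return F.normalize(m, dim=0)  # columns are the per-class ETF targets

def disa_like_loss(features: torch.Tensor, labels: torch.Tensor,
                   etf: torch.Tensor) -> torch.Tensor:
    """Hypothetical alignment term: pull each normalized feature toward
    its class's ETF vertex via 1 - cosine similarity. The actual DisA
    loss aligns the full feature distribution with the ETF-induced
    distribution and may differ from this per-sample penalty."""
    z = F.normalize(features, dim=1)   # (B, d) unit-norm features
    targets = etf[:, labels].t()       # (B, d) class vertices
    return (1.0 - (z * targets).sum(dim=1)).mean()
```

In use, such a term would be added to a base long-tailed objective, e.g. `loss = ce + lam * disa_like_loss(feats, labels, etf)`, with the ETF either fixed as the classifier weights or kept only as an alignment target, mirroring the paper's two instantiations.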