

Poster

Enhancing Class-Imbalanced Learning with Pre-trained Guidance through Class-Conditional Knowledge Distillation

Lan Li · Xin-Chun Li · Han-Jia Ye · De-Chuan Zhan


Abstract: In class-imbalanced learning, the scarcity of information on minority classes makes it challenging to obtain generalizable features for these classes. Large-scale pre-trained models with strong generalization capabilities can be employed as teacher models to fill this information gap. Knowledge distillation transfers the teacher model's knowledge by learning the label distribution $p(\boldsymbol{y}|\boldsymbol{x})$ predicted by the teacher. On imbalanced data, however, this approach falls short in capturing the teacher model's knowledge of the class-conditional probability distribution $p(\boldsymbol{x}|\boldsymbol{y})$, which is crucial for enhancing generalization. We therefore propose Class-Conditional Knowledge Distillation (CCKD), which directly learns the teacher model's class-conditional probability distribution by minimizing the KL divergence between the $p(\boldsymbol{x}|\boldsymbol{y})$ of the student and teacher models. We further present Augmented CCKD (ACCKD), which adds distillation on constructed class-balanced data (formed by mixing training samples) and feature imitation on the entire dataset to further facilitate the learning of $p(\boldsymbol{x}|\boldsymbol{y})$. Experiments on various imbalanced datasets show an average accuracy improvement of 7.5\% with our method.
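The abstract does not detail how $p(\boldsymbol{x}|\boldsymbol{y})$ is estimated or how the KL term is computed. The sketch below is only an illustration of the general idea under an assumed mini-batch estimate: the class-conditional distribution over samples is approximated by normalizing each model's class posteriors $p(\boldsymbol{y}|\boldsymbol{x})$ over the batch dimension for every class. The function name, temperature, and this batch-wise estimate are assumptions, not the authors' exact formulation.

# Hypothetical sketch of a class-conditional distillation loss (PyTorch).
# Assumes p(x|y) is approximated per mini-batch by normalizing class
# posteriors over the batch dimension; names and temperature are illustrative.
import torch
import torch.nn.functional as F

def class_conditional_kd_loss(student_logits, teacher_logits, temperature=2.0, eps=1e-8):
    # student_logits, teacher_logits: tensors of shape (batch_size, num_classes)
    # Softened class posteriors p(y|x) from each model.
    p_s = F.softmax(student_logits / temperature, dim=1)
    p_t = F.softmax(teacher_logits / temperature, dim=1)

    # Batch-wise estimate of p(x|y): for each class, normalize over samples
    # so that each column sums to 1.
    q_s = p_s / (p_s.sum(dim=0, keepdim=True) + eps)
    q_t = p_t / (p_t.sum(dim=0, keepdim=True) + eps)

    # KL( teacher p(x|y) || student p(x|y) ) per class, averaged over classes.
    kl_per_class = (q_t * (torch.log(q_t + eps) - torch.log(q_s + eps))).sum(dim=0)
    return kl_per_class.mean()

In practice this term would be combined with a standard classification loss on the imbalanced training set; ACCKD would additionally apply such distillation on mixed, class-balanced samples and add a feature-imitation term, as described in the abstract.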
