

Poster

SCoRe: Submodular Combinatorial Representation Learning

Anay Majee · Suraj Kothawade · Krishnateja Killamsetty · Rishabh Iyer


Abstract:

In this paper, we introduce SCoRe (Submodular Combinatorial Representation Learning), a novel framework for representation learning in machine vision that addresses inter-class bias and intra-class variance. SCoRe marks a paradigm shift in representation learning, introducing a new family of objective functions based on set-based submodular information measures and leveraging their inherent combinatorial properties to counter imbalance in real-world datasets. Crucially, SCoRe not only introduces novel combinatorial objectives but also generalizes existing ones in metric and contrastive learning: objectives such as the N-pairs loss and the Orthogonal Projection loss are inherently instances of SCoRe, while others can be effectively reformulated to instantiate it, underlining its versatility across a broad spectrum of learning scenarios. Empirically, SCoRe achieves significant performance gains, with up to 7.6% improvement in classification on CIFAR-10 and MedMNIST, 2.1% on ImageNet-LT, and 19.4% in object detection on IDD and LVIS (v1.0), demonstrating its generalizability and effectiveness over existing approaches.
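To make the idea of a set-based submodular objective concrete, below is a minimal illustrative sketch in PyTorch, not the authors' released implementation: it evaluates a classic facility-location submodular function on each per-class set of a mini-batch and sums the scores. The function names (`facility_location`, `score_style_loss`) and the exact aggregation are assumptions chosen for illustration; the actual SCoRe objectives and their signs/normalizations differ in detail.

```python
# Illustrative sketch only: a facility-location submodular function evaluated
# on per-class sets of a batch, as one hedged stand-in for SCoRe-style losses.
import torch
import torch.nn.functional as F


def facility_location(sim: torch.Tensor, subset: torch.Tensor) -> torch.Tensor:
    """f(A) = sum_i max_{j in A} sim[i, j] -- a classic submodular set function."""
    return sim[:, subset].max(dim=1).values.sum()


def score_style_loss(embeddings: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Sum of facility-location scores of each class set over the whole batch.

    Minimizing this total coverage shrinks each sample's similarity to its
    closest representative from other classes (one plausible reading of how a
    combinatorial objective counters inter-class bias); this is an assumed
    simplification, not the paper's exact formulation.
    """
    z = F.normalize(embeddings, dim=1)           # unit-normalize features
    sim = z @ z.T                                # pairwise cosine similarities
    loss = embeddings.new_zeros(())
    for c in labels.unique():
        idx = (labels == c).nonzero(as_tuple=True)[0]   # indices of class c
        loss = loss + facility_location(sim, idx)
    return loss / embeddings.shape[0]


# Toy usage with random features standing in for any backbone's embeddings.
emb = torch.randn(32, 128, requires_grad=True)
lbl = torch.randint(0, 4, (32,))
print(score_style_loss(emb, lbl))
```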
