
Poster

Sparse Cocktail: Every Sparse Pattern Every Sparse Ratio All At Once

Zhangheng Li · Shiwei Liu · Tianlong Chen · Ajay Jaiswal · Zhenyu Zhang · Dilin Wang · Raghuraman Krishnamoorthi · Shiyu Chang · Zhangyang “Atlas” Wang


Abstract:

Sparse Neural Networks (SNNs) have received considerable attention for their ability to mitigate the explosion in computational costs and memory footprints of modern deep neural networks. Despite their popularity, most state-of-the-art training approaches seek a single high-quality sparse subnetwork with a preset sparsity pattern and ratio, making them ill-suited to varying platform and resource constraints. Recently proposed approaches attempt to jointly train multiple subnetworks (which we term "sparse co-training") with a fixed sparsity pattern, allowing the sparsity ratio to be switched according to resource requirements. In this work, we take one step further and expand the scope of sparse co-training to cover diverse sparsity patterns and multiple sparsity ratios at once. We introduce Sparse Cocktail, the first sparse co-training framework that co-trains a suite of sparsity patterns simultaneously, each loaded with multiple sparsity ratios, enabling a harmonious switch across sparsity patterns and ratios at inference time depending on hardware availability. More specifically, Sparse Cocktail alternately trains subnetworks generated from different sparsity patterns, with a gradual increase in sparsity ratios across patterns, and relies on a unified mask generation process and Dense Pivot Co-training to ensure that subnetworks of different patterns orchestrate their shared parameters without canceling out each other's performance. Experimental results on image classification, object detection, and instance segmentation illustrate the favorable effectiveness and flexibility of Sparse Cocktail, pointing to a promising direction for sparse co-training. Code will be released.
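Since the authors' code is not yet released, the following is a minimal, hypothetical sketch of the alternating schedule the abstract describes: a dense "pivot" step interleaved with masked steps over several (pattern, ratio) subnetworks. The two mask rules (magnitude-based unstructured, L1-norm channel pruning), the fixed ratio list, and all function names are illustrative assumptions, not the paper's actual unified mask generation or training recipe.

```python
# Hypothetical sketch of alternating sparse co-training (not the authors'
# released code). Mask rules, ratios, and the dense pivot step are
# illustrative assumptions based only on the abstract's description.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

def magnitude_mask(w: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Unstructured pattern: zero the smallest-magnitude fraction of weights."""
    k = int(w.numel() * sparsity)
    if k == 0:
        return torch.ones_like(w)
    thresh = w.abs().flatten().kthvalue(k).values
    return (w.abs() > thresh).float()

def channel_mask(w: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Structured pattern: zero whole output rows with the smallest L1 norm."""
    k = int(w.size(0) * sparsity)
    mask = torch.ones_like(w)
    if k > 0:
        drop = w.abs().sum(dim=1).topk(k, largest=False).indices
        mask[drop] = 0.0
    return mask

PATTERNS = {"unstructured": magnitude_mask, "channel": channel_mask}
RATIOS = [0.5, 0.75]  # the paper increases ratios gradually; fixed here

def train_step(x, y, masks=None):
    """One SGD step; if masks are given, zero the masked weights first.
    Masked weights can still receive gradients and revive later -- a
    deliberate simplification of subnetwork extraction."""
    opt.zero_grad()
    if masks is not None:
        with torch.no_grad():
            for lin, m in masks:
                lin.weight.mul_(m)
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    return loss.item()

for step in range(100):
    x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))
    # Dense pivot step: keeps the shared parameters coherent across patterns.
    train_step(x, y)
    # Alternate over every (pattern, ratio) subnetwork on the same batch.
    for name, make_mask in PATTERNS.items():
        for r in RATIOS:
            masks = [(m, make_mask(m.weight.data, r))
                     for m in model.modules() if isinstance(m, nn.Linear)]
            train_step(x, y, masks)
```

At inference, one would regenerate the mask for whichever (pattern, ratio) pair the target hardware supports and apply it to the single shared set of weights; the sketch above only illustrates why a dense pivot step is needed to keep those shared weights from being pulled apart by the different subnetworks.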
