

Poster

Conformal Prediction for Deep Classifier via Label Ranking

Jianguo Huang · HuaJun Xi · Linjun Zhang · Huaxiu Yao · Yue Qiu · Hongxin Wei


Abstract: Conformal prediction is a statistical framework that generates prediction sets containing ground-truth labels with a desired coverage guarantee. The predicted probabilities produced by machine learning models are generally miscalibrated, leading to large prediction sets in conformal prediction. To address this issue, we propose a novel algorithm named $\textit{Sorted Adaptive Prediction Sets}$ (SAPS), which discards all the probability values except for the maximum softmax probability. The key idea behind SAPS is to minimize the dependence of the non-conformity score on the probability values while retaining the uncertainty information. In this manner, SAPS can produce compact prediction sets and communicate instance-wise uncertainty. Theoretically, we show that the expected set size of SAPS is asymptotically equivalent to that of a variant of APS that discards the probability values. Extensive experiments validate that SAPS not only reduces the size of prediction sets but also broadly enhances their conditional coverage rate.
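The following is a minimal sketch of how a SAPS-style non-conformity score could be plugged into split conformal prediction, based only on the abstract's description: the score keeps just the maximum softmax probability and the candidate label's rank. The exact score formula, the rank-penalty weight `lam`, and the uniform tie-breaking variable `u` are assumptions for illustration, not the authors' definition.

```python
import numpy as np

def saps_style_score(probs, label, lam=0.1, rng=None):
    """Non-conformity score for one sample (hypothetical SAPS-style form).

    probs : (num_classes,) softmax probabilities
    label : candidate class index
    lam   : assumed per-rank penalty weight (hypothetical hyperparameter)
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform()                       # randomization, as in randomized conformal scores
    order = np.argsort(-probs)              # classes sorted by decreasing probability
    rank = int(np.where(order == label)[0][0]) + 1  # 1-based rank of `label`
    p_max = probs[order[0]]                 # only the top probability is retained
    if rank == 1:
        return u * p_max                    # top-ranked label
    return p_max + (rank - 2 + u) * lam     # lower-ranked labels pay a rank-based penalty

def conformal_prediction_set(cal_probs, cal_labels, test_probs, alpha=0.1, lam=0.1):
    """Split conformal prediction: calibrate a score threshold on held-out data,
    then include every test label whose score does not exceed it."""
    rng = np.random.default_rng(0)
    n = len(cal_labels)
    cal_scores = np.array([
        saps_style_score(p, y, lam, rng) for p, y in zip(cal_probs, cal_labels)
    ])
    # Conformal quantile with the standard finite-sample correction.
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(cal_scores, level, method="higher")
    num_classes = test_probs.shape[-1]
    return [c for c in range(num_classes)
            if saps_style_score(test_probs, c, lam, rng) <= q]
```

Because only `p_max` and the label rank enter the score, miscalibrated tail probabilities cannot inflate the prediction set, which is the behavior the abstract attributes to SAPS; how the penalty weight is chosen in practice is not specified here.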
