Augmenting the Softmax with Additional Confidence Scores for Improved Selective Classification with Out-of-Distribution Data
International Journal of Computer Vision (IF 19.5). Pub Date: 2024-04-23. DOI: 10.1007/s11263-024-02029-3
Guoxuan Xia , Christos-Savvas Bouganis

Detecting out-of-distribution (OOD) data is a task that is receiving an increasing amount of research attention in the domain of deep learning for computer vision. However, the performance of detection methods is generally evaluated on the task in isolation, rather than in tandem with potential downstream tasks. In this work, we examine selective classification in the presence of OOD data (SCOD). That is to say, the motivation for detecting OOD samples is to reject them, so that their impact on the quality of predictions is reduced. We show that, under this task specification, existing post-hoc methods perform quite differently compared to when evaluated only on OOD detection. This is because it is no longer an issue to conflate in-distribution (ID) data with OOD data if the ID data is going to be misclassified anyway. However, conflating correct and incorrect predictions within the ID data becomes undesirable. We also propose a novel method for SCOD, Softmax Information Retaining Combination (SIRC), which augments a softmax-based confidence score with a secondary, class-agnostic, feature-based score. Thus, the ability to identify OOD samples is improved without sacrificing the separation between correct and incorrect ID predictions. Experiments on a wide variety of ImageNet-scale datasets and convolutional neural network architectures show that SIRC is able to consistently match or outperform the baseline for SCOD, whilst existing OOD detection methods fail to do so. Interestingly, we find that the secondary scores investigated for SIRC do not consistently improve performance across all tested OOD datasets. To address this, we further extend SIRC to incorporate multiple secondary scores (SIRC+). This further improves SCOD performance, both in general and in terms of consistency over diverse distribution shifts. Code is available at https://github.com/Guoxoug/SIRC.
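Since the abstract only sketches the method, the following is a minimal, illustrative Python sketch of the general idea: combine a primary softmax confidence score with a class-agnostic secondary score so that the secondary score can only push confidence down for OOD-looking inputs. The exact combination form, the L1 feature-norm choice of secondary score, and the parameters `a` and `b` are assumptions made here for illustration; the authors' actual formulation should be taken from the linked repository.

```python
import numpy as np


def softmax_confidence(logits: np.ndarray) -> np.ndarray:
    """Primary score S1: maximum softmax probability (MSP)."""
    shifted = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    probs = np.exp(shifted)
    probs /= probs.sum(axis=-1, keepdims=True)
    return probs.max(axis=-1)


def feature_l1_norm(features: np.ndarray) -> np.ndarray:
    """Secondary score S2: class-agnostic L1 norm of penultimate features
    (one illustrative choice of feature-based score)."""
    return np.abs(features).sum(axis=-1)


def sirc_style_score(s1, s2, s1_max=1.0, a=0.0, b=1.0):
    """Illustrative SIRC-style combination (assumed form, not verbatim):
    scale the softmax "headroom" (s1_max - s1) by a factor driven by s2.
    For ID-looking inputs (large s2) the factor tends to 1, so the ranking
    follows s1 alone, retaining correct/incorrect ID separation; for
    OOD-looking inputs (small s2) confidence is pushed down further."""
    return -(s1_max - s1) * (1.0 + np.exp(-b * (s2 - a)))


def sirc_plus_style_score(s1, s2_list, s1_max=1.0, params=None):
    """Illustrative SIRC+-style extension: one factor per secondary score,
    so several complementary OOD signals can each lower confidence."""
    params = params or [(0.0, 1.0)] * len(s2_list)
    factor = 1.0
    for s2, (a, b) in zip(s2_list, params):
        factor *= 1.0 + np.exp(-b * (s2 - a))
    return -(s1_max - s1) * factor


# Toy usage: abstain on the lowest-scoring samples.
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 1000))    # stand-in classifier logits
features = rng.normal(size=(8, 2048))  # stand-in penultimate features

s1 = softmax_confidence(logits)
s2 = feature_l1_norm(features)
# a, b would in practice be set from held-out ID statistics of s2.
score = sirc_style_score(s1, s2, a=s2.mean(), b=1.0 / (s2.std() + 1e-8))
accept = score >= np.quantile(score, 0.25)  # reject bottom 25% of scores
print(accept)
```

The key design point this sketch tries to capture is asymmetry: the secondary score modulates the softmax score rather than replacing it, so ID correct-vs-incorrect ranking (which the softmax already provides) is retained while OOD samples are pushed towards rejection.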



Updated: 2024-04-23