Enhancing Confidence Calibration in Long-Tailed Recognition
Date
2024-06
Publisher
Indian Statistical Institute, Kolkata
Abstract
Deep neural networks often struggle with heavily class-imbalanced training datasets.
Recently, two-stage methods have been developed to separate representation learning
from classifier learning, aiming to enhance performance. However, the resulting
networks remain poorly calibrated: their predicted confidences do not match their
actual accuracies. To tackle this, we introduce novel methods that improve both
calibration and recognition performance in such scenarios.
Recognizing that predicted probability distributions of classes are closely tied
to the number of class instances, we propose label-aware smoothing with balanced
softmax. This strategy tackles the issue of differing levels of over-confidence among
different categories, thereby improving the learning process of classifiers. Furthermore,
to counteract the shift in feature statistics between the two stages caused by
their different sampling strategies, we incorporate shifted batch normalization
into the decoupling framework.
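The combination of balanced softmax and label-aware smoothing can be sketched as a single loss: class priors are added to the logits, and the label-smoothing factor for each sample is interpolated between a head value and a tail value according to its class frequency, so that over-represented (and typically over-confident) classes receive more smoothing. The function name, the linear interpolation of the smoothing factor, and the hyperparameter values below are illustrative assumptions, not the dissertation's exact formulation.

```python
import numpy as np

def label_aware_balanced_softmax_loss(logits, targets, class_counts,
                                      eps_head=0.1, eps_tail=0.0):
    """Illustrative sketch (hypothetical names and hyperparameters):
    cross-entropy combining (1) balanced softmax, which shifts each logit
    by the log class prior, and (2) label-aware smoothing, where frequent
    classes get a larger smoothing factor than rare ones."""
    counts = np.asarray(class_counts, dtype=float)
    logits = np.asarray(logits, dtype=float)

    # Balanced softmax: add log prior so head classes must earn larger margins.
    adjusted = logits + np.log(counts / counts.sum())

    # Numerically stable log-softmax over the adjusted logits.
    z = adjusted - adjusted.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))

    # Label-aware smoothing: interpolate eps linearly between the rarest
    # class (eps_tail) and the most frequent class (eps_head).
    n_max, n_min = counts.max(), counts.min()
    eps = eps_tail + (eps_head - eps_tail) * (counts - n_min) / max(n_max - n_min, 1e-12)
    eps_y = eps[targets]  # per-sample smoothing factor, set by the true class

    num_classes = logits.shape[1]
    one_hot = np.eye(num_classes)[targets]
    smooth = one_hot * (1.0 - eps_y[:, None]) + eps_y[:, None] / num_classes
    return -(smooth * log_probs).sum(axis=1).mean()
```

In this sketch the head class's targets are softened the most, which directly counteracts the per-class over-confidence the abstract describes.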
The methods we suggest establish new state-of-the-art results on several widely
used long-tailed recognition benchmarks, including CIFAR10-LT and CIFAR100-LT.
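The shifted batch normalization idea mentioned above can be illustrated by re-estimating batch-norm running statistics from batches drawn with the stage-2 (class-balanced) sampler, while the learned affine parameters are kept fixed. This is a minimal sketch under assumed names and a simplified one-layer setting, not the dissertation's implementation.

```python
import numpy as np

def recompute_bn_statistics(feature_batches, momentum=0.1):
    """Sketch (hypothetical function): re-estimate BatchNorm running
    mean/variance from feature batches produced under the stage-2
    sampling distribution, via the usual exponential moving average.
    The learned affine parameters (gamma, beta) would be left untouched;
    normalization is then x_hat = gamma * (x - mean) / sqrt(var + eps) + beta.
    """
    num_features = feature_batches[0].shape[1]
    mean = np.zeros(num_features)
    var = np.ones(num_features)
    for x in feature_batches:
        # Standard BN running-stat update, applied to the new distribution.
        mean = (1.0 - momentum) * mean + momentum * x.mean(axis=0)
        var = (1.0 - momentum) * var + momentum * x.var(axis=0)
    return mean, var
```

The point of the sketch is that statistics estimated under instance-balanced sampling in stage 1 no longer describe the class-balanced batches of stage 2, so they are re-estimated rather than reused.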
Description
Dissertation under the supervision of Dr. Swagatam Das
Keywords
Classification, Class imbalance, Balanced Softmax, Miscalibration.
Citation
52p.
