Abstract
Accurate multi-label image classification is essential for real-world applications, especially in scenarios with long-tailed class distributions, where a few classes appear frequently while many others are rare. This imbalance often yields biased models that struggle to recognize underrepresented classes. Existing methods either trade off performance between head and tail classes or rely on image captions, which limits their adaptability. To address these limitations, we propose LM-CLIP, a novel framework built around a unified loss function. Our Balanced Asymmetric Loss (BAL) extends the traditional asymmetric loss by emphasizing the gradients of rare positive samples on which the model is uncertain, mitigating bias toward dominant classes. It is complemented by a contrastive loss that pushes negative samples further from the decision boundary, yielding a better-structured embedding space even in long-tailed scenarios. Together, these loss functions ensure balanced performance across all classes. Our framework builds on pre-trained models that exploit textual and visual features learned from millions of image-text pairs. Furthermore, we incorporate a dynamic sampling strategy that prioritizes rare classes based on their frequency of occurrence, ensuring effective training without compromising overall performance. Experiments on the VOC-MLT and COCO-MLT benchmarks demonstrate the effectiveness of our approach, with improvements of +4.66% and +8.14% in mean Average Precision (mAP) over state-of-the-art methods.
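The abstract does not reproduce the exact BAL formulation, but it describes BAL as an extension of the conventional asymmetric loss. As background, the following minimal sketch shows that baseline asymmetric loss for one sample's label vector, with separate focusing exponents for positives and negatives and a probability shift that suppresses easy negatives; the function and parameter names (`gamma_pos`, `gamma_neg`, `clip`) are illustrative assumptions, not the paper's code.

```python
import math

def asymmetric_loss(logits, targets, gamma_pos=0.0, gamma_neg=4.0, clip=0.05):
    """Baseline asymmetric loss for multi-label classification,
    averaged over labels. BAL-style variants additionally re-weight
    the positive term to emphasize uncertain rare positives."""
    eps = 1e-8
    total = 0.0
    for logit, t in zip(logits, targets):
        p = 1.0 / (1.0 + math.exp(-logit))  # sigmoid probability
        if t == 1:
            # positive term: (1 - p)^gamma_pos focuses on uncertain positives
            total += -((1.0 - p) ** gamma_pos) * math.log(max(p, eps))
        else:
            # negative term: shift p down so easy negatives contribute ~0,
            # then apply a harder focusing exponent gamma_neg
            p_shift = max(p - clip, 0.0)
            total += -(p_shift ** gamma_neg) * math.log(max(1.0 - p_shift, eps))
    return total / len(logits)
```

With the default parameters, a confidently correct prediction incurs a near-zero loss, while a confidently wrong one is penalized heavily on both the missed positive and the hard negative.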
| Original language | English |
|---|---|
| Pages (from-to) | 71053-71065 |
| Number of pages | 13 |
| Journal | IEEE Access |
| Volume | 13 |
| DOIs | |
| State | Published - 2025 |
Keywords
- CLIP
- Long-tailed learning
- asymmetric loss
- balanced asymmetric loss
- class imbalance
- contrastive learning
- imbalanced sampling
- loss functions
- multi-label classification
- vision-language models
Fingerprint
Dive into the research topics of 'LM-CLIP: Adapting Positive Asymmetric Loss for Long-Tailed Multi-Label Classification'. Together they form a unique fingerprint.
Press/Media
- Researchers at Dongguk University Release New Data on Engineering (LM-CLIP: Adapting Positive Asymmetric Loss for Long-Tailed Multi-Label Classification), 9/05/25, 1 item of Media coverage