Uncertainty-Aware Active Meta-Learning for Few-Shot Text Classification

Sanghyun Seo, Hiskias Dingeto, Juntae Kim

Research output: Contribution to journal › Article › peer-review

Abstract

Low-resource natural language understanding remains a central challenge in natural language processing: as NLP takes center stage in machine learning, tasks with limited labeled data need effective solutions more than ever. This paper introduces Uncertainty-Aware Active Meta-Learning (UA-AML), a methodology designed to improve the efficiency of models on low-resource natural language understanding tasks, a setting where data availability is a common constraint. UA-AML selects high-quality tasks from the diverse pool of task data available during meta-training. By quantifying the model's prediction uncertainty for each input, we provide a loss function and learning strategy that adjust how strongly each input influences the model's learning. This ensures that the most relevant and informative data drive the learning process, improving both learning efficiency and model performance. We apply this meta-learning technique to low-resource natural language understanding tasks, namely few-shot relation classification and few-shot sentiment classification. Our experiments on the Amazon Review Sentiment Classification (ARSC) and FewRel datasets demonstrate that the technique can build low-resource natural language understanding models with improved performance, providing a robust solution when labeled data are scarce. This research extends meta-learning techniques beyond their traditional application in computer vision and demonstrates their potential in natural language processing. Our findings suggest that the methodology can be applied effectively in a wider range of areas, opening new avenues for future research in low-resource natural language understanding.
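The abstract describes weighting each input's influence on learning by the model's prediction uncertainty, but does not give the exact formulation. A minimal sketch of one plausible variant — per-example cross-entropy down-weighted by normalized predictive entropy — is shown below; the entropy-based weight and its normalization are illustrative assumptions, not the paper's exact loss.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predictive_entropy(probs):
    # Shannon entropy of the predictive distribution, one value per example.
    return -(probs * np.log(probs + 1e-12)).sum(axis=-1)

def uncertainty_weighted_loss(logits, labels, num_classes):
    """Cross-entropy where each example's contribution is scaled by
    (1 - normalized entropy): a hypothetical weighting scheme in the
    spirit of UA-AML, not the paper's published loss."""
    probs = softmax(logits)
    ce = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    h = predictive_entropy(probs)
    h_max = np.log(num_classes)       # maximum possible entropy
    weights = 1.0 - h / h_max         # confident predictions weigh more
    return float((weights * ce).mean())
```

The same per-example weights could equally serve the active task-selection step, e.g. by ranking candidate tasks by their mean predictive entropy and preferring the most informative ones.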

Original language: English
Article number: 3702
Journal: Applied Sciences (Switzerland)
Volume: 15
Issue number: 7
DOIs
State: Published - Apr 2025

Keywords

  • meta-learning
  • natural language processing
  • natural language understanding
  • uncertainty quantification
