Enhancing BERT Fine-Tuning for Sentiment Analysis in Lower-Resourced Languages

📅 2025-12-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the performance degradation of BERT fine-tuning for sentiment analysis in low-resource languages (Slovak, Maltese, Icelandic, Turkish), where labeled data are severely scarce, this paper proposes a dynamic sample-scheduling framework that integrates active learning with data clustering. The authors introduce an "active learning scheduler" that uses cluster structure to guide the iterative selection of informative samples under a strict annotation budget, improving data-utilization efficiency without compromising model stability. Experiments show that the approach reduces human annotation effort by up to 30% across the four target languages while improving F1 scores by up to four points. These findings indicate that structural awareness in dynamic scheduling alleviates the data bottleneck in low-resource settings, offering a lightweight, scalable optimization strategy for cross-lingual sentiment analysis.

📝 Abstract
Limited data for low-resource languages typically yield weaker language models (LMs). Since pre-training is compute-intensive, it is more pragmatic to target improvements during fine-tuning. In this work, we examine the use of Active Learning (AL) methods augmented by structured data selection strategies, which we term 'Active Learning schedulers', to boost the fine-tuning process with a limited amount of training data. We connect AL to data clustering and propose an integrated fine-tuning pipeline that systematically combines AL, clustering, and dynamic data selection schedulers to enhance the model's performance. Experiments in the Slovak, Maltese, Icelandic, and Turkish languages show that using clustering during the fine-tuning phase together with AL scheduling can simultaneously produce annotation savings of up to 30% and performance improvements of up to four F1 points, while also providing better fine-tuning stability.
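As a rough illustration of the cluster-guided sample selection the abstract describes (a sketch, not the authors' implementation — the function name, the use of k-means, and predictive entropy as the uncertainty signal are all assumptions), the core idea can be written in plain NumPy: cluster the unlabeled pool, then spend the annotation budget round-robin across clusters, taking the most uncertain sample from each.

```python
import numpy as np

def select_batch(embeddings, probs, n_clusters, budget, seed=0):
    """Cluster-aware uncertainty sampling (illustrative sketch).

    embeddings: (N, D) sentence embeddings of the unlabeled pool
    probs:      (N, C) model class probabilities for each sample
    Returns indices of at most `budget` samples, spread across clusters.
    """
    rng = np.random.default_rng(seed)
    # --- simple k-means (Lloyd's algorithm) over the pool embeddings ---
    centers = embeddings[rng.choice(len(embeddings), n_clusters, replace=False)]
    for _ in range(20):
        dists = np.linalg.norm(embeddings[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            if (labels == k).any():
                centers[k] = embeddings[labels == k].mean(axis=0)
    # --- predictive entropy as the uncertainty signal ---
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    # --- per-cluster queues, most uncertain sample first ---
    queues = [list(np.where(labels == k)[0][np.argsort(-entropy[labels == k])])
              for k in range(n_clusters)]
    # --- round-robin over clusters until the budget is spent ---
    picked = []
    while len(picked) < budget and any(queues):
        for bucket in queues:
            if bucket and len(picked) < budget:
                picked.append(bucket.pop(0))
    return picked
```

Selecting round-robin across clusters is one plausible way to combine diversity (cluster coverage) with informativeness (entropy); the paper's actual scheduler may weight or order clusters differently.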
Problem

Research questions and friction points this paper is trying to address.

Improves BERT fine-tuning for low-resource language sentiment analysis.
Uses active learning and clustering to reduce annotation needs.
Enhances model performance and stability with limited training data.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Active Learning with structured data selection schedulers
Integrated pipeline combining AL, clustering, dynamic selection
Clustering during fine-tuning with AL scheduling for efficiency
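The "dynamic selection scheduler" above implies an outer loop that decides how much of the annotation budget each AL round receives. One hypothetical scheduling policy (not taken from the paper — the front-loaded linear taper is purely an assumption for illustration) could look like:

```python
import numpy as np

def al_schedule(total_budget, n_rounds, taper=0.4):
    """Hypothetical annotation-budget scheduler: spend a larger share of
    the labeling budget in early rounds, tapering off linearly."""
    weights = np.linspace(1.0, 1.0 - taper, n_rounds)
    weights /= weights.sum()                       # normalize to fractions
    sizes = np.floor(weights * total_budget).astype(int)
    sizes[-1] += total_budget - sizes.sum()        # remainder to the last round
    return sizes.tolist()
```

For example, `al_schedule(100, 5)` yields five per-round batch sizes that sum to exactly 100, with earlier rounds larger than later ones.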
Jozef Kubík
Faculty of Mathematics Physics and Informatics, Comenius University in Bratislava, Slovakia
Marek Šuppa
Comenius University in Bratislava
Natural Language Processing · Computer Vision · Machine Learning
Martin Takáč
Faculty of Mathematics Physics and Informatics, Comenius University in Bratislava, Slovakia