Multi-Scale and Multi-Objective Optimization for Cross-Lingual Aspect-Based Sentiment Analysis

📅 2025-02-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the poor cross-lingual robustness of aspect term representations and insufficient fine-grained semantic alignment in cross-lingual Aspect-Based Sentiment Analysis (ABSA), this paper proposes a synergistic framework integrating multi-scale alignment and multi-objective optimization. Specifically, it jointly models semantic alignment at both sentence-level and aspect-level granularities; incorporates supervised loss with consistency regularization for collaborative optimization; employs code-mixed bilingual sentence augmentation to enhance discriminative robustness; and integrates target-language knowledge distillation. This work is the first to unify multi-scale alignment, consistency training, language discrimination, code-switching augmentation, and knowledge distillation within a single cross-lingual ABSA transfer paradigm. Extensive experiments across multiple language pairs and benchmark datasets demonstrate state-of-the-art performance, with significant improvements in both aspect term identification and sentiment classification accuracy.
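The multi-objective setup described above can be sketched as a weighted sum of three terms: a supervised token-level loss, a consistency term between predictions on the original sentence and its code-switched counterpart, and a distillation term against a target-language teacher. Below is a minimal NumPy sketch; the weighting coefficients `lambda_cons` and `lambda_kd`, the KL-based formulation of the consistency and distillation terms, and all function names are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the gold tag at each token position.
    return float(-np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12)))

def kl_divergence(p, q):
    # Mean per-token KL(p || q); used for both consistency and distillation.
    return float(np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)))

def multi_objective_loss(probs_src, probs_cs, probs_teacher, labels,
                         lambda_cons=0.5, lambda_kd=0.5):
    """Combine the three objectives into one scalar.

    probs_src     : (T, C) label distributions on the source sentence
    probs_cs      : (T, C) distributions on its code-switched counterpart
    probs_teacher : (T, C) distributions from a target-language teacher
    labels        : (T,)   gold tag indices
    """
    l_sup  = cross_entropy(probs_src, labels)        # supervised objective
    l_cons = kl_divergence(probs_src, probs_cs)      # consistency regularization
    l_kd   = kl_divergence(probs_teacher, probs_src) # target-language distillation
    return l_sup + lambda_cons * l_cons + lambda_kd * l_kd
```

When the three distributions coincide, the consistency and distillation terms vanish and the loss reduces to the supervised cross-entropy; the lambdas then control how strongly the alignment objectives pull on the shared encoder.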

📝 Abstract
Aspect-based sentiment analysis (ABSA) is a sequence labeling task that has garnered growing research interest in multilingual contexts. However, recent studies lack sufficiently robust feature alignment and finer-grained aspect-level alignment. In this paper, we propose a novel framework, Multi-Scale and Multi-Objective Optimization (MSMO), for cross-lingual ABSA. During multi-scale alignment, we achieve cross-lingual sentence-level and aspect-level alignment, aligning features of aspect terms across different contextual environments. Specifically, we introduce code-switched bilingual sentences into the language discriminator and consistency training modules to enhance the model's robustness. During multi-objective optimization, we design two optimization objectives, supervised training and consistency training, aiming to enhance cross-lingual semantic alignment. To further improve model performance, we incorporate distilled knowledge of the target language into the model. Results show that MSMO significantly enhances cross-lingual ABSA, achieving state-of-the-art performance across multiple languages and models.
Problem

Research questions and friction points this paper is trying to address.

Enhances cross-lingual aspect-based sentiment analysis
Improves feature and aspect-level alignment
Introduces multi-scale and multi-objective optimization framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-scale feature alignment
Code-switched bilingual sentences
Distilled target language knowledge
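The code-switched augmentation listed above can be illustrated with a simple dictionary-based substitution: tokens in the source sentence that have a known target-language translation are swapped in place, so the model sees both languages within a single context. This is a minimal sketch; the bilingual dictionary, the replacement probability `p`, and the function name are illustrative assumptions rather than the paper's procedure:

```python
import random

def code_switch(tokens, bilingual_dict, p=1.0, rng=None):
    """Replace tokens that appear in a source-to-target dictionary.

    tokens         : list of source-language tokens
    bilingual_dict : maps a source token to its target-language translation
    p              : probability of switching each dictionary hit
    """
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    return [bilingual_dict[t] if t in bilingual_dict and rng.random() < p else t
            for t in tokens]
```

With `p=1.0` every dictionary hit is switched, yielding a fully mixed bilingual sentence; lower values interpolate between the source sentence and its code-switched form.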
Chengyan Wu
South China Normal University & Sun Yat-sen University
Machine Learning, NLP, Multilingual IE, LLM Alignment, Multimodal
Bolei Ma
LMU Munich
Linguistics, Natural Language Processing, Computational Social Science
Ningyuan Deng
Institute of Scientific and Technical Information of China
Yanqing He
Institute of Scientific and Technical Information of China
Yun Xue
Guangdong Provincial Key Laboratory of Quantum Engineering and Quantum Materials, School of Electronic Science and Engineering (School of Microelectronics), South China Normal University