🤖 AI Summary
This work addresses micro-expression recognition, a task made difficult by the extremely short duration and low intensity of micro-expressions, which prevent conventional optical flow methods from capturing discriminative features effectively. To overcome this limitation, the authors propose a dual-branch feature extraction network that integrates residual structures to mitigate gradient vanishing and Inception modules to enhance multi-scale representation. A parallel attention mechanism, combined with an adaptive feature fusion module, is further introduced to integrate multi-source features efficiently. Evaluated on the CASME II dataset, the proposed method achieves a recognition accuracy of 74.67%, outperforming LBP-TOP by 11.26% and MSMMT by 3.36%, demonstrating improved robustness and accuracy in micro-expression recognition.
📝 Abstract
Micro-expressions, characterized by transience and subtlety, pose challenges to existing optical flow-based recognition methods. To address these challenges, this paper proposes a dual-branch micro-expression feature extraction network integrated with parallel attention. Key contributions include: 1) a residual network designed to alleviate gradient vanishing and network degradation; 2) an Inception network constructed to enhance model representation and suppress interference from irrelevant regions; 3) an adaptive feature fusion module developed to integrate dual-branch features. Experiments on the CASME II dataset demonstrate that the proposed method achieves 74.67% accuracy, outperforming LBP-TOP (by 11.26%), MSMMT (by 3.36%), and other comparative methods.
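The abstract does not specify how the adaptive feature fusion module combines the residual-branch and Inception-branch features. A common realization of such a module is a learned convex combination of the two branch outputs, with softmax-normalized weights so the fusion coefficients sum to one. The sketch below illustrates that idea only; the function and parameter names (`adaptive_fuse`, `logits`) are hypothetical and not taken from the paper.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array of fusion logits."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def adaptive_fuse(feat_residual, feat_inception, logits):
    """Fuse two same-shaped branch feature maps by a learned convex
    combination: weights = softmax(logits), so the weights sum to 1.
    In training, `logits` would be learnable parameters updated by
    backpropagation; here they are plain numbers for illustration."""
    w = softmax(np.asarray(logits, dtype=float))
    return w[0] * feat_residual + w[1] * feat_inception

# Toy example: two 4x4 single-channel feature maps.
feat_a = np.ones((4, 4))    # stand-in for the residual-branch output
feat_b = np.zeros((4, 4))   # stand-in for the Inception-branch output

# Equal logits give equal weights (0.5 each), so the fused map is 0.5 everywhere.
fused = adaptive_fuse(feat_a, feat_b, logits=[0.0, 0.0])
```

Because the weights are normalized, the fused feature stays on the same scale as the inputs regardless of how the logits drift during training, which is one reason this form of fusion is popular.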