🤖 AI Summary
This study addresses prosodic prominence detection in conversational Austrian German by proposing a salience-aware automatic speech recognition (ASR) framework that jointly performs word-level transcription and prosodic prominence classification. Methodologically, wav2vec 2.0 models are first fine-tuned as word-level prominence detectors; the resulting detector is then used to automatically annotate prosodic prominence in a large corpus, and prominence-aware ASR systems are trained on those annotations to transcribe words together with their prominence levels. Experiments show that integrating prominence information leaves ASR performance unchanged relative to the baseline, while the model reaches 85.53% prominence detection accuracy on utterances whose word sequence was recognized correctly, confirming that transformer-based models can effectively encode prosodic salience in conversational speech. This work contributes a novel approach to prosody-enhanced ASR, with potential applications in linguistic research and prosody-informed dialogue systems for low-resource varieties.
📝 Abstract
This paper investigates prominence-aware automatic speech recognition (ASR) by combining prominence detection and speech recognition for conversational Austrian German. First, prominence detectors were developed by fine-tuning wav2vec 2.0 models to classify word-level prominence. The resulting detector was then used to automatically annotate prosodic prominence in a large corpus. Based on those annotations, we trained novel prominence-aware ASR systems that simultaneously transcribe words and their prominence levels. The integration of prominence information did not change performance compared to our baseline ASR system, while reaching a prominence detection accuracy of 85.53% for utterances where the recognized word sequence was correct. This paper shows that transformer-based models can effectively encode prosodic information and represents a novel contribution to prosody-enhanced ASR, with potential applications for linguistic research and prosody-informed dialogue systems.
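To make the joint-output idea concrete, here is a minimal sketch of one plausible way such a system might serialize and parse word-plus-prominence hypotheses. The tag scheme (`word|P<level>`), the three-level label set, and the helper names are illustrative assumptions, not the paper's actual format.

```python
# Hypothetical sketch (not the paper's actual output format): each
# recognized word is paired with a discrete prominence level, and the
# joint hypothesis is a single tag-augmented token sequence.

from dataclasses import dataclass

# Assumed label set: 0 = non-prominent, 1 = weak, 2 = strong prominence.
PROMINENCE_LEVELS = (0, 1, 2)


@dataclass
class ProminentWord:
    word: str
    prominence: int


def encode(words):
    """Serialize (word, prominence) pairs into one tag-augmented string."""
    return " ".join(f"{w.word}|P{w.prominence}" for w in words)


def decode(sequence):
    """Split a tag-augmented hypothesis back into words and labels."""
    out = []
    for token in sequence.split():
        word, _, tag = token.rpartition("|P")
        out.append(ProminentWord(word=word, prominence=int(tag)))
    return out
```

A representation like this lets a single decoder emit both the transcript and its prominence labels, so evaluation can separately score the word sequence (WER) and, on correctly recognized utterances, the prominence labels (as in the reported 85.53% accuracy).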