🤖 AI Summary
Connectionist Temporal Classification (CTC) formally assumes conditional independence among output labels, yet empirical evidence suggests that strong encoders implicitly learn context-dependent internal language models (ILMs). Method: This work formally models and empirically validates this implicit contextual dependency for the first time; it proposes a label-level knowledge distillation framework that extracts context-dependent ILMs from CTC models without supervision, and introduces two regularization strategies, smoothing and label-level contextual modeling, to relax the conventional label-independence assumption. Contribution/Results: On the cross-domain TED-LIUM benchmark, the proposed approach reduces word error rate (WER) by more than 13% relative to shallow fusion, significantly outperforming context-independent priors. These results demonstrate CTC's capacity to learn a strong implicit ILM, challenging the standard independence assumption and enabling more effective integration of contextual knowledge in end-to-end speech recognition.
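The label-level distillation described above can be pictured as training a small student ILM to match the label distribution extracted from the CTC model. The sketch below is a minimal, hypothetical illustration of that objective (a KL divergence between a teacher label posterior and a student ILM prediction at one label position); the distributions and the three-label vocabulary are invented for illustration and are not taken from the paper.

```python
import math

def kl_divergence(p_teacher, q_student):
    """KL(p || q) over a label vocabulary, given as lists of probabilities.

    This is the per-position distillation loss: the student ILM is pushed
    toward the teacher's (CTC-derived) label distribution.
    """
    return sum(p * math.log(p / q)
               for p, q in zip(p_teacher, q_student) if p > 0)

# Teacher: a label posterior distilled from the CTC model at one step.
# Student: a small context-dependent ILM in training (values hypothetical).
teacher = [0.7, 0.2, 0.1]
student = [0.6, 0.3, 0.1]
loss = kl_divergence(teacher, student)
```

Summed over label positions and averaged over utterances, this kind of loss lets the student learn the context dependency implicit in the CTC model without any transcript supervision; the smoothing regularization mentioned above would flatten the teacher distribution before computing the loss.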
📝 Abstract
Although connectionist temporal classification (CTC) makes a label-context-independence assumption, it can still implicitly learn a context-dependent internal language model (ILM) thanks to modern powerful encoders. In this work, we investigate the implicit context dependency modeled in the ILM of CTC. To this end, we propose novel context-dependent ILM estimation methods for CTC based on knowledge distillation (KD), with theoretical justifications. Furthermore, we introduce two regularization methods for KD. We conduct experiments on the Librispeech and TED-LIUM Release 2 datasets for in-domain and cross-domain evaluation, respectively. Experimental results show that context-dependent ILMs outperform context-independent priors in cross-domain evaluation, indicating that CTC learns a context-dependent ILM. The proposed label-level KD with smoothing surpasses the other ILM estimation approaches, with more than 13% relative improvement in word error rate over shallow fusion.
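To make the role of the estimated ILM concrete, the sketch below shows how it is typically used at decoding time: shallow fusion adds an external LM score to the CTC score, while ILM-aware integration additionally subtracts the estimated internal LM score. The function, weights, and log-probabilities are hypothetical placeholders, not values from the paper.

```python
def combined_score(log_p_ctc, log_p_ext_lm, log_p_ilm,
                   lm_weight=0.6, ilm_weight=0.4):
    """Score one hypothesis during decoding.

    Shallow fusion corresponds to ilm_weight = 0; subtracting the
    estimated ILM score corrects for the context the CTC model has
    already learned implicitly (weights are hypothetical).
    """
    return log_p_ctc + lm_weight * log_p_ext_lm - ilm_weight * log_p_ilm

# Toy comparison of two hypotheses (log-probabilities are illustrative):
hyp_a = combined_score(-5.0, -3.0, -2.5)
hyp_b = combined_score(-5.2, -2.0, -3.0)
best = "A" if hyp_a > hyp_b else "B"
```

In cross-domain evaluation, this subtraction matters: the more context the CTC model has absorbed from its training domain, the more its implicit ILM biases decoding against the target-domain external LM.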