🤖 AI Summary
When reference information—such as target speaker direction of arrival (DOA) or spectral templates—is inaccurate, conventional multi-channel speaker extraction systems suffer from poor robustness. To address this, we propose a spatial-spectral dual-cue adaptive fusion method. Leveraging a deep neural network architecture, we introduce a dynamic gating mechanism to enable real-time, weighted fusion of spatial and spectral features. Furthermore, the framework integrates robust DOA estimation with noise-aware spectral registration, allowing the system to actively suppress interference from either modality when it severely degrades. This work is the first to achieve end-to-end adaptive collaboration between spatial and spectral cues. Experimental results demonstrate substantial improvements in robustness against reference mismatch: under severe DOA and spectral template deviations, the proposed method achieves a 3.2 dB gain in SI-SNR improvement (SI-SNRi) over single-modality baselines, validating its effectiveness and practicality.
📝 Abstract
This paper presents a robust multi-channel speaker extraction algorithm designed to handle inaccuracies in reference information. While existing approaches often rely solely on either spatial or spectral cues to identify the target speaker, our method integrates both sources of information to enhance robustness. A key aspect of our approach is its emphasis on stability, ensuring reliable performance even when one of the features is degraded or misleading. Given a noisy mixture and two potentially unreliable cues, a dedicated network is trained to dynamically balance their contributions-or disregard the less informative one when necessary. We evaluate the system under challenging conditions by simulating inference-time errors using a simple direction of arrival (DOA) estimator and a noisy spectral enrollment process. Experimental results demonstrate that the proposed model successfully extracts the desired speaker even in the presence of substantial reference inaccuracies.