🤖 AI Summary
To address the scarcity of high-quality recorded speech that limits multilingual text-to-speech (TTS) systems, this paper proposes Sidon, an open-source speech restoration framework that efficiently converts noisy, in-the-wild speech into studio-quality speech. Methodologically, it pairs a feature predictor finetuned from w2v-BERT 2.0, which extracts cleansed acoustic features from noisy input, with a vocoder that synthesizes restored speech from those features. The system runs up to 3,390 times faster than real time on a single GPU and scales to dozens of languages. Key contributions include: (1) a fully open-source framework achieving restoration performance comparable to Google's internal Miipher model; and (2) a demonstration that training a TTS model on a Sidon-cleansed automatic speech recognition (ASR) corpus improves the naturalness and intelligibility of zero-shot synthetic speech, showing the framework's effectiveness and practicality for large-scale, real-world multilingual speech corpus cleansing.
📝 Abstract
Large-scale text-to-speech (TTS) systems are limited by the scarcity of clean, multilingual recordings. We introduce Sidon, a fast, open-source speech restoration model that converts noisy in-the-wild speech into studio-quality speech and scales to dozens of languages. Sidon consists of two models: a feature predictor, finetuned from w2v-BERT 2.0, that cleanses features extracted from noisy speech, and a vocoder trained to synthesize restored speech from the cleansed features. Sidon achieves restoration performance comparable to Miipher, Google's internal speech restoration model aimed at dataset cleansing for speech synthesis. Sidon is also computationally efficient, running up to 3,390 times faster than real time on a single GPU. We further show that training a TTS model on a Sidon-cleansed automatic speech recognition corpus improves the quality of synthetic speech in a zero-shot setting. Code and models are released to facilitate reproducible dataset cleansing for the research community.
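The two-stage design described above (noisy audio → cleansed features → restored waveform) can be sketched as a minimal pipeline. This is a toy illustration of the data flow only, not the released Sidon code: the class names, frame size, and the random linear maps standing in for the w2v-BERT 2.0 predictor and the vocoder are all hypothetical placeholders.

```python
import numpy as np

FRAME = 160      # samples per frame (assumed; 10 ms at 16 kHz)
FEAT_DIM = 8     # toy feature dimensionality (assumed)

class FeaturePredictor:
    """Placeholder for the finetuned w2v-BERT 2.0 predictor:
    maps noisy audio frames to 'cleansed' acoustic features.
    Here just a fixed random projection."""
    def __init__(self):
        rng = np.random.default_rng(0)
        self.proj = rng.standard_normal((FRAME, FEAT_DIM)) / FRAME

    def __call__(self, wav: np.ndarray) -> np.ndarray:
        n = len(wav) // FRAME
        frames = wav[: n * FRAME].reshape(n, FRAME)
        return frames @ self.proj          # (n_frames, FEAT_DIM)

class Vocoder:
    """Placeholder for the vocoder: synthesizes a waveform from
    cleansed features. Here just a fixed random upsampling."""
    def __init__(self):
        rng = np.random.default_rng(1)
        self.deproj = rng.standard_normal((FEAT_DIM, FRAME)) / FEAT_DIM

    def __call__(self, feats: np.ndarray) -> np.ndarray:
        return (feats @ self.deproj).reshape(-1)   # (n_frames * FRAME,)

def restore(wav: np.ndarray, predictor: FeaturePredictor, vocoder: Vocoder) -> np.ndarray:
    """End-to-end restoration: predictor cleanses, vocoder resynthesizes."""
    return vocoder(predictor(wav))

noisy = np.random.default_rng(2).standard_normal(16000)  # 1 s of fake noisy audio
clean = restore(noisy, FeaturePredictor(), Vocoder())
print(clean.shape)  # (16000,)
```

Decoupling the feature predictor from the vocoder in this way is what allows each stage to be trained and swapped independently; the real system's speed claim (up to 3,390× real time) comes from running both stages as batched GPU inference.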