Mitigating Premature Discretization with Progressive Quantization for Robust Vector Tokenization

πŸ“… 2026-03-17
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
Existing vector quantization methods suffer from premature discretization, which hinders the encoder’s ability to fully learn the underlying data manifold and limits representational capacity. To address this, this work proposes Progressive Vector Quantization (ProVQ), which, for the first time, formulates the quantization process as a learnable curriculum schedule. By employing a smooth annealing strategy that transitions gradually from continuous to discrete representations, ProVQ dynamically modulates quantization difficulty to guide the codebook toward progressively covering the data manifold. This approach significantly enhances alignment between the codebook and the data distribution, leading to improved reconstruction and generative performance on ImageNet-1K and ImageNet-100, and achieving state-of-the-art results on the StrutTokenBench protein structure tokenization benchmark.

πŸ“ Abstract
Vector Quantization (VQ) has become the cornerstone of tokenization for many multimodal Large Language Models and diffusion-based synthesis. However, existing VQ paradigms suffer from a fundamental conflict: they enforce discretization before the encoder has captured the underlying data manifold. We term this phenomenon Premature Discretization. To resolve this, we propose Progressive Quantization (ProVQ), which incorporates the dynamics of quantization hardness as a fundamental yet previously overlooked axis in VQ training. By treating quantization as a curriculum that smoothly anneals from a continuous latent space to a discrete one, ProVQ effectively guides the codebook toward well-expanded manifolds. Extensive experimental results demonstrate the broad effectiveness of ProVQ across diverse modalities. We report improved reconstruction and generative performance on the ImageNet-1K and ImageNet-100 benchmarks, highlighting ProVQ's benefit for generative modeling. Furthermore, ProVQ proves highly effective for modeling complex biological sequences, establishing a new performance ceiling for protein structure tokenization on the StrutTokenBench leaderboard.
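The page does not reproduce the paper's exact formulation, but the continuous-to-discrete curriculum described above can be sketched in a few lines. The sketch below is a hypothetical illustration, not the authors' implementation: it assumes the quantizer output is a blend `(1 - alpha) * z + alpha * q(z)` of the continuous latent `z` and its nearest codebook vector `q(z)`, with a blend coefficient `alpha` smoothly annealed from 0 (fully continuous) to 1 (standard VQ) over training. The cosine schedule and all names (`nearest_code`, `progressive_quantize`, `anneal_schedule`) are assumptions chosen for clarity.

```python
import numpy as np

def nearest_code(z, codebook):
    # Return the codebook vector closest to z under L2 distance.
    d = ((codebook - z) ** 2).sum(axis=1)
    return codebook[np.argmin(d)]

def progressive_quantize(z, codebook, alpha):
    # Blend the continuous latent with its quantized version.
    # alpha=0 keeps the latent fully continuous; alpha=1 recovers hard VQ.
    z_q = nearest_code(z, codebook)
    return (1.0 - alpha) * z + alpha * z_q

def anneal_schedule(step, total_steps):
    # One plausible smooth schedule: a cosine ramp from 0 to 1.
    # (An assumption for illustration, not the paper's exact schedule.)
    t = min(step / total_steps, 1.0)
    return 0.5 * (1.0 - np.cos(np.pi * t))

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))  # 8 codes, 4-dimensional latents
z = rng.normal(size=4)              # a stand-in encoder output

for step in (0, 500, 1000):
    alpha = anneal_schedule(step, total_steps=1000)
    out = progressive_quantize(z, codebook, alpha)
    print(f"step={step} alpha={alpha:.2f}")
```

Early in training the encoder receives (near-)continuous latents and can shape the manifold freely; as `alpha` approaches 1, every latent snaps to a codebook entry, so the model ends training as an ordinary vector quantizer.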
Problem

Research questions and friction points this paper is trying to address.

Premature Discretization
Vector Quantization
Tokenization
Discretization
Latent Space
Innovation

Methods, ideas, or system contributions that make the work stand out.

Progressive Quantization
Vector Quantization
Premature Discretization
Curriculum Learning
Multimodal Tokenization
Wenhao Zhao
National University of Singapore, Singapore
Qiran Zou
NUS | THU
Computer Vision, Machine Learning
Zhouhan Lin
Shanghai Jiao Tong University, Shanghai, China
Dianbo Liu
Assistant professor, National University of Singapore
Push the limits of human-machine learning in biomedical sciences