Adaptive Visual Autoregressive Acceleration via Dual-Linkage Entropy Analysis

📅 2026-02-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the high computational cost of visual autoregressive models, which stems from processing massive token sequences. Existing acceleration methods are limited by heuristic phase partitioning, non-adaptive scheduling, and coarse-grained token pruning. To overcome these limitations, the paper proposes NOVA, a framework that introduces, for the first time, the inflection point of scale-wise entropy growth as a dynamic evolution indicator, enabling adaptive, fine-grained token pruning without any additional training. NOVA employs a dual-linkage mechanism across scales and layers to dynamically adjust pruning ratios per layer, combined with low-entropy token removal and cross-scale residual cache reuse. Experiments demonstrate that NOVA achieves efficient, training-free acceleration of visual autoregressive generation while preserving output quality.
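The summary's core idea, using the inflection point of scale-wise entropy growth to decide when to activate acceleration, can be sketched as follows. This is a minimal illustration, not the paper's implementation: `token_entropy` and `find_inflection_scale` are hypothetical names, and the inflection criterion (first scale at which entropy growth starts to slow) is an assumption about what "inflection point of scale entropy growth" means.

```python
import numpy as np

def token_entropy(logits):
    """Per-token predictive entropy from logits of shape (num_tokens, vocab)."""
    # Numerically stable softmax.
    z = logits - logits.max(axis=-1, keepdims=True)
    p = np.exp(z)
    p /= p.sum(axis=-1, keepdims=True)
    return -(p * np.log(p + 1e-12)).sum(axis=-1)

def find_inflection_scale(scale_entropies):
    """Return the first scale index at which mean entropy growth slows,
    i.e. the first-difference of the per-scale mean entropy decreases.
    Assumed proxy for the 'inflection point of scale entropy growth'."""
    means = np.asarray(scale_entropies, dtype=float)
    growth = np.diff(means)               # entropy growth between scales
    for k in range(1, len(growth)):
        if growth[k] < growth[k - 1]:     # growth slowing down: inflection
            return k + 1                  # activate acceleration from here
    return len(means)                     # no inflection: never activate
```

In an online setting, the per-scale mean entropies would be appended as each scale is generated, and acceleration would switch on as soon as the slowdown is observed.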

📝 Abstract
Visual AutoRegressive modeling (VAR) suffers from substantial computational cost due to the massive token count involved. Failing to account for the continuous evolution of modeling dynamics, existing VAR token reduction methods face three key limitations: heuristic stage partitioning, non-adaptive schedules, and limited acceleration scope, thereby leaving significant acceleration potential untapped. Since entropy variation intrinsically reflects the transition of predictive uncertainty, it offers a principled measure of how modeling dynamics evolve. We therefore propose NOVA, a training-free token reduction acceleration framework for VAR models based on entropy analysis. NOVA adaptively determines the acceleration activation scale during inference by identifying the inflection point of scale-wise entropy growth online. Through scale-linkage and layer-linkage ratio adjustment, NOVA dynamically computes distinct token reduction ratios for each scale and layer, pruning low-entropy tokens while reusing the cache derived from the residuals at the prior scale to accelerate inference and maintain generation quality. Extensive experiments and analyses validate NOVA as a simple yet effective training-free acceleration framework.
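The abstract's scale- and layer-linkage ratio adjustment and low-entropy pruning can be illustrated with a short sketch. Both functions and the monotone schedule (pruning more aggressively at later scales and deeper layers) are assumptions for illustration; the paper's actual linkage formula is not specified here.

```python
import numpy as np

def linkage_ratio(base_ratio, scale_idx, num_scales, layer_idx, num_layers):
    """Hypothetical scale-/layer-linkage schedule: scale the base reduction
    ratio up monotonically with scale depth and layer depth."""
    scale_factor = (scale_idx + 1) / num_scales
    layer_factor = (layer_idx + 1) / num_layers
    return base_ratio * scale_factor * layer_factor

def prune_low_entropy_tokens(tokens, entropies, ratio):
    """Drop the `ratio` fraction of tokens with the lowest entropy,
    keeping the rest in their original order."""
    n = len(tokens)
    keep = max(1, int(round(n * (1.0 - ratio))))
    order = np.argsort(entropies)[::-1]   # indices by descending entropy
    kept_idx = np.sort(order[:keep])      # restore original token order
    return [tokens[i] for i in kept_idx], kept_idx
```

Pruned positions would then be filled from the cache derived from the prior scale's residuals rather than recomputed, which is where the cross-scale cache reuse described above comes in.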
Problem

Research questions and friction points this paper is trying to address.

Visual AutoRegressive
token reduction
computational cost
modeling dynamics
acceleration
Innovation

Methods, ideas, or system contributions that make the work stand out.

token reduction
entropy analysis
visual autoregressive modeling
inference acceleration
adaptive pruning