ToProVAR: Efficient Visual Autoregressive Modeling via Tri-Dimensional Entropy-Aware Semantic Analysis and Sparsity Optimization

📅 2026-02-26
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the significant efficiency bottleneck in the late-stage generation of visual autoregressive (VAR) models. The authors propose a three-dimensional sparsity modeling approach—spanning tokens, layers, and scales—based on attention entropy to identify and exploit fine-grained semantic sparsity patterns. Unlike conventional heuristic skipping strategies, this method dynamically leverages multi-dimensional sparsity for accelerated inference without compromising generation quality. Evaluated on Infinity-2B and Infinity-8B models, the approach achieves up to 3.4× speedup while preserving high-fidelity semantic details, substantially outperforming existing techniques.

📝 Abstract
Visual Autoregressive (VAR) models enhance generation quality but face a critical efficiency bottleneck in later stages. In this paper, we present a novel optimization framework for VAR models that fundamentally differs from prior approaches such as FastVAR and SkipVAR. Instead of relying on heuristic skipping strategies, our method leverages attention entropy to characterize the semantic projections across different dimensions of the model architecture. This enables precise identification of parameter dynamics under varying token granularity levels, semantic scopes, and generation scales. Building on this analysis, we further uncover sparsity patterns along three critical dimensions (token, layer, and scale) and propose a set of fine-grained optimization strategies tailored to these patterns. Extensive evaluation demonstrates that our approach achieves aggressive acceleration of the generation process while significantly preserving semantic fidelity and fine details, outperforming traditional methods in both efficiency and quality. Experiments on Infinity-2B and Infinity-8B models demonstrate that ToProVAR achieves up to 3.4× acceleration with minimal quality loss, effectively mitigating the issues found in prior work. Our code will be made publicly available.
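The abstract's core ingredient is attention entropy as a sparsity signal: queries whose attention mass is concentrated on few keys carry little new semantic information and are candidates for skipping. ToProVAR's exact scoring and thresholding rules are not given on this page, so the sketch below only illustrates the general idea; the `threshold` value and the choice that low entropy marks a token as skippable are assumptions for illustration, not the paper's method.

```python
import numpy as np

def attention_entropy(attn, eps=1e-9):
    """Shannon entropy of each query's attention distribution.

    attn: array of shape (heads, queries, keys), softmax-normalized.
    Returns per-query entropy averaged over heads, shape (queries,).
    """
    ent = -(attn * np.log(attn + eps)).sum(axis=-1)  # (heads, queries)
    return ent.mean(axis=0)

def sparsity_mask(attn, threshold):
    """Mark low-entropy queries as skippable: their attention is
    concentrated on a few keys, so recomputing them changes little.
    (Threshold direction is an illustrative assumption.)"""
    return attention_entropy(attn) < threshold

# Toy example: 2 heads, 4 query tokens, 8 keys.
rng = np.random.default_rng(0)
logits = rng.normal(size=(2, 4, 8))
attn = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
mask = sparsity_mask(attn, threshold=1.5)
print(mask)  # one boolean skip decision per token position
```

In the tri-dimensional setting described above, such a score would be aggregated not only per token but also per layer and per generation scale, yielding three sparsity maps that jointly decide what to skip.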
Problem

Research questions and friction points this paper is trying to address.

Visual Autoregressive
efficiency bottleneck
generation acceleration
semantic fidelity
sparsity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Visual Autoregressive
Entropy-Aware Analysis
Sparsity Optimization
Tri-Dimensional Sparsity
Efficient Generation
🔎 Similar Papers
2024-03-04 · Computer Vision and Pattern Recognition · Citations: 3
Jiayu Chen
PhD student, IFLab@PKU
Efficient Visual Generation · ML Systems

Ruoyu Lin
School of Electronics Engineering and Computer Science, Peking University

Zihao Zheng
Peking University
Machine Learning Systems · Edge Computing · Computer Architecture · EDA

Jingxin Li
School of Electronics Engineering and Computer Science, Peking University

Maoliang Li
School of Computer Science, Peking University

Guojie Luo
Peking University
Electronic Design Automation · Reconfigurable Architecture

Xiang Chen
School of Computer Science, Peking University