PointNSP: Autoregressive 3D Point Cloud Generation with Next-Scale Level-of-Detail Prediction

📅 2025-10-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Autoregressive point cloud generation has long suffered from modeling long-range dependencies due to hand-crafted ordering, hindering global structural fidelity (e.g., symmetry, topological consistency). To address this, we propose PointNSP, a multi-scale autoregressive framework based on next-scale detail prediction. It adopts a coarse-to-fine progressive generation paradigm that aligns autoregression with point-set permutation invariance. Crucially, it introduces hierarchical detail prediction and cross-scale contextual interaction to explicitly enforce global geometric constraints. On ShapeNet, PointNSP achieves state-of-the-art autoregressive performance for dense 8,192-point generation—surpassing leading diffusion models for the first time—while requiring fewer parameters and significantly lower training and inference overhead.

📝 Abstract
Autoregressive point cloud generation has long lagged behind diffusion-based approaches in quality. The performance gap stems from the fact that autoregressive models impose an artificial ordering on inherently unordered point sets, forcing shape generation to proceed as a sequence of local predictions. This sequential bias emphasizes short-range continuity but undermines the model's capacity to capture long-range dependencies, hindering its ability to enforce global structural properties such as symmetry, consistent topology, and large-scale geometric regularities. Inspired by the level-of-detail (LOD) principle in shape modeling, we propose PointNSP, a coarse-to-fine generative framework that preserves global shape structure at low resolutions and progressively refines fine-grained geometry at higher scales through a next-scale prediction paradigm. This multi-scale factorization aligns the autoregressive objective with the permutation-invariant nature of point sets, enabling rich intra-scale interactions while avoiding brittle fixed orderings. Experiments on ShapeNet show that PointNSP establishes state-of-the-art (SOTA) generation quality for the first time within the autoregressive paradigm. In addition, it surpasses strong diffusion-based baselines in parameter, training, and inference efficiency. Finally, in dense generation with 8,192 points, PointNSP's advantages become even more pronounced, underscoring its scalability potential.
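The next-scale prediction idea in the abstract can be sketched as a simple loop: generate a coarse point set that fixes global structure, then repeatedly predict the next, denser scale conditioned on the previous one. This is a minimal, illustrative sketch only, not the authors' implementation: the scale schedule, the duplicate-and-jitter "predictor", and all function names here are hypothetical stand-ins for the learned model.

```python
import random

def upsample_predict(points, target_n, rng):
    """Hypothetical stand-in for the learned next-scale predictor.

    Duplicates the coarser points and jitters them; a real model would
    condition on the whole coarser set at once, which is what keeps the
    factorization permutation-invariant (no fixed point ordering).
    """
    dense = [points[i % len(points)] for i in range(target_n)]
    return [tuple(c + rng.gauss(0.0, 0.01) for c in p) for p in dense]

def generate(scales=(256, 1024, 8192), seed=0):
    """Coarse-to-fine generation over an assumed scale schedule."""
    rng = random.Random(seed)
    # Coarsest scale: fixes global structure (symmetry, topology).
    cloud = [tuple(rng.uniform(-1.0, 1.0) for _ in range(3))
             for _ in range(scales[0])]
    # Each autoregressive step predicts the next, denser scale.
    for n in scales[1:]:
        cloud = upsample_predict(cloud, n, rng)
    return cloud

cloud = generate()
print(len(cloud))  # 8192 points, matching the dense setting above
```

Because each step predicts an entire scale as a set rather than one point at a time, intra-scale interactions stay order-free while the scale-to-scale chain carries the autoregressive dependency.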
Problem

Research questions and friction points this paper is trying to address.

Addresses the quality gap between autoregressive and diffusion-based point cloud generation
Overcomes the sequential bias that undermines global structural properties (symmetry, topology)
Enables coarse-to-fine generation that preserves global shape structure
Innovation

Methods, ideas, or system contributions that make the work stand out.

Coarse-to-fine generative framework with multi-scale factorization
Next-scale prediction paradigm for progressive detail refinement
Aligns autoregressive objective with permutation-invariant point sets
Ziqiao Meng
National University of Singapore, CUHK
Generative Modeling · AI for Science · Geometric Deep Learning
Qichao Wang
Nanyang Technological University
Zhiyang Dou
University of Hong Kong
Zixing Song
University of Bristol
Zhipeng Zhou
Nanyang Technological University
Irwin King
The Chinese University of Hong Kong
social computing · machine learning · AI · graph neural networks · NLP
Peilin Zhao
Shanghai Jiao Tong University