User-in-the-Loop View Sampling with Error Peaking Visualization

📅 2025-06-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing AR novel-view synthesis methods rely on manual 3D annotations to align captured images, imposing high cognitive load on users and, due to restrictive underlying sampling theory, supporting only small, pre-defined sampling regions, which severely limits scene exploration. This paper proposes an active supplementary-capture framework based on error-peaking visualization: it estimates local light-field reconstruction error in real time and highlights high-error regions on a mobile AR interface, enabling intuitive, annotation-free acquisition of critical viewpoints. The method relaxes conventional sampling limitations, enabling large-scale, flexible exploration and efficient sparse sampling. Experiments demonstrate that the approach reduces the required number of views by 32% on average compared to baseline methods, alleviates user frustration, and improves both rendering quality and convergence efficiency in mobile novel-view synthesis and large-scale radiance field reconstruction (e.g., 3D Gaussian splatting).
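The core loop described above (estimate local reconstruction error, then point the user at the worst region) can be sketched with a leave-one-out proxy: rebuild each captured view from its neighbours via a crude light-field blend and flag the viewpoint with peak error as the next capture target. This is a minimal illustration, not the paper's method; the function names, the two-view inverse-distance blend, and the leave-one-out error proxy are all simplifying assumptions.

```python
import numpy as np

def reconstruct(view_images, view_positions, target_pos):
    """Crude local light-field stand-in: blend the two nearest captured
    views with inverse-distance weights. (The paper's local light-field
    reconstruction is more sophisticated; this is only illustrative.)"""
    dists = np.linalg.norm(view_positions - target_pos, axis=1)
    nearest = np.argsort(dists)[:2]
    weights = 1.0 / (dists[nearest] + 1e-8)
    weights /= weights.sum()
    # Weighted blend of the two nearest view images.
    return np.tensordot(weights, view_images[nearest], axes=1)

def peak_error_view(view_images, view_positions):
    """Leave-one-out proxy for reconstruction error: rebuild each view
    from the remaining ones and return the index with the largest mean
    squared error -- the region an AR overlay would highlight for the
    next capture."""
    n = len(view_images)
    errors = []
    for i in range(n):
        mask = np.arange(n) != i
        recon = reconstruct(view_images[mask], view_positions[mask],
                            view_positions[i])
        errors.append(float(np.mean((recon - view_images[i]) ** 2)))
    return int(np.argmax(errors)), errors
```

In a capture session this loop would repeat: the user photographs the highlighted region, the new view joins the set, and the error peak is re-estimated until it falls below a quality threshold.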

📝 Abstract
Augmented reality (AR) provides ways to visualize missing view samples for novel view synthesis. Existing approaches present 3D annotations for new view samples and task users with taking images by aligning the AR display. This data collection task is known to be mentally demanding and limits capture to small, pre-defined areas due to the ideal but restrictive underlying sampling theory. To free users from 3D annotations and limited scene exploration, we propose using locally reconstructed light fields and visualizing errors to be removed by inserting new views. Our results show that the error-peaking visualization is less invasive, reduces disappointment in final results, and is satisfactory with fewer view samples in our mobile view synthesis system. We also show that our approach can contribute to recent radiance field reconstruction for larger scenes, such as 3D Gaussian splatting.
Problem

Research questions and friction points this paper is trying to address.

Reduces mental demand in AR view sampling
Expands capture areas beyond pre-defined limits
Improves view synthesis with error visualization
Innovation

Methods, ideas, or system contributions that make the work stand out.

User-in-the-Loop view sampling with visualization
Locally reconstructed light fields for error removal
Error-peaking visualization reduces needed view samples