🤖 AI Summary
NeRF training suffers from heavy computational and memory overhead due to dense volume rendering, which hinders practical deployment. This paper identifies a long-tailed distribution linking NeRF reconstruction error to image content and proposes an extended supervision paradigm: forward rendering and supervision are applied only to adaptively selected sparse key pixels, while full-image error is estimated via error-distribution modeling and spatially consistent extrapolation. Departing from conventional per-pixel supervision, the method is compatible with mainstream acceleration frameworks, including Mip-NeRF and Instant-NGP. Extensive evaluation on multiple complex-scene datasets shows that, without compromising visual quality, the approach achieves a 52% memory reduction and 16% faster training.
📝 Abstract
Neural Radiance Fields (NeRF) have achieved remarkable success in creating immersive media representations through their exceptional reconstruction capabilities. However, the computational demands of dense forward passes and volume rendering during training continue to hinder real-world deployment. In this paper, we introduce Expansive Supervision, which reduces the time and memory costs of NeRF training by supervising only a partial selection of rays. Specifically, we observe that training errors follow a long-tail distribution correlated with image content. Based on this observation, our method renders only a small but crucial subset of pixels in each iteration and expands their values to estimate the error across the entire image. Compared with conventional supervision, our approach bypasses redundant rendering, yielding substantial reductions in both time and memory consumption. Experimental results demonstrate that integrating Expansive Supervision into existing state-of-the-art acceleration frameworks achieves 52% memory savings and 16% time savings while maintaining comparable visual quality.
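To make the core idea concrete, here is a minimal sketch of one supervision step, assuming a long-tailed per-pixel error map. This is not the authors' code: the function name, the `keep_ratio` parameter, and the simple scaling heuristic used to "expand" the sparse errors into a full-image estimate are all illustrative stand-ins for the paper's error-distribution modeling and spatially consistent extrapolation.

```python
import random

def expansive_supervision_step(per_pixel_error, keep_ratio=0.1):
    """Hypothetical sketch: supervise only the highest-error 'key' pixels
    and estimate the full-image error from that sparse subset.

    per_pixel_error: flat list of per-pixel reconstruction errors.
    keep_ratio: fraction of pixels actually rendered and supervised.
    """
    n = len(per_pixel_error)
    k = max(1, int(keep_ratio * n))
    # Key pixels sit in the head of the long-tail error distribution,
    # so sorting by error and keeping the top-k captures most error mass.
    order = sorted(range(n), key=lambda i: per_pixel_error[i], reverse=True)
    key_idx = order[:k]
    key_errors = [per_pixel_error[i] for i in key_idx]
    # "Expand": attribute at most the smallest key error to every
    # unrendered pixel, a crude stand-in for the paper's extrapolation.
    estimated_total = sum(key_errors) + (n - k) * min(key_errors)
    return key_idx, estimated_total

# Synthetic long-tailed error map: most pixels are easy, a few are hard.
random.seed(0)
errors = [random.paretovariate(3.0) for _ in range(64 * 64)]
key_idx, est = expansive_supervision_step(errors, keep_ratio=0.1)
```

In a real training loop, only the rays behind `key_idx` would go through the forward pass and volume rendering, which is where the reported time and memory savings come from.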