🤖 AI Summary
This work addresses the absence of higher-order cumulative integration over continuous domains in neural fields. We propose learning neural representations directly from functions to approximate arbitrary-order antiderivatives, enabling grid-free continuous integration. To this end, we design a novel neural architecture that embeds classical cumulative operators into continuous neural systems, supporting multidimensional inputs and learnable, adjustable integration orders, while unifying differential and integral operators within a single framework. To our knowledge, this is the first approach to achieve learnable, differentiable, and generalizable higher-order antiderivative representations in neural fields. Experiments demonstrate significant improvements in accuracy and generalization across neural radiance field reconstruction, antialiasing filtering, and volume rendering. Our method provides continuous neural representations with a core functionality analogous to discrete integral tables, enabling robust, parameter-efficient, and mathematically grounded integration over continuous domains.
📝 Abstract
Neural fields offer continuous, learnable representations that extend beyond traditional discrete formats in visual computing. We study the problem of learning neural representations of repeated antiderivatives directly from a function, a continuous analogue of summed-area tables. Although widely used in discrete domains, such cumulative schemes rely on grids, which prevents their application in continuous neural contexts. We introduce and analyze a range of neural methods for repeated integration, including both adaptations of prior work and novel designs. Our evaluation spans multiple input dimensionalities and integration orders, assessing both reconstruction quality and performance in downstream tasks such as filtering and rendering. These results enable integrating classical cumulative operators into modern neural systems and offer insights into learning tasks involving differential and integral operators.
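To make the summed-area-table analogy concrete, the following is a minimal discrete sketch in NumPy (function names are illustrative, not from the paper). A grid-based table supports constant-time box sums via inclusion-exclusion, and repeated cumulative sums are the discrete counterpart of the higher-order antiderivatives that the proposed neural fields represent continuously, without a grid:

```python
import numpy as np

def summed_area_table(img):
    """Discrete 2D summed-area table: S[i, j] = sum of img[:i+1, :j+1]."""
    return img.cumsum(axis=0).cumsum(axis=1)

def box_sum(S, r0, c0, r1, c1):
    """Sum of img[r0:r1+1, c0:c1+1] in O(1) via inclusion-exclusion."""
    total = S[r1, c1]
    if r0 > 0:
        total -= S[r0 - 1, c1]
    if c0 > 0:
        total -= S[r1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += S[r0 - 1, c0 - 1]
    return total

def repeated_cumsum(x, order):
    """Order-n cumulative sum along axis 0: the discrete analogue of an
    n-th antiderivative, which the neural representation would model
    continuously."""
    for _ in range(order):
        x = np.cumsum(x, axis=0)
    return x
```

The grid dependence is visible here: every query is tied to integer indices of a precomputed array, which is exactly what a continuous neural antiderivative representation avoids.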