🤖 AI Summary
This work addresses the challenge in inverse rendering where higher-order optimization is hindered by the computational intractability of second-order derivatives of rendering operators. We present the first systematic derivation and implementation of the Hessian matrices—and their vector products—for both rasterization and path tracing renderers. To improve accuracy and efficiency in higher-order derivative estimation, we propose a multidimensional joint importance sampling strategy integrated with convolutional differential modeling. Building upon this, we design an inverse rendering optimizer that combines Newton’s method with the conjugate gradient method. Experiments demonstrate that, compared to first-order gradient descent, our approach accelerates convergence by 2–5× across multiple inverse rendering tasks, while simultaneously enhancing optimization stability and parameter reconstruction fidelity.
📝 Abstract
We derive methods to compute higher order differentials (Hessians and Hessian-vector products) of the rendering operator. Our approach is based on importance sampling of a convolution that represents the differentials of rendering parameters and shows to be applicable to both rasterization and path tracing. We further suggest an aggregate sampling strategy to importance-sample multiple dimensions of one convolution kernel simultaneously. We demonstrate that this information improves convergence when used in higher-order optimizers such as Newton or Conjugate Gradient relative to a gradient descent baseline in several inverse rendering tasks.