Implicit differentiation with second-order derivatives and benchmarks in finite-element-based differentiable physics

📅 2025-05-19
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the longstanding absence of second-order derivatives (Hessians) for implicit functions in differentiable finite element physics. We present the first systematic derivation and implementation of an implicit Hessian-vector-product algorithm built from primitive automatic differentiation operators, making exact second-order information available for PDE-constrained optimization. The method uses Jacobian-vector and vector-Jacobian products as its core primitives, which allows Newton-CG and L-BFGS-B optimizers to be integrated seamlessly into finite element solvers. The result is a verifiable and scalable second-order implicit differentiation paradigm that closes a theoretical and practical gap in differentiable physics concerning Hessian computation. The approach is validated on four 2D/3D benchmark problems spanning both linear and nonlinear regimes, with verified numerical accuracy. Equipped with exact Hessians, Newton-CG accelerates convergence by 2–5× on traction identification and shape optimization tasks.
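The summary's core construction, a Hessian-vector product assembled from the two AD primitives (JVP and VJP), can be sketched in JAX. The quadratic objective below is a hypothetical stand-in; in the paper this composition is threaded through an implicit finite element solve rather than an explicit function:

```python
import jax
import jax.numpy as jnp

# Hypothetical stand-in objective: a quadratic whose exact Hessian (A) is
# known, so the HVP can be checked by hand. The paper applies the same
# composition to an implicitly defined FE solution.
A = jnp.array([[3.0, 1.0], [1.0, 2.0]])

def objective(theta):
    return 0.5 * theta @ A @ theta

def hvp(f, theta, v):
    # Forward-over-reverse: a JVP of the gradient (VJP) function.
    # This is the primitive composition the summary describes.
    return jax.jvp(jax.grad(f), (theta,), (v,))[1]

theta = jnp.array([1.0, -2.0])
v = jnp.array([0.5, 1.0])
print(hvp(objective, theta, v))  # matches A @ v = [2.5, 2.5]
```

Forward-over-reverse is the usual choice here because the gradient is computed once in reverse mode and the directional derivative of that gradient costs roughly one extra forward pass per vector `v`.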

📝 Abstract
Differentiable programming is revolutionizing computational science by enabling automatic differentiation (AD) of numerical simulations. While first-order gradients are well-established, second-order derivatives (Hessians) for implicit functions in finite-element-based differentiable physics remain underexplored. This work bridges this gap by deriving and implementing a framework for implicit Hessian computation in PDE-constrained optimization problems. We leverage primitive AD tools (Jacobian-vector product/vector-Jacobian product) to build an algorithm for Hessian-vector products and validate the accuracy against finite difference approximations. Four benchmarks spanning linear/nonlinear, 2D/3D, and single/coupled-variable problems demonstrate the utility of second-order information. Results show that the Newton-CG method with exact Hessians accelerates convergence for nonlinear inverse problems (e.g., traction force identification, shape optimization), while the L-BFGS-B method suffices for linear cases. Our work provides a robust foundation for integrating second-order implicit differentiation into differentiable physics engines, enabling faster and more reliable optimization.
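The abstract's optimizer pairing can be illustrated with SciPy, whose Newton-CG method accepts a Hessian-vector-product callback directly. The misfit below is a hypothetical stand-in for a PDE-constrained inverse problem; in the paper's setting `hessp` would be the implicit HVP assembled from JVP/VJP calls through the FE solver:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical inverse-problem stand-in: recover `target` by minimizing a
# quadratic misfit with known Hessian H.
H = np.array([[4.0, 1.0], [1.0, 3.0]])
target = np.array([1.0, -1.0])

def misfit(p):
    r = p - target
    return 0.5 * r @ H @ r

def grad(p):
    return H @ (p - target)

def hessp(p, v):
    # In the paper this would be the implicit Hessian-vector product;
    # here it is analytic for the stand-in quadratic.
    return H @ v

res = minimize(misfit, x0=np.zeros(2), jac=grad, hessp=hessp,
               method="Newton-CG")
print(res.x)  # converges to `target`
```

Swapping `method="Newton-CG"` for `method="L-BFGS-B"` (and dropping `hessp`) reproduces the first-order baseline the abstract says suffices for the linear cases.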
Problem

Research questions and friction points this paper is trying to address.

How can second-order derivatives (Hessians) of implicit functions be computed in finite-element-based differentiable physics, where only first-order gradients are well established?
Can implicit Hessian-vector products be built from primitive AD operators and verified for numerical accuracy?
Does exact second-order information actually accelerate convergence in nonlinear PDE-constrained inverse problems?
Innovation

Methods, ideas, or system contributions that make the work stand out.

First systematic derivation and implementation of implicit Hessian-vector products for PDE-constrained optimization
Algorithm built entirely from primitive AD operators (JVP/VJP), enabling drop-in use of Newton-CG and L-BFGS-B in finite element solvers
Accuracy verified against finite-difference approximations across four linear/nonlinear, 2D/3D benchmarks
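The finite-difference validation mentioned above amounts to comparing an HVP against a central difference of the gradient along the same direction. A minimal sketch, using an analytic gradient and HVP as hypothetical stand-ins for the implicit ones:

```python
import numpy as np

# Stand-in gradient and HVP for the objective 0.5 * theta^T A theta;
# in the paper these would come from the implicit AD pipeline.
A = np.array([[3.0, 1.0], [1.0, 2.0]])

def grad(theta):
    return A @ theta

def hvp(theta, v):
    return A @ v

theta = np.array([1.0, -2.0])
v = np.array([0.5, 1.0])
eps = 1e-5

# Central difference of the gradient along v approximates H @ v.
fd = (grad(theta + eps * v) - grad(theta - eps * v)) / (2 * eps)
print(np.max(np.abs(fd - hvp(theta, v))))  # near machine precision here
```

For a genuinely nonlinear objective the discrepancy would be O(eps²) rather than near zero, which is the tolerance such a check should use.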