Riemann Tensor Neural Networks: Learning Conservative Systems with Physics-Constrained Networks

📅 2025-03-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge that conventional neural networks cannot rigorously satisfy mass and momentum conservation laws from continuum mechanics. To this end, we propose a novel architecture enforcing strong physics-based constraints. Methodologically, we introduce divergence-free symmetric tensors (DFSTs) as hard inductive biases—first of their kind—and design the Riemannian Tensor Neural Network (RTNN), whose architecture is grounded in the structural properties of the Riemann curvature tensor. By integrating differential-geometric constraints with universal approximation theory, RTNN guarantees exact conservation up to machine precision. We theoretically prove that RTNN can approximate any smooth DFST to arbitrary accuracy. Empirically, RTNN achieves state-of-the-art performance on surrogate modeling tasks for diverse conservative PDEs, significantly outperforming existing physics-informed neural networks (PINNs) in both fidelity and generalization.

Technology Category

Application Category

📝 Abstract
Divergence-free symmetric tensors (DFSTs) are fundamental in continuum mechanics, encoding conservation laws such as mass and momentum conservation. We introduce Riemann Tensor Neural Networks (RTNNs), a novel neural architecture that inherently satisfies the DFST condition to machine precision, providing a strong inductive bias for enforcing these conservation laws. We prove that RTNNs can approximate any sufficiently smooth DFST with arbitrary precision and demonstrate their effectiveness as surrogates for conservative PDEs, achieving improved accuracy across benchmarks. This work is the first to use DFSTs as an inductive bias in neural PDE surrogates and to explicitly enforce the conservation of both mass and momentum within a physics-constrained neural architecture.
Problem

Research questions and friction points this paper is trying to address.

Learning conservative systems using physics-constrained neural networks.
Enforcing mass and momentum conservation in neural PDE surrogates.
Approximating divergence-free symmetric tensors with high precision.
Innovation

Methods, ideas, or system contributions that make the work stand out.

RTNNs enforce DFST conditions precisely
RTNNs approximate smooth DFSTs accurately
RTNNs conserve mass and momentum inherently
🔎 Similar Papers
No similar papers found.