φ-DeepONet: A Discontinuity-Capturing Neural Operator

📅 2026-04-09
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge that conventional neural operators struggle to model strong or weak discontinuities in inputs or outputs arising from material interfaces. To overcome this limitation, the authors propose φ-DeepONet, which for the first time integrates one-hot encoded domain decomposition into a neural operator architecture. The method employs a multi-branch network to handle discontinuous inputs and fuses interface-aware one-hot encodings with spatial coordinates in the trunk network to construct a nonlinear latent embedding capable of capturing output discontinuities. A joint loss function incorporating physical laws and interface information is also introduced. By relaxing the continuity assumptions inherent in classical neural operators, φ-DeepONet achieves accurate and stable predictions across multiple one- and two-dimensional benchmark problems, demonstrating robust performance even in the presence of strong interfacial discontinuities.
📝 Abstract
We present φ-DeepONet, a physics-informed neural operator designed to learn mappings between function spaces that may contain discontinuities or exhibit non-smooth behavior. Classical neural operators are based on the universal approximation theorem, which assumes that both the operator and the functions it acts on are continuous. However, many scientific and engineering problems involve naturally discontinuous input fields, as well as strong and weak discontinuities in the output fields caused by material interfaces. In φ-DeepONet, discontinuities in the input are handled by multiple branch networks, while discontinuities in the output are learned through a nonlinear latent embedding of the interface. This embedding is constructed from a *one-hot* representation of the domain decomposition, combined with the spatial coordinates in a modified trunk network. The outputs of the branch and trunk networks are then combined through a dot product to produce the final solution, which is trained with a physics- and interface-informed loss function. We evaluate φ-DeepONet on several one- and two-dimensional benchmark problems and demonstrate that it delivers accurate and stable predictions even in the presence of strong interface-driven discontinuities.
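The abstract's forward pass can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the layer sizes, the two-subdomain setup, and the simple summation of branch outputs are illustrative assumptions, and the physics- and interface-informed loss is omitted. It only shows the structural idea: per-subdomain branch networks for the discontinuous input, and a trunk network fed the spatial coordinate concatenated with a one-hot subdomain indicator, so the latent embedding can jump across the interface.

```python
import numpy as np

def mlp(params, x):
    # Plain tanh MLP forward pass over a list of (W, b) layers.
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

def init_mlp(sizes, rng):
    # Random (untrained) weights; a real model would be trained
    # with the paper's physics- and interface-informed loss.
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

rng = np.random.default_rng(0)
p = 16   # latent dimension shared by branch and trunk (assumed)
m = 10   # sensor points per subdomain (assumed)

# One branch network per subdomain: each encodes the input function
# sampled at m sensor points restricted to that subdomain.
branch1 = init_mlp([m, 32, p], rng)
branch2 = init_mlp([m, 32, p], rng)

# Trunk network: spatial coordinate x concatenated with a one-hot
# subdomain indicator (two subdomains -> input dim 1 + 2 = 3).
trunk = init_mlp([3, 32, p], rng)

def phi_deeponet(u1, u2, x, subdomain):
    """Predict G(u)(x) at query point x lying in `subdomain` (0 or 1)."""
    b = mlp(branch1, u1) + mlp(branch2, u2)        # fuse branch outputs
    onehot = np.eye(2)[subdomain]                  # interface-aware encoding
    t = mlp(trunk, np.concatenate([[x], onehot]))  # nonlinear latent embedding
    return b @ t                                   # dot-product fusion

# Example: a piecewise input sampled on each subdomain. Predictions on
# either side of the interface can differ discontinuously because the
# trunk embedding changes with the one-hot indicator.
u1 = np.sin(np.linspace(0.0, 0.5, m))
u2 = 1.0 + np.cos(np.linspace(0.5, 1.0, m))
left = phi_deeponet(u1, u2, 0.49, subdomain=0)
right = phi_deeponet(u1, u2, 0.51, subdomain=1)
```

Because the one-hot indicator flips between neighboring query points across the interface, the trunk output (and hence the prediction) is not constrained to be continuous there, which is the relaxation of the continuity assumption the paper describes.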
Problem

Research questions and friction points this paper is trying to address.

discontinuities
neural operators
function spaces
material interfaces
non-smooth behavior
Innovation

Methods, ideas, or system contributions that make the work stand out.

neural operator
discontinuity capturing
physics-informed learning
domain decomposition
interface embedding
🔎 Similar Papers
2024-09-20 · World Scientific Annual Review of Artificial Intelligence · Citations: 1
Sumanta Roy
Department of Civil and Systems Engineering, Johns Hopkins University
Stephen T. Castonguay
Computational Engineering Division, Lawrence Livermore National Laboratory
Pratanu Roy
Lawrence Livermore National Laboratory
Scientific Machine Learning · High Performance Computing · Computational Fluid Dynamics
Michael D. Shields
Johns Hopkins University
Uncertainty Quantification · Computational Mechanics · Stochastic Processes