Imposing Boundary Conditions on Neural Operators via Learned Function Extensions

📅 2026-02-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing neural operators struggle to accurately capture the strong sensitivity of partial differential equation (PDE) solutions to complex, non-homogeneous boundary conditions. This work proposes a general framework that embeds arbitrary boundary conditions, including mixed-type, component-wise, and multi-segment ones, into domain-to-domain neural operators by learning a mapping from boundary data to a pseudo-extension function defined over the entire domain. The approach enables joint modeling of input functions and boundary conditions without modifying existing architectures or tuning hyperparameters per dataset. Evaluated across 18 challenging datasets covering Poisson, linear elasticity, and hyperelasticity problems, the method consistently outperforms current baselines and achieves state-of-the-art accuracy.
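
Read schematically (in our own notation, which may differ from the paper's), the construction composes a learned boundary-to-domain extension map with a standard domain-to-domain operator:

```latex
% Notation sketch (ours, not the paper's): boundary data g on \partial\Omega
% is lifted to a pseudo-extension \tilde{e} defined on all of \Omega, which a
% standard operator \mathcal{N}_\theta consumes jointly with the domain input a.
\mathcal{E}_\theta : g \longmapsto \tilde{e}, \qquad \tilde{e} : \Omega \to \mathbb{R}^{c},
\qquad \mathcal{G}_\theta(a, g) = \mathcal{N}_\theta\big(a,\, \mathcal{E}_\theta(g)\big)
```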

📝 Abstract
Neural operators have emerged as powerful surrogates for the solution of partial differential equations (PDEs), yet their ability to handle general, highly variable boundary conditions (BCs) remains limited. Existing approaches often fail when the solution operator exhibits strong sensitivity to boundary forcings. We propose a general framework for conditioning neural operators on complex non-homogeneous BCs through function extensions. Our key idea is to map boundary data to latent pseudo-extensions defined over the entire spatial domain, enabling any standard operator learning architecture to consume boundary information. The resulting operator, coupled with an arbitrary domain-to-domain neural operator, can learn rich dependencies on complex BCs and input domain functions at the same time. To benchmark this setting, we construct 18 challenging datasets spanning Poisson, linear elasticity, and hyperelasticity problems, with highly variable, mixed-type, component-wise, and multi-segment BCs on diverse geometries. Our approach achieves state-of-the-art accuracy, outperforming baselines by large margins, while requiring no hyperparameter tuning across datasets. Overall, our results demonstrate that learning boundary-to-domain extensions is an effective and practical strategy for imposing complex BCs in existing neural operator frameworks, enabling accurate and robust scientific machine learning models for a broader range of PDE-governed problems.
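
To make the pipeline concrete, here is a minimal PyTorch-style sketch of the idea described above. Everything in it, the module names (`BoundaryToExtension`, `apply_with_bcs`), the mean-pooled set encoder, and the tensor shapes, is our own illustrative assumption rather than the paper's implementation.

```python
# Hedged sketch (not the authors' code): condition a neural operator on
# boundary data by learning a pseudo-extension over the whole domain.
import torch
import torch.nn as nn

class BoundaryToExtension(nn.Module):
    """Map sampled boundary data to a pseudo-extension field on the full grid.

    Hypothetical design: encode (boundary value, boundary coordinate) pairs
    into a permutation-invariant latent code, then decode that code at every
    domain point so the output is a function defined over the entire domain.
    """
    def __init__(self, bc_dim, coord_dim, latent_dim=64, out_channels=1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(bc_dim + coord_dim, latent_dim), nn.GELU(),
            nn.Linear(latent_dim, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + coord_dim, latent_dim), nn.GELU(),
            nn.Linear(latent_dim, out_channels),
        )

    def forward(self, bc_values, bc_coords, grid_coords):
        # bc_values: (B, Nb, bc_dim), bc_coords: (B, Nb, coord_dim),
        # grid_coords: (B, Ng, coord_dim) -> returns (B, Ng, out_channels)
        z = self.encoder(torch.cat([bc_values, bc_coords], dim=-1))
        z = z.mean(dim=1, keepdim=True).expand(-1, grid_coords.shape[1], -1)
        return self.decoder(torch.cat([z, grid_coords], dim=-1))

def apply_with_bcs(base_operator, extension_net, a,
                   bc_values, bc_coords, grid_coords):
    """Concatenate the learned pseudo-extension with the input function `a`,
    so any off-the-shelf domain-to-domain operator sees the BC information."""
    ext = extension_net(bc_values, bc_coords, grid_coords)  # (B, Ng, C_ext)
    return base_operator(torch.cat([a, ext], dim=-1))       # unchanged operator call
```

The mean-pooled encoder above makes the latent code invariant to the ordering of boundary samples, which is one convenient (but assumed) way to handle boundaries sampled at variable resolution; the paper's actual extension map may be parameterized quite differently.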
Problem

Research questions and friction points this paper is trying to address.

boundary conditions
neural operators
partial differential equations
function extensions
scientific machine learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

neural operators
boundary conditions
function extensions
scientific machine learning
PDE surrogates
Sepehr Mousavi
Department of Mechanical and Process Engineering, ETH Zurich, Switzerland
Siddhartha Mishra
Professor of Applied Mathematics, ETH Zurich, Switzerland
Applied Mathematics, Numerical Analysis, Scientific Computing, Computational Fluid and Plasma Dynamics, Applied PDEs
L. Lorenzis
Department of Mechanical and Process Engineering, ETH Zurich, Switzerland