🤖 AI Summary
Standard DeepONets struggle to capture boundary layers, sharp gradients, and non-periodic structures when learning operators for partial differential equations (PDEs) on bounded domains. To address this, we propose a spectral embedding backbone network that introduces non-periodic Chebyshev polynomials as coordinate embeddings—marking the first use of such embeddings in DeepONet architectures. This design incorporates problem-aware inductive biases tailored for Dirichlet and Neumann boundary value problems, significantly enhancing the model’s capacity to resolve high-frequency oscillations and localized boundary features. Evaluated across five canonical PDE benchmarks, our method achieves an average 30–40% reduction in relative L² error compared to standard DeepONet and outperforms Fourier-based embedding variants across all tasks—particularly excelling in non-periodic geometric settings where conventional spectral embeddings fail. The approach thus advances operator learning for bounded-domain PDEs by unifying spectral approximation theory with neural operator architecture design.
📝 Abstract
Deep Operator Networks (DeepONets) have become a central tool in data-driven operator learning, providing flexible surrogates for nonlinear mappings arising in partial differential equations (PDEs). However, the standard trunk design based on fully connected layers acting on raw spatial or spatiotemporal coordinates struggles to represent sharp gradients, boundary layers, and non-periodic structures commonly found in PDEs posed on bounded domains with Dirichlet or Neumann boundary conditions. To address these limitations, we introduce the Spectral-Embedded DeepONet (SEDONet), a new DeepONet variant in which the trunk is driven by a fixed Chebyshev spectral dictionary rather than coordinate inputs. This non-periodic spectral embedding provides a principled inductive bias tailored to bounded domains, enabling the learned operator to capture fine-scale non-periodic features that are difficult for Fourier or MLP trunks to represent. SEDONet is evaluated on a suite of PDE benchmarks including 2D Poisson, 1D Burgers, 1D advection-diffusion, Allen-Cahn dynamics, and the Lorenz-96 chaotic system, covering elliptic, parabolic, advective, and multiscale temporal phenomena, all of which can be viewed as canonical problems in computational mechanics. Across all datasets, SEDONet consistently achieves the lowest relative L2 errors among DeepONet, FEDONet, and SEDONet, with average improvements of about 30-40% over the baseline DeepONet and meaningful gains over Fourier-embedded variants on non-periodic geometries. Spectral analyses further show that SEDONet more accurately preserves high-frequency and boundary-localized features, demonstrating the value of Chebyshev embeddings in non-periodic operator learning. The proposed architecture offers a simple, parameter-neutral modification to DeepONets, delivering a robust and efficient spectral framework for surrogate modeling of PDEs on bounded domains.