GeoMaNO: Geometric Mamba Neural Operator for Partial Differential Equations

📅 2025-05-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing Transformer-based neural operators for solving partial differential equations (PDEs) suffer from quadratic computational complexity (O(N²)), imprecise geometric modeling, and sub-optimal performance even on regular grids. To address these limitations, this work introduces Mamba—a state-space model (SSM)-based architecture—into neural operators, yielding a framework that offers both geometric rigor and linear-time complexity. Specifically, the method models long-range dependencies in PDE dynamics via structured SSMs and incorporates coordinate-aware geometric embeddings that encode differential-geometric priors. Evaluated on canonical benchmarks—including Darcy flow and Navier–Stokes equations—the method reduces solution operator approximation error by up to 58.9% compared to state-of-the-art baselines, establishing an efficient, geometry-respecting approach to PDE learning.
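The linear-time claim rests on replacing attention with a recurrent state-space scan. The sketch below is a minimal diagonal SSM recurrence of the kind Mamba-style models build on; all shapes, parameter values, and the function name `ssm_scan` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def ssm_scan(x, A, B, C):
    """Run h_t = A * h_{t-1} + B * x_t, y_t = C . h_t over a length-N input.

    x: (N,) input sequence; A, B, C: (D,) diagonal SSM parameters.
    Cost is O(N * D): one state update per step, with no N x N attention matrix.
    """
    h = np.zeros(A.shape[0])
    y = np.empty(x.shape[0])
    for t, xt in enumerate(x):
        h = A * h + B * xt        # state update, O(D) per step
        y[t] = C @ h              # linear readout, O(D) per step
    return y

x = np.linspace(0.0, 1.0, 64)     # toy 1-D field sampled on a regular grid
A = np.full(8, 0.9)               # stable per-channel decay (|A| < 1)
B = np.ones(8)
C = np.ones(8) / 8
y = ssm_scan(x, A, B, C)
```

Because the hidden state `h` carries information forward across the whole sequence, the scan captures long-range dependencies while touching each grid point exactly once.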

📝 Abstract
The neural operator (NO) framework has emerged as a powerful tool for solving partial differential equations (PDEs). Recent NOs are dominated by the Transformer architecture, which gives NOs the capability to capture long-range dependencies in PDE dynamics. However, existing Transformer-based NOs suffer from quadratic complexity and lack geometric rigor, and thus achieve sub-optimal performance on regular grids. As a remedy, we propose the Geometric Mamba Neural Operator (GeoMaNO) framework, which empowers NOs with Mamba's modeling capability and linear complexity, plus geometric rigor. We evaluate GeoMaNO's performance on multiple standard, widely used PDE benchmarks, ranging from Darcy flow to Navier-Stokes problems. GeoMaNO improves on existing baselines in solution operator approximation by as much as 58.9%.
Problem

Research questions and friction points this paper is trying to address.

Overcomes quadratic complexity in Transformer-based neural operators
Addresses lack of geometric rigor in PDE solutions
Improves performance on regular grids for PDEs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Mamba's modeling capability for PDEs
Achieves linear complexity in computations
Incorporates geometric rigor in solutions
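The complexity advantage listed above can be made concrete with a back-of-envelope operation count. This is a hedged illustration only; the function names and the grid size are assumptions for the example, not figures from the paper.

```python
# A Transformer-style attention layer forms an N x N score matrix over the
# flattened grid, while an SSM-style scan performs one recurrent update per
# grid point. Counting the dominant interactions:

def attention_ops(n):
    return n * n      # pairwise token interactions: O(N^2)

def scan_ops(n):
    return n          # one state update per token: O(N)

n = 256 * 256                              # a 256 x 256 solution grid
ratio = attention_ops(n) // scan_ops(n)    # attention does n times more work
```

For a 256x256 grid this ratio is 65,536, which is why linear-complexity sequence models scale to resolutions where quadratic attention becomes impractical.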
Xi Han
Department of Computer Science, Stony Brook University
Jingwei Zhang
Department of Computer Science, Stony Brook University
Dimitris Samaras
Stony Brook University
Computer Vision · Machine Learning · Computer Graphics · Medical Imaging
Fei Hou
Institute of Software, Chinese Academy of Sciences
Computer Graphics
Hong Qin
Department of Computer Science, Stony Brook University