HyPINO: Multi-Physics Neural Operators via HyperPINNs and the Method of Manufactured Solutions

📅 2025-09-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
Zero-shot generalization for multi-physics parametric partial differential equations (PDEs) remains challenging due to disparities across equation types (elliptic/hyperbolic/parabolic), geometries, and boundary conditions. Method: We propose HyPINO, a multi-physics neural operator enabling zero-shot generalization across PDE classes, domains, and boundary conditions without task-specific fine-tuning. Its core components are: (1) a Swin Transformer-based hypernetwork that generates physics-informed neural networks; (2) hybrid training that combines analytically derived labels from manufactured solutions with physics-constrained unsupervised losses; and (3) an error-feedback iterative refinement mechanism for high-accuracy, forward-only inference. Results: HyPINO achieves strong zero-shot accuracy on seven benchmark tasks, outperforming U-Net, Poseidon, and PINO; iterative refinement further reduces error on six benchmarks, yielding over a 100x gain in average L₂ loss in the best case. HyPINO-generated PINNs also serve as high-quality initializations that let subsequent fine-tuning converge faster and to lower final error.
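The hypernetwork idea in the summary, one network emitting the weights of a target PINN, can be sketched minimally. The linear map below is a stand-in for the paper's Swin Transformer encoder; all shapes, sizes, and names here are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target "PINN": a tiny MLP u(x, y) whose weights are *generated*, not trained.
SIZES = [(2, 16), (16, 1)]                 # (in, out) per layer
N_W = sum(i * o + o for i, o in SIZES)     # total weights + biases

def unpack(theta):
    """Split a flat weight vector into (W, b) pairs for each layer."""
    params, k = [], 0
    for i, o in SIZES:
        W = theta[k:k + i * o].reshape(i, o); k += i * o
        b = theta[k:k + o]; k += o
        params.append((W, b))
    return params

def target_net(theta, x):
    """Evaluate the generated PINN at query points x of shape (n, 2)."""
    h = x
    for j, (W, b) in enumerate(unpack(theta)):
        h = h @ W + b
        if j < len(SIZES) - 1:
            h = np.tanh(h)
    return h

# Hypernetwork: here just a random linear map from a PDE-parameter
# embedding z to the flat weight vector theta. In HyPINO this role is
# played by a Swin Transformer over the PDE's field parametrization.
D_Z = 8
H = rng.normal(scale=0.1, size=(D_Z, N_W))

z = rng.normal(size=D_Z)       # embedding of one PDE instance
theta = z @ H                  # generated PINN weights
x = rng.uniform(size=(5, 2))   # five query points in 2D
u = target_net(theta, x)
print(u.shape)                 # (5, 1)
```

Because inference is a single forward pass of the hypernetwork followed by cheap evaluations of the small generated network, each new PDE instance costs no gradient-based training at all.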

📝 Abstract
We present HyPINO, a multi-physics neural operator designed for zero-shot generalization across a broad class of parametric PDEs without requiring task-specific fine-tuning. Our approach combines a Swin Transformer-based hypernetwork with mixed supervision: (i) labeled data from analytical solutions generated via the Method of Manufactured Solutions (MMS), and (ii) unlabeled samples optimized using physics-informed objectives. The model maps PDE parametrizations to target Physics-Informed Neural Networks (PINNs) and can handle linear elliptic, hyperbolic, and parabolic equations in two dimensions with varying source terms, geometries, and mixed Dirichlet/Neumann boundary conditions, including interior boundaries. HyPINO achieves strong zero-shot accuracy on seven benchmark problems from PINN literature, outperforming U-Nets, Poseidon, and Physics-Informed Neural Operators (PINO). Further, we introduce an iterative refinement procedure that compares the physics of the generated PINN to the requested PDE and uses the discrepancy to generate a "delta" PINN. Summing their contributions and repeating this process forms an ensemble whose combined solution progressively reduces the error on six benchmarks and achieves over 100x gain in average $L_2$ loss in the best case, while retaining forward-only inference. Additionally, we evaluate the fine-tuning behavior of PINNs initialized by HyPINO and show that they converge faster and to lower final error than both randomly initialized and Reptile-meta-learned PINNs on five benchmarks, performing on par on the remaining two. Our results highlight the potential of this scalable approach as a foundation for extending neural operators toward solving increasingly complex, nonlinear, and high-dimensional PDE problems with significantly improved accuracy and reduced computational cost.
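The Method of Manufactured Solutions behind the labeled data can be illustrated in a few lines: pick an analytic solution, push it through the PDE operator to derive the matching source term, and the pair becomes an exact supervision label. A minimal sketch for a 2D Poisson problem (the specific choice of u here is illustrative, not from the paper):

```python
import sympy as sp

# Method of Manufactured Solutions for -Δu = f:
# choose u analytically, then *derive* the f that makes it exact.
x, y = sp.symbols("x y")
u = sp.sin(sp.pi * x) * sp.sin(sp.pi * y)          # manufactured solution
f = sp.simplify(-(sp.diff(u, x, 2) + sp.diff(u, y, 2)))
# f = 2*pi**2*sin(pi*x)*sin(pi*y)

# The pair (f, boundary values of u) is a labeled training sample:
# a network's prediction can be compared against u pointwise.
u_num = sp.lambdify((x, y), u)
f_num = sp.lambdify((x, y), f)
import math
assert abs(f_num(0.3, 0.7) - 2 * math.pi**2 * u_num(0.3, 0.7)) < 1e-12
```

Sampling many such (u, f) pairs gives exact labels at no simulation cost, which is what lets the hybrid training mix analytic supervision with purely physics-informed losses on unlabeled PDE instances.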
Problem

Research questions and friction points this paper is trying to address.

Solving parametric PDEs with zero-shot generalization across diverse physical scenarios
Handling varying geometries and boundary conditions without task-specific fine-tuning
Improving accuracy and reducing computational cost for complex nonlinear PDEs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses hypernetwork with Swin Transformer for parametric PDE mapping
Combines manufactured solutions with physics-informed loss supervision
Implements iterative refinement with delta PINNs for error reduction
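The delta-PINN mechanism in the last bullet is classical iterative refinement: re-apply an imperfect solver to the residual of the current solution and sum the corrections. A toy numpy sketch, with a linear system standing in for the PDE and a few damped Richardson steps standing in for the hypernetwork-generated PINN (all names and constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# "PDE": a well-conditioned linear system A u = f standing in for a
# discretized operator; the exact solution is kept for reference only.
n = 20
A = np.eye(n) + 0.1 * rng.normal(size=(n, n))
f = rng.normal(size=n)
u_exact = np.linalg.solve(A, f)

def imperfect_solver(rhs, steps=5):
    """Stand-in for the generated PINN: a few damped Richardson
    iterations, accurate but not exact."""
    u = np.zeros_like(rhs)
    for _ in range(steps):
        u = u + 0.5 * (rhs - A @ u)
    return u

# Iterative refinement: compare the physics (residual) of the current
# ensemble to the requested right-hand side, solve for a "delta"
# correction, and add it to the running sum.
u = imperfect_solver(f)
errors = [np.linalg.norm(u - u_exact)]
for _ in range(6):
    residual = f - A @ u                  # discrepancy in the physics
    u = u + imperfect_solver(residual)    # add the delta solution
    errors.append(np.linalg.norm(u - u_exact))

assert errors[-1] < 1e-3 * errors[0]      # error drops by orders of magnitude
```

Each round stays forward-only: the residual is fed back through the same solver, so the ensemble shrinks the error without any gradient-based fine-tuning, mirroring the paper's reported multi-fold L₂ reduction.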