Rethinking Input Domains in Physics-Informed Neural Networks via Geometric Compactification Mappings

📅 2026-02-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses gradient stiffness and ill-conditioning in physics-informed neural networks (PINNs) applied to multiscale partial differential equations. These pathologies often arise from a geometric mismatch between the input coordinates and local high-frequency solution structures, and they lead to poor convergence. To overcome this, the authors propose the GC-PINN framework, which introduces a differentiable geometric compactification mapping that adaptively aligns complex solution features by coupling the PDE's geometric structure with the spectral properties of the residual operator, without altering the original PINN architecture. Three input-domain mapping strategies are designed for representative scenarios involving periodic boundaries, far-field extensions, and local singularities. Experiments on one- and two-dimensional benchmark problems demonstrate that GC-PINN significantly improves training stability, convergence speed, and solution accuracy, while yielding a more uniform residual distribution.

📝 Abstract
Many complex physical systems are governed by multi-scale partial differential equations (PDEs) that exhibit both smooth low-frequency components and localized high-frequency structures. Existing physics-informed neural network (PINN) methods typically train with fixed coordinate-system inputs, where geometric misalignment with these structures induces gradient stiffness and ill-conditioning that hinder convergence. To address this issue, we introduce a mapping paradigm that reshapes the input coordinates through differentiable geometric compactification mappings and couples the geometric structure of PDEs with the spectral properties of residual operators. Based on this paradigm, we propose Geometric Compactification (GC)-PINN, a framework that introduces three mapping strategies for periodic boundaries, far-field scale expansion, and localized singular structures in the input domain, without modifying the underlying PINN architecture. Extensive empirical evaluation demonstrates that this approach yields more uniform residual distributions and higher solution accuracy on representative 1D and 2D PDEs, while improving training stability and convergence speed.
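The paper's exact mappings are not reproduced here, but the three input-domain strategies the abstract names can be sketched with standard choices: a circular embedding for periodic coordinates, a tanh compactification for far fields, and a logarithmic stretching near a known singularity. All function names and parameters below are illustrative assumptions, not the authors' implementation; in a real PINN these maps would be applied (differentiably) to the network inputs before the first layer.

```python
import numpy as np


def periodic_map(x, period=1.0):
    """Embed a periodic coordinate on [0, period) onto the unit circle,
    so the network input is continuous across the periodic boundary."""
    theta = 2.0 * np.pi * np.asarray(x) / period
    return np.stack([np.sin(theta), np.cos(theta)], axis=-1)


def far_field_map(x):
    """Compactify an unbounded coordinate onto (-1, 1), so far-field
    behaviour is represented at finite input scale."""
    return np.tanh(np.asarray(x))


def singularity_map(x, x0=0.0, eps=1e-2):
    """Stretch coordinates near a known singular point x0, so that a
    uniform grid in mapped space concentrates resolution around x0."""
    d = np.asarray(x) - x0
    return np.sign(d) * np.log1p(np.abs(d) / eps)
```

In a GC-PINN-style setup these would precede an otherwise unmodified PINN, which matches the abstract's claim that the base architecture is left unchanged; only the coordinates fed to it are reshaped.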
Problem

Research questions and friction points this paper is trying to address.

physics-informed neural networks
multi-scale PDEs
gradient stiffness
geometric misalignment
ill-conditioning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Geometric Compactification
Physics-Informed Neural Networks
Input Domain Mapping
Multi-scale PDEs
Residual Operator Spectral Alignment
Zhenzhen Huang
School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu, China
Haoyu Bian
Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China
Jiaquan Zhang
School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu, China
Yibei Liu
School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu, China
Kuien Liu
Institute of Software, Chinese Academy of Sciences, Beijing, China
Caiyan Qin
School of Robotics and Advanced Manufacture, Harbin Institute of Technology, Shenzhen, China
Guoqing Wang
University of Electronic Science and Technology of China
Computer Vision, Machine Learning, Pattern Recognition, Intelligent Systems
Yang Yang
Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China
Chaoning Zhang
Professor at UESTC (University of Electronic Science and Technology of China)
Computer Vision, LLM and VLM, GenAI and AIGC Detection