Disentangled Latent Dynamics Manifold Fusion for Solving Parameterized PDEs

📅 2026-03-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing methods for solving parametric partial differential equations struggle to simultaneously achieve parameter generalization and temporal extrapolation, often suffering from optimization instability and inadequate dynamic modeling. This work proposes a physics-informed decoupled framework that maps parameters into continuous latent embeddings to initialize a conditional neural ODE, which models spatiotemporal dynamics in the latent space. A shared decoder then fuses these dynamic trajectories via a manifold fusion mechanism, eliminating the need for costly self-decoding during inference while preserving solution continuity and consistency. By uniquely integrating continuous parameter embeddings with neural ODEs and introducing manifold fusion, the method significantly outperforms existing approaches across multiple benchmark problems, achieving state-of-the-art performance in accuracy, parameter generalization, and long-term temporal extrapolation.

📝 Abstract
Generalizing neural surrogate models across different PDE parameters remains difficult because changes in PDE coefficients often make learning harder and optimization less stable. The problem becomes even more severe when the model must also predict beyond the training time range. Existing methods usually cannot handle parameter generalization and temporal extrapolation at the same time. Standard parameterized models treat time as just another input and therefore fail to capture intrinsic dynamics, while recent continuous-time latent methods often rely on expensive test-time auto-decoding for each instance, which is inefficient and can disrupt continuity across the parameterized solution space. To address this, we propose Disentangled Latent Dynamics Manifold Fusion (DLDMF), a physics-informed framework that explicitly separates space, time, and parameters. Instead of unstable auto-decoding, DLDMF maps PDE parameters directly to a continuous latent embedding through a feed-forward network. This embedding initializes and conditions a latent state whose evolution is governed by a parameter-conditioned Neural ODE. We further introduce a dynamic manifold fusion mechanism that uses a shared decoder to combine spatial coordinates, parameter embeddings, and time-evolving latent states to reconstruct the corresponding spatiotemporal solution. By modeling prediction as latent dynamic evolution rather than static coordinate fitting, DLDMF reduces interference between parameter variation and temporal evolution while preserving a smooth and coherent solution manifold. As a result, it performs well on unseen parameter settings and in long-term temporal extrapolation. Experiments on several benchmark problems show that DLDMF consistently outperforms state-of-the-art baselines in accuracy, parameter generalization, and extrapolation robustness.
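The pipeline described in the abstract (parameter embedding → parameter-conditioned latent ODE → shared fusion decoder) can be sketched in miniature. This is an illustrative, untrained toy with random weights, not the authors' implementation: all function names and dimensions are assumptions, and a simple Euler step stands in for a proper Neural ODE solver.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Build a random-weight MLP with tanh hidden layers; returns its forward pass."""
    Ws = [rng.normal(0, 0.3, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
    bs = [np.zeros(n) for n in sizes[1:]]
    def forward(x):
        for W, b in zip(Ws[:-1], bs[:-1]):
            x = np.tanh(x @ W + b)
        return x @ Ws[-1] + bs[-1]
    return forward

# 1) Feed-forward map from PDE parameters to a continuous latent embedding
#    (replacing per-instance test-time auto-decoding).
embed = mlp([2, 32, 8])            # mu in R^2 -> embedding e in R^8

# 2) Parameter-conditioned latent dynamics dz/dt = f([z, e]),
#    integrated here with explicit Euler as a stand-in for an ODE solver.
f = mlp([8 + 8, 32, 8])
def evolve(z0, e, t_grid):
    zs, z = [z0], z0
    for dt in np.diff(t_grid):
        z = z + dt * f(np.concatenate([z, e]))
        zs.append(z)
    return np.stack(zs)            # (T, 8) latent trajectory

# 3) Shared decoder fusing spatial coordinate x, embedding e, and latent state z(t).
decode = mlp([1 + 8 + 8, 32, 1])

def predict(mu, x_grid, t_grid):
    e = embed(mu)
    zs = evolve(e, e, t_grid)      # the embedding also initializes the latent state
    return np.array([[decode(np.concatenate([[x], e, z]))[0] for x in x_grid]
                     for z in zs]) # (T, X) solution field

u = predict(np.array([0.5, -1.0]), np.linspace(0, 1, 5), np.linspace(0, 1, 4))
print(u.shape)                     # (4, 5)
```

Because time enters only through the latent trajectory rather than as a raw network input, temporal extrapolation amounts to integrating the latent ODE beyond the training horizon, while unseen parameters are handled by the feed-forward embedding alone.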
Problem

Research questions and friction points this paper is trying to address.

parameterized PDEs
parameter generalization
temporal extrapolation
latent dynamics
solution manifold
Innovation

Methods, ideas, or system contributions that make the work stand out.

Disentangled Latent Dynamics
Neural ODE
Parameterized PDEs
Manifold Fusion
Temporal Extrapolation
Zhangyong Liang
National Center for Applied Mathematics, Tianjin University, Tianjin, 300072, PR China
Ji Zhang
University of Southern Queensland