Time Extrapolation with Graph Convolutional Autoencoder and Tensor Train Decomposition

📅 2025-11-28
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge of time-extrapolative prediction for parametric partial differential equations (pPDEs) on unstructured meshes. Methodologically, it proposes a multi-stage, multi-fidelity reduced-order modeling framework integrating a graph convolutional autoencoder (GCAE) with tensor-train (TT) decomposition: GCAE enables geometry-aware spatial encoding; TT decomposition compresses high-dimensional spatiotemporal-parameter coupled features; and operator inference (OpInf) combined with a DeepONet architecture explicitly models nonlinear dynamical evolution while enforcing temporal causality and parameter generalizability. The key contribution is the first synergistic integration of graph neural networks, tensor decomposition, and operator learning into an extrapolation-oriented reduced-order model. Evaluated on benchmark problems—including heat conduction, convection-diffusion, and vortex shedding—the framework achieves significantly higher accuracy, longer-time validity, and superior cross-parameter robustness compared to MeshGraphNets.

Technology Category

Application Category

📝 Abstract
Graph autoencoders have gained attention in nonlinear reduced-order modeling of parameterized partial differential equations defined on unstructured grids. Despite they provide a geometrically consistent way of treating complex domains, applying such architectures to parameterized dynamical systems for temporal prediction beyond the training data, i.e. the extrapolation regime, is still a challenging task due to the simultaneous need of temporal causality and generalizability in the parametric space. In this work, we explore the integration of graph convolutional autoencoders (GCAs) with tensor train (TT) decomposition and Operator Inference (OpInf) to develop a time-consistent reduced-order model. In particular, high-fidelity snapshots are represented as a combination of parametric, spatial, and temporal cores via TT decomposition, while OpInf is used to learn the evolution of the latter. Moreover, we enhance the generalization performance by developing a multi-fidelity two-stages approach in the framework of Deep Operator Networks (DeepONet), treating the spatial and temporal cores as the trunk networks, and the parametric core as the branch network. Numerical results, including heat-conduction, advection-diffusion and vortex-shedding phenomena, demonstrate great performance in effectively learning the dynamic in the extrapolation regime for complex geometries, also in comparison with state-of-the-art approaches e.g. MeshGraphNets.
Problem

Research questions and friction points this paper is trying to address.

Develops a time-consistent reduced-order model for parameterized PDEs on unstructured grids
Integrates graph autoencoders with tensor decomposition for temporal extrapolation
Enhances generalization via a multi-fidelity DeepONet framework for complex geometries
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph convolutional autoencoders integrate tensor train decomposition
Operator Inference learns evolution of temporal cores
Multi-fidelity DeepONet enhances generalization via trunk-branch networks
🔎 Similar Papers
No similar papers found.