Data-Driven Model Reduction using WeldNet: Windowed Encoders for Learning Dynamics

📅 2025-12-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Modeling high-dimensional, time-varying physical systems requires achieving dimensionality reduction and dynamical fidelity simultaneously. This paper proposes WeldNet, a framework that partitions spatiotemporal data into overlapping temporal windows, applies deep autoencoders within each window for nonlinear manifold-based dimensionality reduction, connects the latent codes of adjacent windows via transcoder networks to enforce temporal consistency, and evolves the dynamics in latent space with a dedicated propagator network. WeldNet introduces a windowed nonlinear dimensionality reduction paradigm and provides theoretical guarantees of representational capacity under the manifold hypothesis. Evaluated on diverse partial differential equation (PDE) modeling tasks, WeldNet consistently outperforms proper orthogonal decomposition (POD) and state-of-the-art nonlinear methods, accurately capturing strongly nonlinear structures and long-term dynamical behavior while maintaining both local accuracy and global stability.

📝 Abstract
Many problems in science and engineering involve time-dependent, high-dimensional datasets arising from complex physical processes, which are costly to simulate. In this work, we propose WeldNet: Windowed Encoders for Learning Dynamics, a data-driven nonlinear model reduction framework to build a low-dimensional surrogate model for complex evolution systems. Given time-dependent training data, we split the time domain into multiple overlapping windows, within which nonlinear dimension reduction is performed by auto-encoders to capture latent codes. Once a low-dimensional representation of the data is learned, a propagator network is trained to capture the evolution of the latent codes in each window, and a transcoder is trained to connect the latent codes between adjacent windows. The proposed windowed decomposition significantly simplifies propagator training by breaking long-horizon dynamics into multiple short, manageable segments, while the transcoders ensure consistency across windows. In addition to the algorithmic framework, we develop a mathematical theory establishing the representation power of WeldNet under the manifold hypothesis, justifying the success of nonlinear model reduction via deep autoencoder-based architectures. Our numerical experiments on various differential equations indicate that WeldNet can capture nonlinear latent structures and their underlying dynamics, outperforming both traditional projection-based approaches and recently developed nonlinear model reduction methods.
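The windowed decomposition described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function and class names are hypothetical, and a PCA-style linear map stands in for the paper's deep auto-encoders so the example stays self-contained.

```python
import numpy as np

def split_windows(snapshots, window, overlap):
    """Split a (time, space) snapshot matrix into overlapping windows."""
    step = window - overlap
    starts = range(0, snapshots.shape[0] - window + 1, step)
    return [snapshots[s:s + window] for s in starts]

class LinearAE:
    """PCA-style stand-in for one intra-window auto-encoder.
    (WeldNet uses deep nonlinear auto-encoders; this is illustrative.)"""
    def fit(self, X, dim):
        self.mean = X.mean(axis=0)
        _, _, Vt = np.linalg.svd(X - self.mean, full_matrices=False)
        self.V = Vt[:dim].T          # (space, dim) reduced basis
        return self
    def encode(self, X):
        return (X - self.mean) @ self.V
    def decode(self, Z):
        return Z @ self.V.T + self.mean

# Toy data: a 1D traveling wave, 60 time steps on 128 grid points.
t = np.linspace(0, 1, 60)[:, None]
x = np.linspace(0, 2 * np.pi, 128)[None, :]
U = np.sin(x - 2 * np.pi * t)

# One auto-encoder per overlapping window, as in the abstract.
windows = split_windows(U, window=20, overlap=5)
aes = [LinearAE().fit(W, dim=4) for W in windows]

# Latent codes per window. In WeldNet, a propagator network would
# advance these codes in time within each window, and a transcoder
# would map the codes of window k into the latent space of window k+1.
codes = [ae.encode(W) for ae, W in zip(aes, windows)]
recon = [ae.decode(Z) for ae, Z in zip(aes, codes)]
err = max(np.abs(R - W).max() for R, W in zip(recon, windows))
print(f"windows: {len(windows)}, max reconstruction error: {err:.2e}")
```

Because each window sees only a short, manageable segment of the trajectory, each surrogate model fits a locally simpler manifold; the overlap between adjacent windows is what the transcoders would use to stitch the latent representations together.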
Problem

Research questions and friction points this paper is trying to address.

Building a data-driven nonlinear model reduction framework for complex evolution systems
Capturing latent dynamics in overlapping time segments with windowed auto-encoders
Modeling and connecting latent evolution across windows via propagator and transcoder networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Windowed auto-encoders reduce high-dimensional dynamics
Propagator and transcoder networks model latent evolution
Mathematical theory supports deep autoencoder representation power
Biraj Dahal
School of Mathematics, Georgia Institute of Technology, Atlanta, GA, USA
Jiahui Cheng
Meta, USA
Hao Liu
Department of Mathematics, Hong Kong Baptist University, Hong Kong, China
Rongjie Lai
Professor of Mathematics, Purdue University
Applied and Computational Mathematics
Wenjing Liao
Georgia Institute of Technology
Data analysis, machine learning, imaging and signal processing, applied harmonic analysis