🤖 AI Summary
This work addresses the challenge of characterizing metastable transition processes in high-dimensional molecular systems by proposing a dimensionality-reduction framework that integrates Koopman operator theory, Krylov subspace algorithms, and neural networks. The method couples Koopman-driven neural learning with coarse-grained dynamics to extract dominant invariant subspaces from simulation data, thereby constructing collective variables and deriving their low-dimensional effective dynamics. The resulting framework reconstructs reaction pathways and transition times across both enthalpic and entropic barriers, computes rate constants and flux functions, and reproduces and predicts coarse-grained dynamical behavior on multiple benchmark potential energy surfaces.
📝 Abstract
The ISOKANN (Invariant Subspaces of Koopman Operators Learned by Artificial Neural Networks) framework provides a data-driven route to extract collective variables (CVs) and effective dynamics from complex molecular systems. In this work, we integrate the theoretical foundation of Koopman operators with Krylov-like subspace algorithms and reduced dynamical modeling to build a coherent picture of how metastable transitions in high-dimensional systems can be described in terms of CVs. Starting from the identification of CVs based on dominant invariant subspaces, we derive the corresponding effective dynamics on the latent space and connect these to transition rates and times, committor functions, and transition pathways. The combination of Koopman-based learning and reduced-dimensional effective dynamics yields a principled framework for computing transition rates and pathways from simulation data. Numerical experiments on one-, two-, and three-dimensional benchmark potentials illustrate the ability of ISOKANN to reconstruct the coarse-grained kinetics and reproduce transition times across enthalpic and entropic barriers.
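To make the core idea concrete, the following is a minimal, self-contained sketch of the power-iteration principle behind ISOKANN on a 1D double-well toy problem. It is not the authors' implementation: the neural network is replaced by a tabular χ on a grid, and the potential, lag time, temperature, and sample counts are all illustrative choices. The Koopman operator is applied by averaging χ over the endpoints of short overdamped Langevin bursts, and a shift-scale step renormalizes the iterate so it converges to a membership-like CV within the dominant invariant subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: overdamped Langevin dynamics in the double-well V(x) = (x^2 - 1)^2.
def grad_V(x):
    return 4.0 * x * (x**2 - 1.0)

def propagate(x0, n_steps=50, dt=0.01, beta=3.0):
    """Euler-Maruyama integration of dx = -V'(x) dt + sqrt(2/beta) dW."""
    x = x0.copy()
    for _ in range(n_steps):
        x += -grad_V(x) * dt + np.sqrt(2.0 * dt / beta) * rng.standard_normal(x.shape)
    return x

# Tabular stand-in for the neural network chi: values on a grid over [-2, 2].
grid = np.linspace(-2.0, 2.0, 41)
chi = (grid + 2.0) / 4.0  # non-constant initial guess, linear in x

M = 200  # Monte Carlo bursts per grid point to estimate the Koopman action
for _ in range(30):
    # Apply the Koopman operator: average chi over short-trajectory endpoints.
    starts = np.repeat(grid, M)
    ends = propagate(starts)
    k_chi = np.interp(ends, grid, chi).reshape(len(grid), M).mean(axis=1)
    # Shift-scale back to [0, 1]; the fixed point of this iteration lies in the
    # dominant invariant subspace spanned by the constant and the slow eigenfunction.
    chi = (k_chi - k_chi.min()) / (k_chi.max() - k_chi.min())

# chi should now increase from roughly 0 in the left well to roughly 1 in the right
# well, acting as a committor-like collective variable for the transition.
print(chi[0], chi[len(grid) // 2], chi[-1])
```

In the full method, the grid interpolation is replaced by a neural network trained to match the shift-scaled Koopman action of its own output, which is what allows the scheme to operate in high-dimensional configuration spaces where a grid is infeasible.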