🤖 AI Summary
This work addresses the challenge of learning physically consistent coarse-grained dynamics from time-series observations of particle trajectories in nonequilibrium multiscale systems. We propose a structure-preserving modeling framework grounded in the metriplectic bracket formalism that enforces, under discrete-time evolution, the first and second laws of thermodynamics, conservation of momentum, and a fluctuation-dissipation balance. Because labels for entropic state variables are generally unavailable, a self-supervised learning strategy is introduced to implicitly identify emergent structural variables, thereby capturing memory effects and stochasticity. The method integrates deep learning with molecular dynamics (open-source implementations in PyTorch and LAMMPS) to enable efficient large-scale inference. Validated on benchmark systems, on star polymers at challenging levels of coarse-graining, and on high-speed video of colloidal suspensions, the approach accurately reproduces nonequilibrium statistics and captures the coupling between local rearrangement events and emergent stochastic dynamics, properties that conventional black-box models do not guarantee by construction.
📝 Abstract
Multiscale systems are ubiquitous in science and technology but are notoriously challenging to simulate, as short spatiotemporal scales must be appropriately linked to emergent bulk physics. When expensive high-dimensional dynamical systems are coarse-grained into low-dimensional models, the entropic loss of information leads to emergent physics that is dissipative, history-dependent, and stochastic. To machine-learn coarse-grained dynamics from time-series observations of particle trajectories, we propose a framework using the metriplectic bracket formalism that preserves these properties by construction; most notably, the framework guarantees discrete notions of the first and second laws of thermodynamics, conservation of momentum, and a discrete fluctuation-dissipation balance crucial for capturing non-equilibrium statistics. We introduce the mathematical framework abstractly before specializing to a particle discretization. Because labels are generally unavailable for entropic state variables, we introduce a novel self-supervised learning strategy to identify emergent structural variables. We validate the method on benchmark systems and demonstrate its utility on two challenging examples: (1) coarse-graining star polymers at challenging coarse-graining ratios while preserving non-equilibrium statistics, and (2) learning models from high-speed video of colloidal suspensions that capture the coupling between local rearrangement events and emergent stochastic dynamics. We provide open-source implementations in both PyTorch and LAMMPS, enabling large-scale inference and extensibility to diverse particle-based systems.
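For readers unfamiliar with the metriplectic formalism, the generic structure it imposes can be sketched as follows. This is the standard metriplectic (GENERIC-type) form from the literature, written in generic notation; the specific state variables and learned operators of this paper may differ. Here $\mathbf{z}$ is the coarse-grained state, $E$ the energy, $S$ the entropy, $L$ an antisymmetric (Poisson) operator, and $M$ a symmetric positive semidefinite (friction) operator:

```latex
\[
\dot{\mathbf{z}} \;=\; L(\mathbf{z})\,\nabla E(\mathbf{z}) \;+\; M(\mathbf{z})\,\nabla S(\mathbf{z}),
\qquad
L = -L^{\top}, \quad M = M^{\top} \succeq 0,
\]
with the degeneracy conditions
\[
L\,\nabla S = 0, \qquad M\,\nabla E = 0,
\]
which together yield the thermodynamic guarantees by construction:
\[
\dot{E} = \nabla E^{\top} \dot{\mathbf{z}} = 0 \;\;\text{(first law)},
\qquad
\dot{S} = \nabla S^{\top} M\,\nabla S \ge 0 \;\;\text{(second law)}.
\]
In the stochastic setting, fluctuation--dissipation balance ties the noise
covariance to the friction operator, schematically
$\langle \mathrm{d}\mathbf{z}_{\mathrm{fluct}}\,\mathrm{d}\mathbf{z}_{\mathrm{fluct}}^{\top} \rangle \propto 2 k_{B} M \,\mathrm{d}t$,
so that the model samples the correct statistics.
```

The paper's contribution includes preserving discrete-time analogues of these continuous identities, which the continuous sketch above does not by itself provide.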