chemtrain: Learning Deep Potential Models via Automatic Differentiation and Statistical Physics

📅 2024-08-28
🏛️ Computer Physics Communications
📈 Citations: 6
Influential: 0
🤖 AI Summary
To address the high cost of generating reference data and the low data efficiency of training atomistic and coarse-grained implicit-solvent neural network potentials, this work proposes an end-to-end differentiable framework for jointly optimizing deep potential models. The method builds thermodynamic constraints (specifically, free-energy consistency) directly into the automatic differentiation pipeline, enabling physics-guided, concurrent training on potential energies, forces, and free energies. It combines JAX-based automatic differentiation, statistical-physics-informed loss functions, multi-scale joint optimization of energies and forces, and Monte Carlo sampling-driven active learning. Evaluated across diverse molecular systems, the framework achieves DFT-level accuracy (mean absolute error < 1 meV/atom), trains roughly three times faster, and generalizes significantly better than conventional fitting approaches.
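The concurrent training of energies and forces described above rests on a simple identity: in a differentiable framework, forces are the negative gradient of the potential energy with respect to atomic positions, so a force-matching loss can be written directly with `jax.grad`. A minimal sketch with a hypothetical toy pair potential (the single `epsilon` parameter stands in for a neural network's weights; this is illustrative, not chemtrain's actual API):

```python
import jax
import jax.numpy as jnp

# Hypothetical toy potential: a harmonic pair potential with one
# learnable parameter, standing in for a neural network potential.
def potential_energy(params, positions):
    diffs = positions[:, None, :] - positions[None, :, :]
    r2 = jnp.sum(diffs**2, axis=-1)
    # Upper triangle only: each pair counted once, self-pairs excluded.
    i, j = jnp.triu_indices(positions.shape[0], k=1)
    return params["epsilon"] * jnp.sum(r2[i, j])

# Forces follow from automatic differentiation: F = -dU/dr.
forces_fn = jax.grad(lambda p, pos: -potential_energy(p, pos), argnums=1)

def force_matching_loss(params, positions, ref_forces):
    # Bottom-up loss: mean squared error against reference (e.g. DFT) forces.
    pred = forces_fn(params, positions)
    return jnp.mean((pred - ref_forces) ** 2)

params = {"epsilon": 0.5}
positions = jnp.array([[0.0, 0.0, 0.0],
                       [1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0]])
ref_forces = jnp.zeros_like(positions)

# One differentiation pass yields both the loss and parameter gradients.
loss, grads = jax.value_and_grad(force_matching_loss)(params, positions, ref_forces)
```

Because the loss is itself a function of a gradient (the forces), training takes second-order derivatives through the potential, which JAX handles transparently.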

Problem

Research questions and friction points the paper addresses.

Overcoming costly generation of accurate reference data
Addressing data inefficiency in bottom-up training methods
Combining multiple training algorithms for improved NN potential models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines top-down and bottom-up training algorithms
Uses JAX for gradient computation and scaling
Customizable training routines for diverse data sources