Broadening the Scope of Neural Network Potentials through Direct Inclusion of Additional Molecular Attributes

📅 2024-03-22
📈 Citations: 2
Influential: 0
🤖 AI Summary
Existing neural network potentials typically rely solely on atomic numbers and Cartesian coordinates, limiting their generalizability to chemically diverse systems with varying charge states, spin multiplicities, and other electronic configurations. This work introduces a lightweight extension strategy that, while preserving the equivariant architecture of models like TensorNet, enables end-to-end learning of charge and spin embeddings—without incorporating physics-based energy terms or task-specific modules. The approach directly alleviates input degeneracy arising from identical nuclear configurations across distinct electronic states. Experiments on both a newly curated dataset and established benchmarks (e.g., ANI-1x, QM9 subsets) demonstrate significant improvements in energy and force prediction accuracy across charge and spin states, reducing mean absolute errors by 15–30%. Crucially, the method retains the original model’s computational efficiency and out-of-distribution generalization capability. This establishes a new paradigm for developing universal, robust molecular potential energy models grounded in learnable electronic structure representations.
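The degeneracy problem the summary describes can be illustrated with a toy sketch (not the paper's actual TensorNet code; all names, embedding sizes, and the random stand-in weights below are hypothetical): if per-atom input features are keyed only by atomic number, a cation and an anion with identical nuclei produce identical model inputs, whereas appending learnable charge and spin-multiplicity embeddings makes the two electronic states distinguishable.

```python
import numpy as np

# Toy illustration of input degeneracy, NOT the paper's implementation.
# Random vectors stand in for trained embedding weights.
rng = np.random.default_rng(0)
EMB_DIM = 8

atom_emb = {z: rng.normal(size=EMB_DIM) for z in (1, 6, 7, 8)}    # by atomic number
charge_emb = {q: rng.normal(size=EMB_DIM) for q in (-1, 0, 1)}    # by total charge
spin_emb = {s: rng.normal(size=EMB_DIM) for s in (1, 2, 3)}       # by multiplicity

def atom_features(z, total_charge, multiplicity, with_state=True):
    """Per-atom feature vector; optionally append molecule-level state embeddings."""
    feats = [atom_emb[z]]
    if with_state:
        feats += [charge_emb[total_charge], spin_emb[multiplicity]]
    return np.concatenate(feats)

# Same nuclei (a CH3 fragment), two electronic states: cation vs. anion.
zs = [6, 1, 1, 1]
cation = np.stack([atom_features(z, +1, 1) for z in zs])
anion = np.stack([atom_features(z, -1, 1) for z in zs])

# Atomic numbers alone cannot tell the states apart (degenerate inputs)...
base_cat = np.stack([atom_features(z, +1, 1, with_state=False) for z in zs])
base_an = np.stack([atom_features(z, -1, 1, with_state=False) for z in zs])
assert np.allclose(base_cat, base_an)

# ...while the added state embeddings make the inputs distinct.
assert not np.allclose(cation, anion)
```

In the paper's setting these embeddings are learned end-to-end alongside the rest of the network, so no physics-based energy terms or task-specific modules are needed.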

📝 Abstract
Most state-of-the-art neural network potentials do not account for molecular attributes other than atomic numbers and positions, which limits their range of applicability by design. In this work, we demonstrate the importance of including additional electronic attributes in neural network potential representations with a minimal architectural change to TensorNet, a state-of-the-art equivariant model based on Cartesian rank-2 tensor representations. By performing experiments on both custom-made and public benchmarking datasets, we show that this modification resolves the input degeneracy issues stemming from the use of atomic numbers and positions alone, while enhancing the model's predictive accuracy across diverse chemical systems with different charge or spin states. This is accomplished without tailored strategies or the inclusion of physics-based energy terms, while maintaining efficiency and accuracy. These findings should furthermore encourage researchers to train and use models incorporating these additional representations.
Problem

Research questions and friction points this paper is trying to address.

Enhance neural network potential representations
Include electronic attributes in molecular modeling
Resolve input degeneracy in chemical systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Direct inclusion of electronic attributes
Minimal architectural change to TensorNet
Resolves input degeneracy issues
Guillem Simeon
Computational Science Laboratory, Universitat Pompeu Fabra, Barcelona Biomedical Research Park (PRBB), C Dr. Aiguader 88, 08003 Barcelona, Spain.
Antonio Mirarchi
Computational Science Laboratory, Universitat Pompeu Fabra, Barcelona Biomedical Research Park (PRBB), C Dr. Aiguader 88, 08003 Barcelona, Spain.
Raúl P. Peláez
Computational Science Laboratory, Universitat Pompeu Fabra, Barcelona Biomedical Research Park (PRBB), C Dr. Aiguader 88, 08003 Barcelona, Spain.
Raimondas Galvelis
Computational Science Laboratory, Universitat Pompeu Fabra, Barcelona Biomedical Research Park (PRBB), C Dr. Aiguader 88, 08003 Barcelona, Spain; Acellera Labs, C Dr. Trueta 183, 08005 Barcelona, Spain
G. D. Fabritiis
Computational Science Laboratory, Universitat Pompeu Fabra, Barcelona Biomedical Research Park (PRBB), C Dr. Aiguader 88, 08003 Barcelona, Spain; Acellera Labs, C Dr. Trueta 183, 08005 Barcelona, Spain; Institució Catalana de Recerca i Estudis Avançats (ICREA), Passeig Lluis Companys 23, 08010 Barcelona, Spain