Shoot from the HIP: Hessian Interatomic Potentials without derivatives

📅 2025-09-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Computing molecular Hessian matrices is a critical bottleneck in computational chemistry; conventional quantum-mechanical or neural-network-based approaches rely on automatic differentiation or finite differences, suffering from high computational cost and poor scalability. This work introduces the first SE(3)-equivariant graph neural network that directly predicts symmetric Hessian matrices. By leveraging irreducible representation features up to degree $l=2$ to construct equivariant outputs, the method bypasses numerical differentiation and enables end-to-end learning under strict physical constraints (e.g., symmetry, rotational equivariance). It achieves state-of-the-art performance across accuracy, inference speed (10–100× acceleration), memory efficiency, and training stability. Empirically, it significantly improves downstream tasks, including transition-state search, geometry optimization, and vibrational frequency analysis. The code and pre-trained models are publicly available.

📝 Abstract
Fundamental tasks in computational chemistry, from transition state search to vibrational analysis, rely on molecular Hessians, which are the second derivatives of the potential energy. Yet, Hessians are computationally expensive to calculate and scale poorly with system size, with both quantum mechanical methods and neural networks. In this work, we demonstrate that Hessians can be predicted directly from a deep learning model, without relying on automatic differentiation or finite differences. We observe that one can construct SE(3)-equivariant, symmetric Hessians from irreducible representations (irrep) features up to degree $l$=2 computed during message passing in graph neural networks. This makes HIP Hessians one to two orders of magnitude faster, more accurate, more memory efficient, easier to train, and enables more favorable scaling with system size. We validate our predictions across a wide range of downstream tasks, demonstrating consistently superior performance for transition state search, accelerated geometry optimization, zero-point energy corrections, and vibrational analysis benchmarks. We open-source the HIP codebase and model weights to enable further development of the direct prediction of Hessians at https://github.com/BurgerAndreas/hip.
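The group-theoretic fact behind the abstract's claim is that a symmetric 3×3 matrix decomposes into an $l=0$ part (1 scalar, the trace) and an $l=2$ part (5 scalars, the symmetric traceless component), so irrep features up to degree 2 suffice to assemble a symmetric Hessian block. The sketch below illustrates only this decomposition; it is not the HIP implementation, and the function name and parametrization are illustrative assumptions.

```python
# Minimal sketch (not the HIP code): assemble a symmetric 3x3 Hessian
# block from an l=0 feature (1 scalar) and an l=2 feature (5 scalars).
import numpy as np

def symmetric_block(l0: float, l2: np.ndarray) -> np.ndarray:
    """Symmetric 3x3 matrix from 1 (l=0) + 5 (l=2) components."""
    a, b, c, d, e = l2
    # One bijective parametrization of the 5-dimensional space of
    # symmetric traceless 3x3 matrices (the real l=2 irrep).
    traceless = np.array([
        [a,      b,      c],
        [b,      d,      e],
        [c,      e, -a - d],
    ])
    # Trace part carried by the l=0 scalar.
    return l0 * np.eye(3) + traceless

rng = np.random.default_rng(0)
H = symmetric_block(rng.standard_normal(), rng.standard_normal(5))
assert np.allclose(H, H.T)  # symmetric by construction, no differentiation
```

Because symmetry holds by construction, no post-hoc symmetrization of a differentiated network output is needed, which is what enables direct end-to-end prediction.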
Problem

Research questions and friction points this paper is trying to address.

Predicting molecular Hessians directly without derivative calculations
Overcoming computational expense and poor scaling of Hessian methods
Enabling faster and more accurate quantum chemistry simulations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Direct Hessian prediction via deep learning model
SE(3)-equivariant Hessians from irreducible representations
Graph neural networks enable faster Hessian computation
Andreas Burger
Bosch Research
Computer Science, Modeling, Simulation, Embedded Systems, Virtual Prototyping
Luca Thiede
University of Toronto, Vector Institute for Artificial Intelligence
Nikolaj Rønne
Technical University of Denmark, CAPeX Pioneer Center for Accelerating P2X Materials Discovery
Varinia Bernales
University of Toronto
Theoretical Chemistry, Catalysis, Green Chemistry
Nandita Vijaykumar
Assistant Professor, University of Toronto
Computer Systems and Architecture
Tejs Vegge
Professor, Technical University of Denmark
Director of CAPeX - Pioneer Center for Accelerating P2X Materials Discovery
Arghya Bhowmik
Technical University of Denmark, CAPeX Pioneer Center for Accelerating P2X Materials Discovery
Alán Aspuru-Guzik
University of Toronto, Vector Institute for Artificial Intelligence, Acceleration Consortium, Canadian Institute for Advanced Research (CIFAR), NVIDIA