Towards A Universally Transferable Acceleration Method for Density Functional Theory

📅 2025-09-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing deep learning approaches for accelerating density functional theory (DFT) initialize self-consistent field (SCF) calculations by predicting Hamiltonian matrices; however, Hamiltonians are numerically difficult to predict and transfer poorly across molecular systems, basis sets, and exchange-correlation (XC) functionals. Method: We propose the first E(3)-equivariant neural network framework that directly predicts the electron density in a compact auxiliary basis, bypassing Hamiltonian and density-matrix prediction, and use it as a transferable SCF initial guess. Contribution/Results: Trained solely on small molecules, our model generalizes robustly to systems with up to 60 atoms. It achieves strong cross-system, cross-basis-set, and cross-XC-functional transferability, surpassing prior Hamiltonian- and density-matrix-based paradigms. Experiments show an average 33.3% reduction in SCF iterations, with consistent acceleration across system sizes. This significantly enhances both the efficiency and universality of DFT computations.
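The paper's released code is not reproduced here, but the pipeline the summary describes can be sketched with PySCF. The snippet below is a minimal, hypothetical illustration: it density-fits PySCF's built-in atomic guess into an auxiliary basis as a stand-in for the network's predicted coefficients, rebuilds the Coulomb matrix from those coefficients, and diagonalizes a crude core-plus-Coulomb Fock matrix to seed an SCF run. The molecule, basis sets, and XC functional are arbitrary choices, not the paper's settings.

```python
import numpy as np
from scipy.linalg import eigh
from pyscf import gto, dft, df

mol = gto.M(atom="O 0 0 0; H 0 0.757 0.587; H 0 -0.757 0.587",
            basis="def2-svp")
auxmol = df.addons.make_auxmol(mol, auxbasis="def2-universal-jfit")

# 3-center Coulomb integrals (mu nu | P) linking orbital and auxiliary bases.
ints_3c = df.incore.aux_e2(mol, auxmol, intor="int3c2e")

# Stand-in for the trained network: density-fit PySCF's default atomic guess,
# c = (P|Q)^{-1} (Q | mu nu) D_{mu nu}. A model like the paper's would predict
# these per-atom auxiliary coefficients directly from the geometry.
dm_guess = dft.RKS(mol).get_init_guess(key="minao")
metric_2c = auxmol.intor("int2c2e")              # (P|Q) Coulomb metric
rhs = np.einsum("mnp,mn->p", ints_3c, dm_guess)  # (P | mu nu) D
c_aux = np.linalg.solve(metric_2c, rhs)

# Rebuild the Coulomb matrix from the auxiliary density, form a crude Fock
# matrix (core Hamiltonian + Coulomb), and diagonalize it to seed the SCF.
J = np.einsum("mnp,p->mn", ints_3c, c_aux)
hcore = mol.intor("int1e_kin") + mol.intor("int1e_nuc")
ovlp = mol.intor("int1e_ovlp")
_, mo = eigh(hcore + J, ovlp)
nocc = mol.nelectron // 2
dm0 = 2.0 * mo[:, :nocc] @ mo[:, :nocc].T

mf = dft.RKS(mol, xc="pbe")
mf.kernel(dm0=dm0)
```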

📝 Abstract
Recently, sophisticated deep learning-based approaches have been developed to generate efficient initial guesses that accelerate the convergence of density functional theory (DFT) calculations. While the actual initial guesses are typically density matrices (DM), quantities that can be converted into density matrices also qualify as alternative forms of initial guesses. Hence, existing works mostly rely on predicting the Hamiltonian matrix to obtain high-quality initial guesses. However, the Hamiltonian matrix is both numerically difficult to predict and intrinsically non-transferable, hindering the application of such models in real scenarios. In light of this, we propose a method that constructs DFT initial guesses by predicting the electron density in a compact auxiliary basis representation using E(3)-equivariant neural networks. Trained on small molecules with up to 20 atoms, our model achieves an average 33.3% reduction in self-consistent field (SCF) steps on systems of up to 60 atoms, substantially outperforming Hamiltonian-centric and DM-centric models. Critically, this acceleration remains nearly constant with increasing system size and transfers strongly across orbital basis sets and exchange-correlation (XC) functionals. To the best of our knowledge, this work presents the first robust candidate for a universally transferable DFT acceleration method. We also release the SCFbench dataset and its accompanying code to facilitate future research in this promising direction.
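The reported 33.3% figure is a reduction in SCF iteration counts. Below is a hedged sketch of how such a comparison can be instrumented in PySCF; since PySCF does not store the cycle count on the SCF object, a per-iteration callback serves as a counter. The two built-in guesses compared here ("minao" and "huckel") merely exercise the harness; a learned guess would enter through the dm0 argument, as in the earlier sketch.

```python
from pyscf import gto, dft

class IterCounter:
    """Counts SCF cycles via PySCF's per-iteration callback hook."""
    def __init__(self):
        self.n = 0
    def __call__(self, envs):
        self.n += 1

def scf_steps(mol, xc="pbe", init_guess="minao", dm0=None):
    mf = dft.RKS(mol, xc=xc)
    mf.init_guess = init_guess   # ignored when an explicit dm0 is passed
    mf.callback = counter = IterCounter()
    mf.kernel(dm0=dm0)
    return counter.n

mol = gto.M(atom="O 0 0 0; H 0 0.757 0.587; H 0 -0.757 0.587",
            basis="def2-svp")

# Compare two built-in guesses; a learned guess would be passed via dm0=...
baseline = scf_steps(mol, init_guess="minao")
huckel = scf_steps(mol, init_guess="huckel")
print(f"minao: {baseline} steps, huckel: {huckel} steps")
print(f"reduction: {100 * (baseline - huckel) / baseline:.1f}%")
```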
Problem

Research questions and friction points this paper is trying to address.

Developing transferable initial guesses for DFT convergence acceleration
Overcoming Hamiltonian prediction limitations in deep learning DFT methods
Creating acceleration that applies universally across system sizes, basis sets, and XC functionals
Innovation

Methods, ideas, or system contributions that make the work stand out.

Predicts electron density using E(3)-equivariant neural networks (see the sketch after this list)
Employs compact auxiliary basis representation for transferability
Achieves significant SCF step reduction across system sizes
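To make the first two bullets concrete: coefficients of auxiliary basis functions with angular momentum l transform as degree-l irreps with parity (-1)^l, so an equivariant output head can emit them directly. Below is a minimal sketch using the e3nn library; the paper does not state its implementation stack, and the irrep multiplicities here are placeholders rather than any real auxiliary basis.

```python
import torch
from e3nn import o3

# Per-atom features, assumed to come from an equivariant message-passing encoder.
irreps_node = o3.Irreps("16x0e + 8x1o + 4x2e")
# One output channel per auxiliary function: s -> 0e, p -> 1o, d -> 2e.
# Multiplicities are placeholders; real counts depend on the auxiliary basis
# assigned to each element.
irreps_coeff = o3.Irreps("6x0e + 4x1o + 2x2e")

head = o3.Linear(irreps_node, irreps_coeff)

feats = irreps_node.randn(5, -1)   # features for 5 atoms
coeffs = head(feats)               # per-atom auxiliary density coefficients

# Equivariance check: rotating the input features rotates the predicted
# coefficients consistently, so the represented density rotates with the atoms.
R = o3.rand_matrix()
D_in = irreps_node.D_from_matrix(R)
D_out = irreps_coeff.D_from_matrix(R)
assert torch.allclose(head(feats @ D_in.T), coeffs @ D_out.T, atol=1e-5)
```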
Authors

Zhe Liu · ByteDance Seed
Yuyan Ni · ByteDance Seed
Zhichen Pu · ByteDance Seed
Qiming Sun · California Institute of Technology (Theoretical chemistry and physics)
Siyuan Liu · ByteDance Seed
Wen Yan · ByteDance Seed