🤖 AI Summary
To address the poor generalizability and scalability of existing crystal modeling approaches, which require a separate architecture for each of the 230 space groups, this work proposes a unified neural architecture that automatically adapts to any 3D space group while strictly enforcing invariance to its symmetries. The core innovation is a symmetry-adapted Fourier basis with explicit constraint encoding: group representation theory is combined with differentiable weight sharing to impose exact symmetry constraints on basis coefficients under space-group actions. This enables zero-shot transfer of a single trained model to unseen space groups, facilitating cross-group knowledge sharing and robust few-shot learning. On materials property prediction benchmarks, the method achieves competitive performance and alleviates the generalization bottleneck caused by sparse data across space groups.
📝 Abstract
Incorporating known symmetries of data into machine learning models has consistently improved predictive accuracy, robustness, and generalization. However, achieving exact invariance typically requires designing a bespoke architecture for each symmetry group, limiting scalability and preventing knowledge transfer across related symmetries. For space groups, the symmetries central to modeling crystalline solids in materials science and condensed matter physics, this challenge is particularly acute: there are 230 such groups in three dimensions. In this work we present a new approach to crystallographic symmetries by developing a single machine learning architecture that automatically adapts its weights to enforce invariance to any input space group. Our approach constructs symmetry-adapted Fourier bases through an explicit characterization of the constraints that group operations impose on Fourier coefficients. Encoding these constraints into a neural network layer enables weight sharing across different space groups, allowing the model to exploit structural similarities between groups and overcome data sparsity when limited measurements are available for specific groups. We demonstrate the effectiveness of this approach in achieving competitive performance on material property prediction tasks and in generalizing zero-shot to unseen groups.
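To make the coefficient constraints concrete, here is a minimal sketch (not the paper's implementation) of the underlying idea: a space-group operation $(R, t)$ acting on a Fourier series $f(x) = \sum_k c_k e^{2\pi i k \cdot x}$ permutes wavevectors by $k \mapsto R^T k$ and multiplies coefficients by a translation-dependent phase, so averaging coefficients over the group projects onto the invariant subspace. The `symmetrize` and `evaluate` helpers and the choice of a 2D glide group are illustrative assumptions.

```python
import numpy as np

def symmetrize(coeffs, ops):
    """Project Fourier coefficients onto the subspace invariant under
    the given operations (R, t): R an integer orthogonal matrix, t a
    translation. coeffs maps integer wavevector tuples k -> complex c_k.
    """
    out = {}
    for k in coeffs:
        kv = np.array(k)
        acc = 0.0 + 0.0j
        for R, t in ops:
            kk = tuple(int(v) for v in R.T @ kv)        # wavevector mapped by the rotation part
            phase = np.exp(-2j * np.pi * kv @ np.asarray(t))  # phase from the translation part
            acc += coeffs.get(kk, 0.0) * phase
        out[k] = acc / len(ops)                          # group average = projection
    return out

def evaluate(coeffs, x):
    """Evaluate f(x) = sum_k c_k exp(2*pi*i k.x)."""
    return sum(c * np.exp(2j * np.pi * np.dot(k, x)) for k, c in coeffs.items())

# Illustrative example: the 2D glide group pg, generated (mod lattice
# translations) by a mirror combined with a half-lattice translation.
ops = [
    (np.eye(2, dtype=int), np.zeros(2)),
    (np.diag([1, -1]), np.array([0.5, 0.0])),
]
rng = np.random.default_rng(0)
K = [(i, j) for i in range(-2, 3) for j in range(-2, 3)]
coeffs = {k: complex(rng.standard_normal(), rng.standard_normal()) for k in K}
sym = symmetrize(coeffs, ops)  # coefficients now satisfy the glide constraint
```

After projection the function built from `sym` is exactly invariant: `evaluate(sym, R @ x + t)` equals `evaluate(sym, x)` for the glide operation, which can be checked numerically. In the paper's setting these constraints are encoded in a layer so that the coefficient-sharing pattern, rather than a per-group architecture, carries the symmetry.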