AI Summary
Machine learning models struggle with convergence and out-of-distribution (OOD) generalization across the vast organic chemical space (10^30–10^60 molecules). Method: We propose a dual-axis representation-complete convergent learning framework combining (i) a GCN-based encoder of local valence environments grounded in modern valence bond theory and (ii) a no-bridge graph (NBG) formalism that encodes global ring/cage topology. We formally define molecular representation completeness and establish its theoretical links to dataset construction and model convergence. Based on this principle, we construct FD25, a systematically curated dataset covering near-exhaustive combinations of local valence motifs and ring/cage topologies for H/C/N/O/F-containing molecules. Results: Our model achieves ~1.0 kcal/mol MAE on external benchmarks, markedly improving OOD generalization, and empirical analysis quantitatively validates the causal role of representation completeness in accelerating convergence, enhancing interpretability, and boosting data efficiency.
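The representation-completeness idea above can be pictured as a coverage ratio between a dataset's vocabulary of local valence motifs and the motifs present in a query molecule. The sketch below is purely illustrative; the motif labels and function names are invented, not the paper's actual encoding.

```python
# Hypothetical illustration (motif labels invented): representation
# completeness as the fraction of a molecule's local valence motifs
# that already appear in the training dataset's motif vocabulary.
def coverage(dataset_motifs: set, molecule_motifs: set) -> float:
    """Fraction of the molecule's motifs covered by the dataset (1.0 = complete)."""
    if not molecule_motifs:
        return 1.0
    return len(molecule_motifs & dataset_motifs) / len(molecule_motifs)

# Toy vocabulary and a toy query molecule (ethanolamine-like motif set).
train_vocab = {"C(sp3)H3", "C(sp3)H2", "O(sp3)H", "N(sp3)H2"}
query = {"C(sp3)H2", "O(sp3)H", "N(sp3)H2"}
print(coverage(train_vocab, query))  # -> 1.0  (fully in-distribution)
```

A molecule containing a motif outside the vocabulary scores below 1.0, flagging it as out-of-distribution for that dataset.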
Abstract
Machine learning is profoundly reshaping molecular and materials modeling; however, given the vast scale of chemical space (10^30–10^60 molecules), it remains an open scientific question whether models can achieve convergent learning across this space. We introduce a Dual-Axis Representation-Complete Convergent Learning (RCCL) strategy built on a molecular representation that integrates graph convolutional network (GCN) encoding of local valence environments, grounded in modern valence bond theory, with no-bridge graph (NBG) encoding of ring/cage topologies, yielding a quantitative measure of chemical-space coverage. The framework formalizes representation completeness and thereby establishes a principled basis for constructing datasets that support convergent learning in large models. Guided by RCCL, we develop the FD25 dataset, which systematically covers 13,302 local valence units and 165,726 ring/cage topologies, achieving near-complete combinatorial coverage of organic molecules containing H/C/N/O/F. Graph neural networks trained on FD25 exhibit representation-complete convergent learning and strong out-of-distribution generalization, with an overall prediction error of approximately 1.0 kcal/mol MAE across external benchmarks. These results establish a quantitative link between molecular representation, structural completeness, and model generalization, laying a foundation for interpretable, transferable, and data-efficient molecular intelligence.
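The no-bridge graph idea can be sketched in graph-theoretic terms: a molecule's ring/cage topology is what survives after deleting every bridge bond, i.e. every bond whose removal disconnects the graph. The following is a minimal sketch under that assumption, not the authors' implementation; it finds bridges with Tarjan's DFS low-link algorithm on a simple undirected graph.

```python
# Sketch (assumed, not the paper's code): keep only ring/cage bonds by
# deleting bridges, located with Tarjan's DFS low-link algorithm.
from collections import defaultdict

def find_bridges(adj):
    """Return the set of bridge edges of a simple undirected graph."""
    disc, low, bridges, t = {}, {}, set(), [0]

    def dfs(u, parent):
        disc[u] = low[u] = t[0]
        t[0] += 1
        for v in adj[u]:
            if v == parent:           # skip the tree edge back (simple graph)
                continue
            if v in disc:             # back edge: update low-link
                low[u] = min(low[u], disc[v])
            else:
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if low[v] > disc[u]:  # no back edge past u: (u, v) is a bridge
                    bridges.add(frozenset((u, v)))

    for s in adj:
        if s not in disc:
            dfs(s, None)
    return bridges

def ring_cage_fragments(edges):
    """Connected fragments left after removing all bridge (acyclic) bonds."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    bridges = find_bridges(adj)
    keep = defaultdict(set)           # adjacency of the bridge-free subgraph
    for u, v in edges:
        if frozenset((u, v)) not in bridges:
            keep[u].add(v)
            keep[v].add(u)
    seen, frags = set(), []
    for s in keep:
        if s in seen:
            continue
        comp, stack = set(), [s]
        while stack:                  # flood-fill one fragment
            n = stack.pop()
            if n not in comp:
                comp.add(n)
                seen.add(n)
                stack.extend(keep[n] - comp)
        frags.append(sorted(comp))
    return frags

# Fused bicyclic ring system with one pendant atom (node 7): the pendant
# bond (2, 7) is a bridge and is stripped; the fused rings survive whole.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 6), (6, 3), (2, 7)]
print(ring_cage_fragments(edges))  # -> [[0, 1, 2, 3, 4, 5, 6]]
```

Enumerating the distinct fragments such a routine returns over a molecule library is one plausible way to count ring/cage topologies in the spirit of the NBG coverage measure described above.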