🤖 AI Summary
Predicting quantum dynamical responses is computationally demanding and data-hungry because the underlying systems evolve in high-dimensional Hilbert spaces, and ensuring physical consistency remains challenging. To address this, we propose the Chain of KANs architecture: a time-causal chain of Kolmogorov–Arnold Networks (KANs) trained with a physics-informed loss function based on the Ehrenfest theorems, enabling interpretable and mathematically rigorous modeling of quantum time evolution. Compared with benchmark models such as temporal convolutional networks (TCNs), our method achieves superior accuracy with only 200 training samples (5.4% of the TCN's requirement), sharply reducing data demand. It also suppresses spurious oscillations, improving generalization while enforcing quantum mechanical constraints. This work establishes a new paradigm for high-fidelity quantum dynamical modeling in the small-data regime.
📝 Abstract
The prediction of quantum dynamical responses lies at the heart of modern physics. Yet modeling these time-dependent behaviors remains a formidable challenge because quantum systems evolve in high-dimensional Hilbert spaces, often rendering traditional numerical methods computationally prohibitive. While large language models have achieved remarkable success in sequential prediction, quantum dynamics presents a fundamentally different challenge: forecasting the entire temporal evolution of a quantum system rather than merely the next element in a sequence. Existing neural architectures such as recurrent and convolutional networks often require vast training datasets and suffer from spurious oscillations that compromise physical interpretability. In this work, we introduce a fundamentally new approach: Kolmogorov–Arnold Networks (KANs) augmented with physics-informed loss functions that enforce the Ehrenfest theorems. Our method achieves superior accuracy with significantly less training data, requiring only 200 samples, 5.4 percent of the 3,700 needed by temporal convolutional networks (TCNs). We further introduce the Chain of KANs, a novel architecture that embeds temporal causality directly into the model design, making it particularly well suited for time-series modeling. Our results demonstrate that physics-informed KANs offer a compelling advantage over conventional black-box models, maintaining both mathematical rigor and physical consistency while dramatically reducing data requirements.
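To make the Ehrenfest-based penalty concrete, the sketch below shows one plausible way such a physics-informed loss term could be formed on predicted expectation-value trajectories. The abstract does not give the paper's actual loss, so the function name, the finite-difference scheme, and the free-particle test case are all illustrative assumptions. The Ehrenfest theorems state that d⟨x⟩/dt = ⟨p⟩/m and d⟨p⟩/dt = −⟨∂V/∂x⟩, so the loss penalizes the squared residuals of both relations.

```python
import numpy as np

def ehrenfest_residual_loss(x_mean, p_mean, dVdx_mean, dt, mass=1.0):
    """Illustrative physics-informed penalty on predicted expectation
    values (hypothetical; not the paper's exact loss).

    Ehrenfest theorems:
        d<x>/dt = <p>/m
        d<p>/dt = -<dV/dx>
    """
    # Finite differences approximate the time derivatives of the
    # predicted expectation-value trajectories.
    dxdt = np.gradient(x_mean, dt)
    dpdt = np.gradient(p_mean, dt)
    # Squared residuals of the two Ehrenfest relations.
    r1 = dxdt - p_mean / mass
    r2 = dpdt + dVdx_mean
    return np.mean(r1**2) + np.mean(r2**2)

# Sanity check: a free particle (V = 0) with <x>(t) = x0 + (p0/m) t
# and constant <p> satisfies both theorems exactly, so the penalty
# should vanish (up to floating-point error).
t = np.linspace(0.0, 1.0, 101)
x_mean = 0.5 + 2.0 * t            # <x>(t) with p0/m = 2
p_mean = np.full_like(t, 2.0)     # constant <p>
dVdx = np.zeros_like(t)           # free particle: <dV/dx> = 0
loss = ehrenfest_residual_loss(x_mean, p_mean, dVdx, t[1] - t[0])
```

In training, a term like this would be added to the ordinary data-fitting loss with a weighting coefficient, steering the network toward trajectories that respect quantum mechanical constraints even in the small-data regime.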