Unlearnable phases of matter

📅 2026-02-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates fundamental limitations of unsupervised machine learning in identifying mixed-state quantum phases that are locally indistinguishable yet carry global structure. By analyzing how autoregressive neural networks fail to capture distributions with long-range conditional mutual information (CMI), the study establishes CMI as an effective diagnostic for local indistinguishability and proves, within the statistical query model, that learning such phases is computationally hard. Notably, it proposes hardness of learning itself as a probe for detecting mixed-state phases, phase transitions, and quantum error correction thresholds, introducing CMI and "non-local Gibbsness" as quantifiable measures of learnability. The theoretical predictions are validated with RNN, CNN, and Transformer architectures on systems exhibiting strong and weak spontaneous symmetry breaking, as well as on topological codes such as the surface code under bit-flip noise.
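The summary's central diagnostic, conditional mutual information I(A:C|B), is directly computable for small classical distributions. The sketch below is illustrative (the function name `cmi` and the two toy distributions are ours, not the paper's): a Markov-chain distribution where B screens A from C has zero CMI, while a distribution with correlations between A and C that bypass B has long-range CMI.

```python
import numpy as np

def cmi(p):
    """I(A:C|B) in bits for a joint distribution p[a, b, c],
    via I(A:C|B) = H(A,B) + H(B,C) - H(B) - H(A,B,C)."""
    def H(q):
        q = q[q > 0]
        return -np.sum(q * np.log2(q))
    return H(p.sum(axis=2)) + H(p.sum(axis=0)) - H(p.sum(axis=(0, 2))) - H(p)

# Short-range example: a = b = c, so B screens A from C and I(A:C|B) = 0.
p_markov = np.zeros((2, 2, 2))
p_markov[0, 0, 0] = p_markov[1, 1, 1] = 0.5

# Long-range example: a = c while B is independent uniform noise, so
# conditioning on B removes nothing and I(A:C|B) = I(A:C) = 1 bit.
p_longrange = np.zeros((2, 2, 2))
p_longrange[0, 0, 0] = p_longrange[0, 1, 0] = 0.25
p_longrange[1, 0, 1] = p_longrange[1, 1, 1] = 0.25

print(cmi(p_markov))     # 0.0
print(cmi(p_longrange))  # 1.0
```

Here A, B, C would correspond to a spatial tripartition of a spin configuration, with B separating A from C; the paper's claim is that distributions whose CMI stays large as B grows are the hard-to-learn ones.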

📝 Abstract
We identify fundamental limitations in machine learning by demonstrating that non-trivial mixed-state phases of matter are computationally hard to learn. Focusing on unsupervised learning of distributions, we show that autoregressive neural networks fail to learn global properties of distributions characterized by locally indistinguishable (LI) states. We demonstrate that conditional mutual information (CMI) is a useful diagnostic for LI: we show that for classical distributions, long-range CMI of a state implies a spatially LI partner. By introducing a restricted statistical query model, we prove that nontrivial phases with long-range CMI, such as strong-to-weak spontaneous symmetry breaking phases, are hard to learn. We validate our claims by using recurrent, convolutional, and Transformer neural networks to learn the syndrome and physical distributions of toric/surface code under bit-flip noise. Our findings suggest hardness of learning as a diagnostic tool for detecting mixed-state phases and transitions and error-correction thresholds, and they suggest CMI and more generally "non-local Gibbsness" as metrics for how hard a distribution is to learn.
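The "syndrome distributions under bit-flip noise" that the networks are trained on can be illustrated with a toy sampler. This is a deliberately simplified stand-in, not the paper's setup: a 1D repetition code replaces the 2D toric/surface code, and the function name `sample_syndromes` and the seed are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

def sample_syndromes(n, p, shots):
    """Draw i.i.d. bit-flip errors e on an n-bit repetition code and
    return the parity-check syndromes s_i = e_i XOR e_{i+1}."""
    errors = (rng.random((shots, n)) < p).astype(np.int8)
    return errors[:, :-1] ^ errors[:, 1:]

# 2000 syndrome samples on 32 bits at 10% bit-flip rate; each sample
# has n - 1 = 31 parity checks.
syndromes = sample_syndromes(n=32, p=0.1, shots=2000)
print(syndromes.shape)  # (2000, 31)
```

An autoregressive model trained on such samples would factorize the syndrome distribution check by check; the paper's point is that near and above the error-correction threshold, the long-range structure of this distribution makes that learning task hard.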
Problem

Research questions and friction points this paper is trying to address.

unlearnable phases
mixed-state phases
locally indistinguishable states
conditional mutual information
statistical query model
Innovation

Methods, ideas, or system contributions that make the work stand out.

unlearnable phases
conditional mutual information
locally indistinguishable states
statistical query model
non-local Gibbsness
Tarun Advaith Kumar
Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5, Canada; Department of Physics and Astronomy, University of Waterloo, Ontario, N2L 3G1, Canada
Yijian Zou
Perimeter Institute
Tensor network · Condensed matter theory
Amir-Reza Negari
Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5, Canada; Department of Physics and Astronomy, University of Waterloo, Ontario, N2L 3G1, Canada
Roger G. Melko
Faculty, University of Waterloo & Perimeter Institute; Affiliate, Vector Institute
Condensed Matter Theory · Quantum Information · Machine Learning
Timothy H. Hsieh
Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5, Canada; Department of Physics and Astronomy, University of Waterloo, Ontario, N2L 3G1, Canada