QI: Quantum Information Division (Fachverband Quanteninformation)
QI 9: Quantum Machine Learning and Classical Simulability
QI 9.11: Talk
Tuesday, March 19, 2024, 12:30–12:45, HFT-FT 101
Understanding quantum machine learning also requires rethinking generalization — •Elies Gil-Fuster1,2, Jens Eisert1,2,3, and Carlos Bravo-Prieto1 — 1Dahlem Center for Complex Quantum Systems, Freie Universität Berlin — 2Fraunhofer Heinrich Hertz Institute, Berlin — 3Helmholtz-Zentrum Berlin für Materialien und Energie
Quantum machine learning models have shown successful generalization performance even when trained with few data. In this work, through systematic randomization experiments, we show that traditional approaches to understanding generalization fail to explain the behavior of such quantum models. Our experiments reveal that state-of-the-art quantum neural networks accurately fit random states and random labeling of training data. This ability to memorize random data defies current notions of small generalization error, problematizing approaches that build on complexity measures such as the VC dimension, the Rademacher complexity, and all their uniform relatives. We complement our empirical results with a theoretical construction showing that quantum neural networks can fit arbitrary labels to quantum states, hinting at their memorization ability. Our results do not preclude the possibility of good generalization with few training data but rather rule out any possible guarantees based only on the properties of the model family. These findings expose a fundamental challenge in the conventional understanding of generalization in quantum machine learning and highlight the need for a paradigm shift in the design of quantum models for machine learning tasks.
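The label-randomization idea behind these experiments can be illustrated with a short, self-contained toy sketch. The code below is a hypothetical illustration, not the models or datasets studied in the work: it assumes a small PennyLane variational circuit (angle embedding followed by strongly entangling layers) and trains it to fit purely random ±1 labels assigned to random inputs. The circuit size, optimizer, and hyperparameters are illustrative assumptions.

import numpy as onp                      # plain NumPy for data generation
import pennylane as qml
from pennylane import numpy as pnp       # autograd-backed NumPy for trainable weights

n_qubits, n_layers, n_samples = 2, 4, 8  # illustrative sizes, small enough to simulate quickly
onp.random.seed(0)

# Angle-encoded inputs and purely random +/-1 labels: by construction there is
# no structure relating X to y, so any successful fit is memorization.
X = onp.random.uniform(0.0, onp.pi, size=(n_samples, n_qubits))
y = onp.random.choice([-1.0, 1.0], size=n_samples)

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def model(weights, x):
    qml.AngleEmbedding(x, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))     # prediction in [-1, 1]

def cost(weights):
    # mean squared error between circuit output and the random labels
    loss = 0.0
    for x, label in zip(X, y):
        loss = loss + (model(weights, x) - label) ** 2
    return loss / n_samples

shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
weights = pnp.array(onp.random.uniform(0.0, 2.0 * onp.pi, size=shape), requires_grad=True)

opt = qml.GradientDescentOptimizer(stepsize=0.3)
for step in range(201):
    weights, c = opt.step_and_cost(cost, weights)
    if step % 50 == 0:
        print(f"step {step:3d}  training loss on random labels: {float(c):.4f}")

Whether such a tiny circuit fits all eight random labels depends on its expressivity; the point made in the abstract is that state-of-the-art quantum neural networks do fit random states and random labelings, so a small training error by itself carries no generalization guarantee.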
Keywords: supervised learning; learning theory; quantum phase recognition; generalization