DPG Verhandlungen

Bonn 2025 – scientific programme

QI: Quantum Information Division (Fachverband Quanteninformation)

QI 36: Poster – Quantum Information (joint session QI/Q)

QI 36.61: Poster

Thursday, March 13, 2025, 17:00–19:00, Tent

Is Localization a security threat in Quantum Machine Learning?

•Yannick Werner¹, Nikolaos Palaiodimopoulos¹,², Omid Faizy²,³, Nico Piatkowski⁴, Paul Lukowicz¹,², and Maximilian Kiefer-Emmanouilidis¹,²

¹DFKI Kaiserslautern — ²RPTU Kaiserslautern-Landau — ³Sorbonne Université, Paris — ⁴Fraunhofer IAIS, Sankt Augustin

As Quantum Machine Learning (QML) matures and sees wider commercial use, addressing its security risks becomes essential. We treat Quantum Neural Networks (QNNs) as disordered quantum systems to explore whether effects such as Many-Body Localization (MBL) could affect QNN tasks like classifying or generating data. It has been shown that applying a simple cyclic permutation after embedding the data and before readout can recover complex classical data from the measurements of a single disorder realization [1]. This suggests that a trained QNN, which effectively represents such a single disorder realization, could leak the sensitive data it is meant to classify. For instance, an eavesdropper might reconstruct sensitive input data from stolen measurement results, a risk that does not arise with classical classifiers. To address this, we analyse shallow variational quantum circuits with nearest-neighbour interactions and strongly varying weights, where MBL dynamics are expected. We assess their vulnerability to data recovery and examine the trade-off between expressibility, trainability, and security in QNN designs.

[1] arXiv:2409.16180v1 (2024).

Keywords: Disordered Systems; Machine Learning; Security; Localization; Quantum Neural Networks
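
The following is a minimal sketch of the setting described above, not the authors' implementation: classical features are angle-embedded into a shallow circuit of nearest-neighbour entanglers with strongly varying, fixed random single-qubit weights (one disorder realization), and a cyclic permutation of the qubits is applied between embedding and readout, as in the recovery scheme of [1]. The concrete gate choices (RY embedding, CZ entanglers, random RZ/RX rotations, local Z readout) and all parameter values are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Elementary single-qubit operators.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def rot(pauli, theta):
    # Rotation exp(-i * theta/2 * pauli).
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * pauli

def kron_all(ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def on_qubit(n, q, gate):
    # Embed a single-qubit gate on qubit q of an n-qubit register.
    return kron_all([gate if k == q else I2 for k in range(n)])

def cz(n, q):
    # Controlled-Z between nearest neighbours q and q+1.
    P0 = np.diag([1, 0]).astype(complex)   # |0><0| on qubit q
    P1 = np.diag([0, 1]).astype(complex)   # |1><1| on qubit q
    ops0 = [P0 if k == q else I2 for k in range(n)]
    ops1 = [P1 if k == q else I2 for k in range(n)]
    ops1[q + 1] = Z
    return kron_all(ops0) + kron_all(ops1)

def cyclic_shift(n):
    # Permutation matrix that moves qubit q to qubit q+1 (mod n).
    dim = 2 ** n
    P = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        shifted = bits[-1:] + bits[:-1]
        j = sum(b << (n - 1 - q) for q, b in enumerate(shifted))
        P[j, i] = 1.0
    return P

n = 4
x = np.array([0.3, 1.1, 2.0, 0.7])           # toy classical input features

# Angle embedding: RY(x_q) on each qubit, starting from |0...0>.
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0
for q in range(n):
    state = on_qubit(n, q, rot(Y, x[q])) @ state

# One shallow layer: nearest-neighbour CZ entanglers followed by strongly
# varying (random but fixed) single-qubit rotations -- the trained QNN viewed
# as a single disorder realization.
for q in range(n - 1):
    state = cz(n, q) @ state
theta = rng.uniform(-np.pi, np.pi, size=(n, 2))
for q in range(n):
    state = on_qubit(n, q, rot(Z, theta[q, 0]) @ rot(X, theta[q, 1])) @ state

# Cyclic permutation of the qubits between embedding/processing and readout.
state = cyclic_shift(n) @ state

# Local Z expectation values: the kind of measurement record an eavesdropper
# might obtain and try to invert to recover the embedded data.
z_readout = [float(np.real(state.conj() @ on_qubit(n, q, Z) @ state)) for q in range(n)]
print(np.round(z_readout, 3))

Here the randomly drawn theta stand in for a fixed set of trained weights; in the MBL-like regime discussed in the abstract, such a shallow disordered circuit retains memory of the locally embedded inputs, which is what makes recovery from the measurement record conceivable.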
