
Q: Quantum Optics and Photonics Division

Q 63: Poster – Quantum Information (joint session QI/Q)

Q 63.51: Poster

Thursday, 13 March 2025, 17:00–19:00, Tent

Quantum Generative Modelling with Conservation Law based Pretraining — •Akash Malemath¹·², Yannick Werner³, Paul Lukowicz¹·³, and Maximilian Kiefer-Emmanouilidis¹·²·³ — ¹Department of Computer Science and Research Initiative QCAI, RPTU Kaiserslautern-Landau — ²Department of Physics, RPTU Kaiserslautern-Landau — ³DFKI Kaiserslautern

Abstract:

Compared to the recent advances in classical generative AI, quantum generative models still lack the capability to generate complex data effectively. One of the greatest challenges in classical AI is developing systems that extract fundamental relationships from large datasets and encode them into suitable embeddings. In quantum generative AI, these concepts are still at an early stage and are mostly learned using classical methods.

In this work, we evaluate embeddings inspired by conservation laws as a pretraining step, applying them to simple quantum generative models such as the Quantum Circuit Born Machine (QCBM). This implicit generative model is well suited to reproducing target distributions and is simple enough to demonstrate the benefits of pretraining. Specifically, we explore pretraining using the particle number distribution and the system Hamiltonian within the QCBM, aiming to model target distributions with reduced effort. Our analysis focuses on the impact of pretraining on model convergence and accuracy, using metrics such as the Kullback-Leibler (KL) divergence, and compares pretrained models with models trained from scratch.
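To make the setup concrete, the following is a minimal sketch (not the authors' implementation) of the two ingredients the abstract names: a QCBM, i.e. a parameterized circuit whose measurement probabilities follow the Born rule, and the KL divergence between a target distribution and the model distribution. The two-qubit ansatz, the particle-number-conserving target (equal weight on |01⟩ and |10⟩), and the parameter values are illustrative assumptions.

```python
import numpy as np

def ry(theta):
    # Single-qubit RY rotation matrix.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def qcbm_probs(thetas):
    # Toy 2-qubit QCBM: one RY per qubit followed by a CNOT,
    # applied to |00>. Returns Born-rule probabilities |psi_x|^2
    # over the basis states |00>, |01>, |10>, |11>.
    state = np.zeros(4)
    state[0] = 1.0                              # start in |00>
    layer = np.kron(ry(thetas[0]), ry(thetas[1]))
    cnot = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)
    state = cnot @ (layer @ state)
    return state ** 2                           # real amplitudes here

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q); terms with p_x = 0 contribute 0, and q is
    # clipped to avoid log(0) on unsupported outcomes.
    p, q = np.asarray(p, float), np.clip(np.asarray(q, float), eps, None)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Target from a conservation law: uniform over the one-particle
# sector {|01>, |10>} (fixed particle number 1).
target = np.array([0.0, 0.5, 0.5, 0.0])

# Parameters chosen so the circuit prepares exactly this sector:
probs = qcbm_probs([np.pi / 2, np.pi])
print(probs)                          # ~ [0, 0.5, 0.5, 0]
print(kl_divergence(target, probs))   # ~ 0
```

In a training loop one would minimize `kl_divergence(target, qcbm_probs(thetas))` over the angles; pretraining in the spirit of the abstract would first fit the circuit to a symmetry-respecting distribution (e.g. a particle number distribution) before training on the actual target.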

Keywords: Quantum Generative models; Quantum Circuit Born Machines (QCBM); Kullback-Leibler (KL) Divergence

DPG-Verhandlungen 2025, Bonn