
DY: Dynamics and Statistical Physics Division

DY 34: Poster: Machine Learning, Data Science, and Reservoir Computing

DY 34.4: Poster

Wednesday, 20 March 2024, 15:00–18:00, Poster C

Physical interpretation of learning dynamics in neural networks — •Yannick Mühlhäuser¹,², Max Weinmann²,³, and Miriam Klopotek² — ¹University of Tübingen, Tübingen, Germany — ²University of Stuttgart, Stuttgart Center for Simulation Science, SimTech Cluster of Excellence EXC 2075, Stuttgart, Germany — ³University of Stuttgart, Interchange Forum for Reflecting on Intelligent Systems, IRIS3D, Stuttgart, Germany

Neural network-based machine learning methods are becoming ubiquitous in physics and across the sciences. A key challenge for their seamless integration into science is their opacity, or “black-box-ness”. How they learn, i.e. their learning dynamics, can shed some light on their “reasoning” process. We study the learning dynamics of autoencoder-type neural networks trained with different optimization techniques [1]. We use statistical model systems to identify specific analogies to well-known phenomena from physics, such as phase transitions [2], offering a route towards interpretation.

[1] Borysenko, O., and Byshkin, M. (2021). CoolMomentum: A method for stochastic optimization by Langevin dynamics with simulated annealing. Scientific Reports, 11(1), 10705.
[2] Liu, Z., Kitouni, O., Nolte, N. S., Michaud, E., Tegmark, M., and Williams, M. (2022). Towards understanding grokking: An effective theory of representation learning. Advances in Neural Information Processing Systems, 35, 34651–34663.
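The kind of observable underlying this analysis is the learning curve of an autoencoder recorded over training. The following is a minimal sketch of such a setup: a toy fully connected autoencoder on synthetic Gaussian data, trained in PyTorch with SGD whose momentum coefficient is gradually reduced as a crude "cooling" schedule. All names, dimensions, and the schedule are illustrative assumptions; this is neither the authors' setup nor the CoolMomentum algorithm of Ref. [1].

```python
# Minimal sketch (assumptions: synthetic data, a toy autoencoder, a
# hand-rolled momentum schedule -- NOT the authors' setup and NOT the
# CoolMomentum algorithm of Ref. [1]).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: 1024 samples of 32-dimensional Gaussian noise (placeholder
# for whatever statistical model system is actually studied).
data = torch.randn(1024, 32)

# Small symmetric autoencoder with a 4-dimensional bottleneck.
model = nn.Sequential(
    nn.Linear(32, 16), nn.ReLU(),
    nn.Linear(16, 4),  nn.ReLU(),
    nn.Linear(4, 16),  nn.ReLU(),
    nn.Linear(16, 32),
)

criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.99)

n_epochs = 200
loss_history = []
for epoch in range(n_epochs):
    # Crude "cooling": lower the momentum coefficient linearly, i.e.
    # raise the effective friction as training proceeds (only loosely
    # analogous to simulated annealing via Langevin dynamics).
    optimizer.param_groups[0]["momentum"] = 0.99 * (1.0 - epoch / n_epochs)

    optimizer.zero_grad()
    reconstruction = model(data)
    loss = criterion(reconstruction, data)
    loss.backward()
    optimizer.step()
    loss_history.append(loss.item())

# loss_history now holds the learning curve, the basic observable from
# which phase-transition-like signatures (plateaus, sudden drops) can be
# read off.
print(loss_history[::20])
```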

Keywords: Machine Learning; Statistical Physics; Phase Transitions; Explainability; Learning Dynamics
