Berlin 2024 – Scientific Programme
DY: Dynamics and Statistical Physics Division
DY 34: Poster: Machine Learning, Data Science, and Reservoir Computing
DY 34.5: Poster
Wednesday, 20 March 2024, 15:00–18:00, Poster C
Understanding Neural Network Models for Phase Recognition — •Shashank Kallappara, Janett Prehl, and Martin Weigel — Institut für Physik, Technische Universität Chemnitz, Chemnitz, Germany
The Ising model is one of the best-known models in statistical physics, undergoing a phase transition in dimensions d ≥ 2 that is described by a simple order parameter, its magnetisation. Machine learning techniques have been used successfully in physics to classify the phases of a variety of physical systems. Fully connected neural networks with only a single hidden layer have been shown to learn the translational invariance of the Ising model when learning its phases; analytic solutions exist for highly compact networks that are constructed to obey the translational invariance automatically. Here, we demonstrate this learning of the invariance in single-hidden-layer networks of different widths and compare the networks’ performance in classifying the phases. We also consider a highly compact network, focusing on the gradient-descent learning dynamics over its loss landscape, and suggest a few changes to the training dynamics that greatly improve its performance while preserving interpretability.
Keywords: Ising Model; Neural Network; Phase Transition
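The following minimal sketch (not the authors' code) illustrates the kind of setup described in the abstract: a fully connected network with a single hidden layer, trained by plain gradient descent to separate ordered from disordered spin configurations. The toy data (nearly aligned versus random 16x16 spin grids in place of Monte Carlo samples), the hidden-layer width, and the learning rate are illustrative assumptions, not values from the poster.

# Sketch only: single-hidden-layer phase classifier on toy Ising-like data.
import torch
import torch.nn as nn

L = 16            # linear lattice size (illustrative)
N = L * L         # spins per configuration
HIDDEN = 8        # hidden-layer width (illustrative)
N_SAMPLES = 2000  # configurations per class

def toy_configs(n, p_flip):
    """All-up spin grids with each spin flipped with probability p_flip."""
    spins = torch.ones(n, N)
    flip = torch.rand(n, N) < p_flip
    spins[flip] = -1.0
    # random global sign so both magnetisation signs appear in the ordered class
    sign = (torch.rand(n, 1) < 0.5).float() * 2 - 1
    return spins * sign

# ordered (low-temperature-like) vs. disordered (high-temperature-like) toy data
x = torch.cat([toy_configs(N_SAMPLES, 0.05), toy_configs(N_SAMPLES, 0.5)])
y = torch.cat([torch.zeros(N_SAMPLES, dtype=torch.long),
               torch.ones(N_SAMPLES, dtype=torch.long)])

model = nn.Sequential(
    nn.Linear(N, HIDDEN),   # single fully connected hidden layer
    nn.ReLU(),
    nn.Linear(HIDDEN, 2),   # two outputs: ordered / disordered
)
opt = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

# plain gradient descent on the full toy data set
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

acc = (model(x).argmax(dim=1) == y).float().mean().item()
print(f"final loss {loss.item():.3f}, training accuracy {acc:.3f}")

Inspecting the learned first-layer weights of such a network is one simple way to probe whether the translational invariance of the problem is reflected in the trained parameters.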