Berlin 2024 – Scientific Programme
DY: Dynamics and Statistical Physics Division (Fachverband Dynamik und Statistische Physik)
DY 34: Poster: Machine Learning, Data Science, and Reservoir Computing
DY 34.10: Poster
Wednesday, March 20, 2024, 15:00–18:00, Poster C
Phase Transitions and Information Flow in Deep Neural Networks — •Ibrahim Talha Ersoy and Karoline Wiesner — Universität Potsdam, Institut für Astronomie und Physik, Potsdam, Germany
The learning process of neural networks (NNs) can be characterized as an optimization toward a specific balance between the complexity of the representation and its precision, both of which can be measured by mutual information. This observation, as well as the so-called information bottleneck (IB) approach, in which one restricts complexity, goes back to Tishby et al. [1,2]. In the IB approach, two mutual information terms are built into the loss function, with a trade-off parameter setting their balance. It was observed that the system undergoes a number of second-order phase transitions as this parameter is varied [3]. We utilize this feature to better understand the change between different model representations. A close connection to schemes such as the variational autoencoder (VAE) and other networks with variable regularizers has been suggested [5]. We investigate these claims and present a number of experiments and theoretical considerations to make this connection manifest. In particular, we probe the dependencies of hidden representations and the features they represent. We also compare the compression behaviour of the NN input to that of other methods such as PCA, diffusion maps, and t-SNE. For Gaussian data we find a strong connection between the VAE, the IB, and PCA, as expected.
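For concreteness, the loss function referred to above is usually written as the IB Lagrangian of Tishby et al. [1]; the abstract does not fix a sign or parameter convention, so the following is the standard form:

\[ \mathcal{L}\big[p(t \mid x)\big] \;=\; I(X;T) \;-\; \beta\, I(T;Y), \]

where T is the compressed representation of the input X, I(X;T) measures the complexity of the representation, I(T;Y) measures its precision with respect to the relevant variable Y, and the trade-off parameter β sets the balance between the two terms.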
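The second-order transitions mentioned above have an exactly solvable analogue for jointly Gaussian variables (the Gaussian IB of Chechik et al., JMLR 2005): the optimal encoder is linear, and a new eigen-direction becomes active at each critical value β_i = 1/(1 − λ_i), where λ_i are the eigenvalues of Σ_{x|y} Σ_x^{-1}. The following minimal sketch computes these critical points on synthetic data; the covariance values and variable names are illustrative assumptions, not the setup used in the poster.

# Minimal sketch (not the authors' code): critical trade-off values of the
# Gaussian information bottleneck, where second-order transitions occur.
import numpy as np

rng = np.random.default_rng(0)

# Toy jointly Gaussian pair: X in R^3, Y = W X + observation noise
n = 20000
X = rng.multivariate_normal(np.zeros(3),
                            [[3.0, 1.0, 0.2],
                             [1.0, 2.0, 0.0],
                             [0.2, 0.0, 0.5]], size=n)
W = np.array([[1.0, -0.5, 0.0],
              [0.0,  1.0, 1.0]])
Y = X @ W.T + rng.normal(scale=1.0, size=(n, 2))

# Empirical covariances and cross-covariance
Sx  = np.cov(X, rowvar=False)
Sy  = np.cov(Y, rowvar=False)
Sxy = np.cov(np.hstack([X, Y]), rowvar=False)[:3, 3:]

# Conditional covariance Sigma_{x|y} = Sx - Sxy Sy^{-1} Syx
Sx_given_y = Sx - Sxy @ np.linalg.solve(Sy, Sxy.T)

# Eigenvalues of Sigma_{x|y} Sigma_x^{-1}; each lambda_i < 1 yields a
# critical beta_i = 1 / (1 - lambda_i) at which one more linear feature
# of the optimal encoder becomes active.
lams = np.sort(np.linalg.eigvals(Sx_given_y @ np.linalg.inv(Sx)).real)
betas = 1.0 / (1.0 - lams[lams < 1.0 - 1e-9])
print("critical beta values:", betas)

Each printed value marks a trade-off strength at which the optimal Gaussian encoder gains one additional linear direction, i.e. a second-order transition of the kind discussed above; for Gaussian inputs these linear solutions underlie the VAE/IB/PCA connection reported in the abstract.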
Keywords: Deep Learning; Neural Networks; Information Bottleneck; Phase Transition; Variational Autoencoder