SOE: Physics of Socio-economic Systems Division
SOE 8: Machine Learning in Dynamics and Statistical Physics II (joint session DY/SOE)
SOE 8.11: Talk
Tuesday, March 19, 2024, 12:15–12:30, BH-N 243
Emergent oscillating dimensionality transformations in deep learning — •Pascal de Jong, Felix J. Meigel, and Steffen Rulands — Arnold Sommerfeld Center for Theoretical Physics, Department of Physics, Ludwig-Maximilians-Universität, München, Germany
Artificial intelligence relies on deep neural networks (DNNs), which comprise large sets of nonlinear nodes connected by weights. The functioning of DNNs and their ability to generalize to unseen data are examples of complex behavior which, owing to its highly nonlinear nature, is poorly understood. Here, we show that training DNNs universally leads to oscillating weight topologies that alter the embedding dimensions of the hidden data representations in different layers. Specifically, using a path representation of DNNs, we derive equations for the time evolution of the weights. We show that training leads to a structure in which the weights are focused on a subset of nodes, and that the degree of focusing oscillates across layers. We confirm these findings empirically by studying the training dynamics of large DNNs on different data sets. Finally, we show that these structures imply a repeated decrease and increase in the dimensionality of the hidden data representations. Our results highlight that emergent dynamics during training can lead to universal network topologies, with implications for the function of DNNs.
Keywords: Deep neural networks; Emergence; Structure formation
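The abstract refers to two layer-wise quantities: the embedding dimension of the hidden data representations and the degree to which weights are focused on a subset of nodes. The sketch below illustrates one common way such quantities can be estimated; the abstract does not specify the authors' estimators, so the participation-ratio measures, the network widths, and the random data used here are assumptions for illustration only, applied to an untrained toy MLP rather than to the trained networks studied in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def participation_ratio(x):
    """Effective (embedding) dimension of samples x with shape (n_samples, n_features),
    estimated as (sum_i lambda_i)^2 / sum_i lambda_i^2 over covariance eigenvalues lambda_i."""
    x = x - x.mean(axis=0, keepdims=True)
    cov = x.T @ x / (x.shape[0] - 1)
    lam = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
    return lam.sum() ** 2 / (lam ** 2).sum()

# Hypothetical small MLP: random weights stand in for a trained network.
layer_widths = [64, 128, 128, 128, 128, 16]
weights = [rng.normal(scale=1.0 / np.sqrt(m), size=(m, n))
           for m, n in zip(layer_widths[:-1], layer_widths[1:])]

# Random inputs stand in for a real data set.
h = rng.normal(size=(2000, layer_widths[0]))

for layer, w in enumerate(weights, start=1):
    h = np.maximum(h @ w, 0.0)            # ReLU hidden representation of this layer
    dim = participation_ratio(h)          # effective dimension of the representation
    # "Focusing" proxy: participation ratio of per-node outgoing weight mass,
    # i.e. over how many nodes the weights of this layer are effectively spread.
    node_mass = np.linalg.norm(w, axis=1) ** 2
    spread = node_mass.sum() ** 2 / (node_mass ** 2).sum()
    print(f"layer {layer}: effective dim ~ {dim:.1f}, "
          f"weight mass spread over ~ {spread:.1f} of {w.shape[0]} nodes")
```

Tracking both quantities across layers and over training epochs would reveal the kind of alternating compression and expansion of representations, and the oscillating weight focusing, that the abstract describes.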