Berlin 2024 – scientific programme

DY: Dynamics and Statistical Physics Division (Fachverband Dynamik und Statistische Physik)

DY 19: Machine Learning in Dynamics and Statistical Physics II (joint session DY/SOE)

DY 19.10: Talk

Tuesday, March 19, 2024, 12:00–12:15, BH-N 243

Loss is More: Exploring the weight space of a perceptron via enhanced sampling techniques — •Margherita Mele¹, Roberto Menichetti¹, Alessandro Ingrosso², and Raffaello Potestio¹ — ¹Physics Department, University of Trento, via Sommarive 14, I-38123 Trento, Italy — ²The Abdus Salam International Centre for Theoretical Physics (ICTP), Trieste, Italy

Understanding how the properties of input data influence the learning process in artificial networks is crucial. The assumption of Gaussian i.i.d. inputs has long been foundational, yet it is now essential to question its constraints. Our approach utilises enhanced sampling methods from soft matter physics to exhaustively explore the loss profile and reconstruct the density of states of networks with discrete weights, addressing optimization in highly rugged landscapes even in simple architectures. These methods remain effective on real datasets and enable a systematic exploration of the impact of data dimensionality and structure.
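
To illustrate the kind of approach described here (a minimal sketch, not the authors' code), one can estimate the density of states of a binary-weight perceptron with a flat-histogram Wang-Landau walk, using the number of misclassified training patterns as the energy; the dataset, system sizes, sweep length, and flatness criterion below are illustrative assumptions.

# Minimal sketch: Wang-Landau sampling of the density of states of a
# binary-weight perceptron. Energy = number of misclassified patterns.
import numpy as np

rng = np.random.default_rng(0)

N, P = 21, 40                      # input dimension, number of patterns (assumed)
X = rng.standard_normal((P, N))    # Gaussian i.i.d. inputs (the baseline assumption)
teacher = rng.choice([-1, 1], N)   # teacher perceptron defining the labels
y = np.sign(X @ teacher)

def energy(w):
    """Number of training errors of a student perceptron with weights w."""
    return int(np.sum(np.sign(X @ w) != y))

w = rng.choice([-1, 1], N)         # student weights, discrete {-1, +1}
E = energy(w)

log_g = np.zeros(P + 1)            # log density of states over E = 0..P
hist = np.zeros(P + 1)
log_f = 1.0                        # Wang-Landau modification factor, ln f

while log_f > 1e-4:
    for _ in range(20000):
        i = rng.integers(N)        # propose a single-weight flip
        w[i] *= -1
        E_new = energy(w)
        # accept with probability min(1, g(E) / g(E_new))
        if np.log(rng.random()) < log_g[E] - log_g[E_new]:
            E = E_new
        else:
            w[i] *= -1             # reject: undo the flip
        log_g[E] += log_f          # update the running density-of-states estimate
        hist[E] += 1
    visited = hist > 0
    if hist[visited].min() > 0.8 * hist[visited].mean():   # crude flatness check
        hist[:] = 0
        log_f /= 2.0               # refine the modification factor

# log_g now estimates, up to an additive constant, the log number of weight
# configurations at each training error E; the E = 0 bin counts zero-error solutions.
print(log_g - log_g.max())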

Employing benchmarks (e.g. MNIST, FashionMNIST) and in silico datasets, our study investigates the role of various input-data properties, including class imbalance, class separation, item mislabelling, and input-output correlation. Our findings bridge theoretical and applied aspects, shedding light on the limitations and possible extensions of the Gaussian i.i.d. assumption. This work provides pivotal insights into the interplay between input-data properties and network learning, advancing our understanding of how artificial networks adapt to different information contexts.
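
A rough sketch of what such a controlled in silico dataset might look like (parameter names and defaults are assumptions, not the authors' protocol): two Gaussian classes with tunable imbalance, separation, and fraction of mislabelled items.

# Minimal sketch: synthetic binary dataset with controllable input-data properties.
import numpy as np

def make_dataset(n=1000, dim=20, imbalance=0.7, separation=1.5,
                 mislabel_frac=0.05, seed=0):
    """Two Gaussian classes in `dim` dimensions.

    imbalance      -- fraction of samples assigned to class +1
    separation     -- distance between the class means along the first axis
    mislabel_frac  -- fraction of items whose label is flipped at random
    """
    rng = np.random.default_rng(seed)
    n_pos = int(imbalance * n)
    labels = np.concatenate([np.ones(n_pos), -np.ones(n - n_pos)])
    shift = np.zeros(dim)
    shift[0] = separation / 2.0
    X = rng.standard_normal((n, dim)) + labels[:, None] * shift   # class-dependent mean
    flip = rng.random(n) < mislabel_frac                          # mislabelled items
    labels[flip] *= -1
    return X, labels

X, y = make_dataset()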

Keywords: neural network; enhanced sampling; entropy
