SMuK 2023 – Scientific Programme
MP: Theoretical and Mathematical Foundations of Physics Division (Fachverband Theoretische und Mathematische Grundlagen der Physik)
MP 6: AI Topical Day – Neural Networks and Computational Complexity (joint session MP/AKPIK)
MP 6.2: Invited Talk
Wednesday, March 22, 2023, 11:30–12:00, ZEU/0250
Deep neural networks and the renormalization group — •Ro Jefferson1, Johanna Erdmenger2, and Kevin Grosvenor3 — 1Utrecht University — 2University of Würzburg — 3Leiden University
Despite the success of deep neural networks (DNNs) on an impressive range of tasks, they are generally treated as black boxes, with performance relying on heuristics and trial-and-error rather than any explanatory theoretical framework. Recently, however, techniques and ideas from physics have been applied to DNNs in the hope of distilling the underlying fundamental principles. In this talk, I will discuss some interesting parallels between DNNs and the renormalization group (RG). I will briefly review RG in the context of a simple lattice model, where subsequent RG steps are analogous to subsequent layers in a DNN, in that effective interactions arise after marginalizing hidden degrees of freedom/neurons. I will then quantify the intuitive idea that information is lost along the RG flow by computing the relative entropy in both the Ising model and a feedforward DNN. One finds qualitatively identical behaviour in both systems: the relative entropy increases monotonically to some asymptotic value. On the QFT side, this confirms the link between relative entropy and the c-theorem, while for machine learning, it may have implications for various information-maximization methods, as well as for disentangling compactness and generalizability.
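As a toy illustration of the monotone information loss described above (not the computation from the talk itself), one can exactly enumerate a small 1D Ising chain and follow the standard decimation recursion tanh J' = tanh² J, tracking the relative entropy between the flowed Boltzmann distribution and the initial one. The chain length N = 8, initial coupling J0 = 1.0, and number of RG steps are all illustrative choices, not values from the abstract.

```python
import itertools
import math

def ising_probs(J, N=8):
    """Exact Boltzmann distribution p_J(s) ~ exp(J * sum_i s_i s_{i+1})
    for a periodic 1D Ising chain of N spins (2^N states)."""
    weights = []
    for s in itertools.product([-1, 1], repeat=N):
        energy = sum(s[i] * s[(i + 1) % N] for i in range(N))
        weights.append(math.exp(J * energy))
    Z = sum(weights)
    return [w / Z for w in weights]

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

J0 = 1.0                 # illustrative initial (UV) coupling
p0 = ising_probs(J0)     # reference distribution at the start of the flow
J, Ds = J0, []
for step in range(6):
    Ds.append(relative_entropy(ising_probs(J), p0))
    # 1D decimation RG step (marginalize every other spin):
    # tanh(J') = tanh(J)^2, so the effective coupling flows toward 0.
    J = math.atanh(math.tanh(J) ** 2)

print(Ds)  # starts at 0 and grows monotonically toward an asymptote
```

Because the coupling flows monotonically toward the trivial fixed point J = 0, the divergence from the initial distribution grows at every step and saturates, mirroring the qualitative behaviour the abstract reports for both the Ising model and the feedforward DNN.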