DY: Dynamics and Statistical Physics Division (Fachverband Dynamik und Statistische Physik)
DY 33: Machine Learning in Dynamics and Statistical Physics I
DY 33.3: Talk
Thursday, March 20, 2025, 10:00–10:15, H47
Self-Organizing Global Computation from Local Objective Functions Based on Partial Information Decomposition — Andreas C. Schneider1,2, •Valentin Neuhaus2,1, David A. Ehrlich3, Abdullah Makkeh3, Alexander S. Ecker4,2, Viola Priesemann2,1, and Michael Wibral3 — 1Institute for the Dynamics of Complex Systems, University of Göttingen, Germany — 2Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany — 3Campus Institute for Dynamics of Biological Networks, University of Göttingen — 4Institute of Computer Science and Campus Institute Data Science, University of Göttingen
In modern deep neural networks, the learning dynamics of individual neurons are often obscured by global optimization. In contrast, biological systems use self-organized, local learning to achieve robustness and efficiency with limited global information. We propose a method for achieving self-organization in artificial neurons by defining local learning goals based on information theory. These goals leverage Partial Information Decomposition (PID), which breaks down the information that sources carry about a target into unique, redundant, and synergistic contributions. Our framework enables each neuron to locally determine how its input classes contribute to its output, expressed as a weighted sum of PID terms derived from intuition or numerical optimization. This approach enhances task-relevant local information processing and neuron-level interpretability while maintaining strong performance, providing a principled foundation for local learning strategies.
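As a minimal sketch of what such a weighted-sum objective can look like, using the standard two-source PID decomposition (the weight symbols \gamma below are chosen for illustration and are not taken from the talk): the mutual information between a neuron's output Y and two input sources X_1, X_2 decomposes into unique, redundant, and synergistic atoms,
\[
I(Y : X_1, X_2) = I_{\mathrm{unq}}(Y : X_1 \setminus X_2) + I_{\mathrm{unq}}(Y : X_2 \setminus X_1)
  + I_{\mathrm{red}}(Y : X_1; X_2) + I_{\mathrm{syn}}(Y : X_1; X_2),
\]
and a local, neuron-level goal function can then be written as a weighted combination of these atoms,
\[
G = \gamma_{1}\, I_{\mathrm{unq}}(Y : X_1 \setminus X_2) + \gamma_{2}\, I_{\mathrm{unq}}(Y : X_2 \setminus X_1)
  + \gamma_{\mathrm{red}}\, I_{\mathrm{red}}(Y : X_1; X_2) + \gamma_{\mathrm{syn}}\, I_{\mathrm{syn}}(Y : X_1; X_2),
\]
with the weights set heuristically or tuned numerically, as described in the abstract.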
Keywords: Partial Information Decomposition; Information Theory; Local learning