SKM 2023 – scientific programme
DY: Fachverband Dynamik und Statistische Physik
DY 33: Biologically Inspired Statistical Physics (joint session DY/BP)
DY 33.5: Talk
Wednesday, March 29, 2023, 16:00–16:15, ZEU 250
Quantifying information content in continuous attractor networks — •Tobias Kühn1,2 and Rémi Monasson1 — 1Laboratoire de Physique de l'Ecole Normale Supérieure, ENS, Université PSL, CNRS, Sorbonne Université, Université Paris Cité, F-75005 Paris — 2Institut de la Vision, Sorbonne Université, INSERM, CNRS, F-75012 Paris
Attractor networks are a long-standing paradigm for modeling information storage in the brain. Continuous attractor neural networks (CANN), in particular, have been employed to describe the storage of information about space and orientation. However, it remains controversial how well this paradigm explains actual neural processes, for example the representation of space in grid and place cells in the entorhinal cortex and the hippocampus, respectively.
A common criticism is that the disorder present in the connections might degrade the system's ability to reliably preserve the information about a given pattern. To investigate whether this criticism is valid, a measure is needed that objectively quantifies the information content of a given neural network. Using the replica trick, we compute the Fisher information for a network receiving space-dependent input whose connections are composed of a distance-dependent and a disordered component. We observe that the decay of the Fisher information is slow as long as the disorder is not too strong, indicating that CANNs have a regime in which the advantageous effects of connectivity on information storage outweigh the detrimental ones.
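To illustrate the kind of quantity being computed, here is a minimal sketch of the Fisher information for a textbook population code: independent Poisson neurons with von Mises tuning curves on a ring, for which I_F(θ) = Σ_i f_i'(θ)² / f_i(θ). This is not the authors' replica computation for a disordered CANN; the tuning-curve form, parameter values, and function names are all assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch only: Fisher information of N independent Poisson
# neurons with von Mises tuning curves, preferred angles `prefs`.
# Parameters (rmax, kappa) are assumed values, not taken from the talk.

def tuning_curves(theta, prefs, rmax=10.0, kappa=4.0):
    """Mean firing rates f_i(theta) = rmax * exp(kappa*(cos(theta - prefs) - 1))."""
    return rmax * np.exp(kappa * (np.cos(theta - prefs) - 1.0))

def fisher_information(theta, prefs, rmax=10.0, kappa=4.0):
    """I_F(theta) = sum_i f_i'(theta)^2 / f_i(theta) for Poisson spiking."""
    f = tuning_curves(theta, prefs, rmax, kappa)
    # derivative of the von Mises tuning curve with respect to theta
    df = -rmax * kappa * np.sin(theta - prefs) * np.exp(
        kappa * (np.cos(theta - prefs) - 1.0)
    )
    return np.sum(df ** 2 / f)

# 100 neurons with uniformly spaced preferred angles on the ring
prefs = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
I = fisher_information(0.3, prefs)
```

For a homogeneous population like this, I_F is (to numerical accuracy) independent of θ; the question raised in the abstract is how such a quantity behaves once disordered recurrent connectivity correlates the neurons.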