BP: Biological Physics Division (Fachverband Biologische Physik)
BP 23: Focus Session: Inference Methods and Biological Data (German-French Focus Session) (joint session BP/DY)
BP 23.8: Talk
Wednesday, March 20, 2024, 17:30–17:45, H 2032
Information rates of neural activity on varying time scales — •Tobias Kühn and Ulisse Ferrari — Institut de la Vision, Sorbonne Université, CNRS, INSERM
When evaluating electrophysiological recordings, time is normally discretized into bins. If one aims to determine the information rate, i.e. the mutual information per unit time, the bin size has to be chosen with care, because the result depends appreciably on it. The framework we suggest gives freedom in this choice because our single-neuron model is not restricted to a binary representation of neural activity, as is the case for Ising-like models of neural networks.
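The quantity at stake can be written out explicitly; a minimal sketch of the definition implied above (the symbols R for the binned response, S for the stimulus, and Δt for the bin size are our notation, not the abstract's):

```latex
% Mutual information rate between stimulus S and binned response R,
% with time discretized into bins of size \Delta t:
\mathcal{I} \;=\; \frac{I(R;S)}{\Delta t}
            \;=\; \frac{1}{\Delta t}\,\bigl[\,H(R) - H(R \mid S)\,\bigr]
% Both entropies, and hence the estimated rate, depend on the chosen \Delta t.
```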
Our method allows us to faithfully estimate the entropy of the neural activity and, from it, the mutual information between neural activity and stimulus on a given time scale. As in the Ising model, we restrict ourselves to pairwise interactions, so that only the mean activities and the covariances (across neurons or across time) are needed to compute entropies. This estimate requires a number of measurements growing only quadratically in the number of neurons, as opposed to the exponential growth associated with estimating the full probability distribution, which prohibits using the latter for real data. Concretely, to compute entropies we use a small-correlation expansion, expressed in a novel diagrammatic framework (Kühn & van Wijland 2023), which avoids explicit inference, or even a concrete choice, of a single-neuron model. Our approach enables studying how the information rate depends on the time scale on which the information is registered, which is crucial for understanding how dynamic stimuli are processed.
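To illustrate why pairwise statistics suffice for a quadratic-cost estimate, here is a minimal sketch that computes a second-order (Gaussian) entropy estimate from binned spike counts using only means and covariances. This stand-in is an assumption for illustration only: it is not the diagrammatic small-correlation expansion of Kühn & van Wijland 2023, and the function name and synthetic data are hypothetical.

```python
import numpy as np

def gaussian_entropy_estimate(activity):
    """Second-order entropy estimate (in bits) of multivariate activity.

    activity: array of shape (n_samples, n_neurons) of binned spike counts.
    Only means and covariances enter, so the number of required
    measurements grows quadratically in the number of neurons.
    NOTE: this Gaussian surrogate only illustrates the pairwise idea;
    it is not the authors' diagrammatic expansion.
    """
    n_neurons = activity.shape[1]
    cov = np.cov(activity, rowvar=False)
    _, logdet = np.linalg.slogdet(cov)
    # Differential entropy of a Gaussian with this covariance, converted to bits
    return 0.5 * (n_neurons * np.log(2.0 * np.pi * np.e) + logdet) / np.log(2.0)

# Hypothetical synthetic data: independent Poisson spike counts per bin
rng = np.random.default_rng(0)
counts = rng.poisson(3.0, size=(10_000, 5))
h = gaussian_entropy_estimate(counts)
```

Because only the covariance matrix is estimated, halving or doubling the bin size changes `counts` but not the estimator itself, which is what gives the freedom in the choice of time scale described above.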
Keywords: Mutual Information; Neuroscience; Retina; Feynman diagrams; Information rate