Regensburg 2025 – scientific programme

SOE: Physics of Socio-Economic Systems Division (Fachverband Physik sozio-ökonomischer Systeme)

SOE 7: Focus Session: Self-Regulating and Learning Systems: from Neural to Social Networks

SOE 7.9: Talk

Wednesday, March 19, 2025, 12:15–12:30, H45

Employing normalizing flows to examine neural manifold characteristics and curvatures — •Peter Bouss¹,², Sandra Nestler³, Kirsten Fischer¹,², Claudia Merger⁴, Alexandre René²,⁵, and Moritz Helias¹,² — ¹IAS-6, Forschungszentrum Jülich, Germany — ²RWTH Aachen University, Germany — ³Technion, Haifa, Israel — ⁴SISSA, Trieste, Italy — ⁵University of Ottawa, Canada

Despite the vast number of active neurons, neuronal population activity is thought to lie on low-dimensional manifolds (Gallego et al., 2017). To learn the statistics of neural activity, we use Normalizing Flows (NFs) (Dinh et al., 2014): neural networks trained to estimate the probability distribution of the data by learning an invertible map to a latent distribution.
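
As a rough, hedged illustration of this change-of-variables training (a minimal sketch, not the architecture used in this work), the following Python/PyTorch snippet implements a RealNVP-style affine coupling flow whose training objective is the exact log-likelihood of the data; the class names, layer sizes, and the 96-dimensional usage example are assumptions for illustration only.

    import torch
    import torch.nn as nn

    class AffineCoupling(nn.Module):
        """RealNVP-style coupling layer: shifts and rescales one half of the
        dimensions conditioned on the other half; the Jacobian is triangular,
        so its log-determinant is the sum of the predicted log-scales."""
        def __init__(self, dim, hidden=64):
            super().__init__()
            self.d = dim // 2
            self.net = nn.Sequential(
                nn.Linear(self.d, hidden), nn.Tanh(),
                nn.Linear(hidden, 2 * (dim - self.d)),
            )

        def forward(self, x):
            x1, x2 = x[:, :self.d], x[:, self.d:]
            s, t = self.net(x1).chunk(2, dim=1)
            s = torch.tanh(s)                  # bounded log-scales for stability
            z2 = x2 * torch.exp(s) + t
            return torch.cat([x1, z2], dim=1), s.sum(dim=1)

    class Flow(nn.Module):
        """Illustrative stack of coupling layers; not the talk's architecture."""
        def __init__(self, dim, n_layers=4):
            super().__init__()
            self.layers = nn.ModuleList([AffineCoupling(dim) for _ in range(n_layers)])

        def log_prob(self, x):
            # change of variables: log p(x) = log p_latent(z) + log |det dz/dx|
            log_det = x.new_zeros(x.shape[0])
            for layer in self.layers:
                x, ld = layer(x)
                x = x.flip(dims=[1])           # permute so every dimension gets transformed
                log_det = log_det + ld
            base = torch.distributions.Normal(0.0, 1.0)   # standard-normal latent
            return base.log_prob(x).sum(dim=1) + log_det

    # Hypothetical usage: maximize the likelihood of recorded population activity.
    # flow = Flow(dim=96); loss = -flow.log_prob(activity_batch).mean()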

We adjust the NF's training objective to distinguish between relevant and noise dimensions by using a nested-dropout procedure in the latent space (Bekasov & Murray, 2020). Approximating the network for each mixture component as a quadratic mapping enables us to calculate the Riemannian curvature tensors of the neural manifold. We focus mainly on the directions in the tangent space in which the sectional curvature shows local extrema.
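
To make the geometric step concrete, here is a minimal sketch of how sectional curvature can be read off a locally quadratic embedding x = f(z) via the Gauss equation for a submanifold of flat Euclidean space; it assumes the Jacobian and Hessian of the map are available (e.g., from the quadratic approximation) and that the Jacobian has full column rank, and the names (sectional_curvature, J, H) are illustrative rather than the authors' code.

    import numpy as np

    def sectional_curvature(J, H, u, v):
        """Sectional curvature of an embedded manifold x = f(z) at a point,
        given the Jacobian J (N x d) and Hessian H (N x d x d) of the locally
        quadratic map, for two tangent directions u, v in R^d."""
        g = J.T @ J                                   # induced (pullback) metric
        # projector onto the normal space (assumes J has full column rank)
        P_normal = np.eye(J.shape[0]) - J @ np.linalg.solve(g, J.T)
        # second fundamental form II(a, b): normal component of the quadratic term
        II = lambda a, b: P_normal @ np.einsum('nij,i,j->n', H, a, b)
        numerator = II(u, u) @ II(v, v) - II(u, v) @ II(u, v)
        denominator = (u @ g @ u) * (v @ g @ v) - (u @ g @ v) ** 2
        return numerator / denominator

    # Toy example: a 2-D manifold embedded in R^10 by a random quadratic map.
    rng = np.random.default_rng(0)
    J = rng.standard_normal((10, 2))
    H = rng.standard_normal((10, 2, 2))
    H = 0.5 * (H + H.transpose(0, 2, 1))              # symmetrize mixed derivatives
    print(sectional_curvature(J, H, np.array([1.0, 0.0]), np.array([0.0, 1.0])))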

Finally, we apply the method to electrophysiological recordings from macaque visual cortex (Chen et al., 2022). We show that the manifolds deviate significantly from being flat. Analyzing their curvature yields insights into the regimes in which groups of neurons interact in a non-linear manner.

Keywords: Manifolds; Curvature; Normalizing Flows; Dimensionality Reduction
