## Section: New Results

### Mean field approaches

#### Confronting mean-field theories to measurements: a perspective from neuroscience

Participant : Bruno Cessac.

Mean-field theories in neuroscience are usually understood as ways to bridge spatial and temporal scales by lumping together the activities of many single neurons, and then explaining or predicting the spatio-temporal variations of mesoscopic or macroscopic quantities measurable with current technologies: EEG, MEG, fMRI, optical imaging, etc. This is very much like the situation in statistical physics, where macroscopic quantities such as pressure or conductivity are explained by the interactions between "microscopic" entities like atoms or molecules.

The situation in neuroscience is different, however: the laws governing microscopic dynamics in physics do not have the same structure as those governing neuronal dynamics; for example, interactions between neurons are not symmetric. Moreover, it is as yet unclear which macroscopic quantities are relevant to account for, say, visual perception. At the present stage of research, these quantities are taken to be whatever is measurable with currently available technologies, whereas better theories could reveal new types of phenomenological observables with higher explanatory power.

We review mean-field methods originating in physics and their consequences for predictions of neuronal dynamics.

This work is available as [30], [31], [32].

#### A Formalism for Evaluating Analytically the Cross-Correlation Structure of a Firing-Rate Network Model

Participants : Diego Fasoli, Olivier Faugeras, Stefano Panzeri.

We introduce a new formalism for evaluating analytically the cross-correlation structure of a finite-size firing-rate network with recurrent connections. The analysis performs a first-order perturbative expansion of the neural activity equations, which include three different sources of randomness: the background noise of the membrane potentials, their initial conditions, and the distribution of the recurrent synaptic weights. This allows the analytical quantification of the relationship between anatomical and functional connectivity, i.e., of how the synaptic connections determine the statistical dependencies, at any order, among different neurons. The technique we develop is general, but for simplicity and clarity we demonstrate its efficacy by applying it to the case of synaptic connections described by regular graphs. The analytical equations obtained in this way reveal previously unknown behaviors of recurrent firing-rate networks, notably how correlations are modified by the external input, by the finite size of the network, by the density of the anatomical connections, and by correlations in the sources of randomness. In particular, we show that a strong input can make the neurons almost independent, suggesting that functional connectivity does not depend only on the static anatomical connectivity, but also on the external inputs. Moreover, we prove that in general it is not possible to find a mean-field description à la Sznitman of the network if the anatomical connections are too sparse or if our three sources of variability are correlated. To conclude, we show a very counterintuitive phenomenon, which we call stochastic synchronization, through which neurons become almost perfectly correlated even if the sources of randomness are independent.
Because it quantifies how the activity of individual neurons and the correlations among them depend upon external inputs, the formalism introduced here can serve as a basis for exploring analytically the computational capabilities of population codes expressed by recurrent neural networks.
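As a purely illustrative complement (not the paper's perturbative formalism), the input dependence of functional connectivity can be reproduced numerically. The sketch below, with hypothetical dynamics and parameters, simulates a noisy firing-rate network on a regular graph (all-to-all, uniform weights): with weak input the near-critical recurrence correlates the neurons, while a strong input saturates the nonlinearity, cuts the effective recurrent gain, and leaves the neurons almost independent.

```python
import numpy as np

# Illustrative noisy firing-rate network on a regular graph
# (all-to-all, uniform weights); parameters are assumptions.
rng = np.random.default_rng(0)
N, dt, steps, sigma, g = 40, 0.05, 60_000, 0.2, 0.95
J = (g / N) * np.ones((N, N))  # regular graph: uniform recurrent weights

def mean_offdiag_corr(I):
    """Mean absolute pairwise correlation of activity under external input I."""
    x = np.zeros(N)
    samples = []
    for t in range(steps):
        x = x + dt * (-x + J @ np.tanh(x) + I) \
              + sigma * np.sqrt(dt) * rng.standard_normal(N)
        if t >= steps // 4:            # discard the transient
            samples.append(x.copy())
    C = np.corrcoef(np.asarray(samples).T)
    return float(np.abs(C[~np.eye(N, dtype=bool)]).mean())

weak = mean_offdiag_corr(0.0)    # near-critical recurrence correlates neurons
strong = mean_offdiag_corr(5.0)  # saturating input cuts the recurrent gain
print(f"mean |corr| (weak input):   {weak:.3f}")
print(f"mean |corr| (strong input): {strong:.3f}")
```

In this toy model the effect does not hinge on the particular gain function: any input strong enough to drive the nonlinearity into saturation suppresses the recurrent interaction, so the functional connectivity collapses even though the anatomical connectivity is unchanged.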

This work is available as [22].

#### Asymptotic Description of Neural Networks with Correlated Synaptic Weights

Participants : Olivier Faugeras, James Maclaurin.

We study the asymptotic law of a network of interacting neurons when the number of neurons becomes infinite. Given a completely connected network of neurons in which the synaptic weights are Gaussian correlated random variables, we introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons, and the averaged law (with respect to the synaptic weights) of these trajectories. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.
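A minimal numerical sketch (with illustrative dynamics and parameters, not taken from the paper) shows the concentration phenomenon that a large deviation principle with a unique minimizer quantifies: a functional of the empirical measure, here the empirical mean of the trajectories at time T, fluctuates less and less across realizations of the correlated Gaussian weights as the network size N grows.

```python
import numpy as np

def empirical_mean_at_T(N, rho=0.5, g=0.6, sigma=0.3, T=8.0, dt=0.05, rng=None):
    """One realization: empirical mean (1/N) sum_i x_i(T) of a network whose
    Gaussian weights are pairwise correlated, Corr(J_ij, J_ji) = rho."""
    U1 = rng.standard_normal((N, N))
    U2 = rng.standard_normal((N, N))
    S = (U1 + U1.T) / np.sqrt(2)          # symmetric Gaussian part
    A = (U2 - U2.T) / np.sqrt(2)          # antisymmetric Gaussian part
    J = (g / np.sqrt(N)) * (np.sqrt((1 + rho) / 2) * S
                            + np.sqrt((1 - rho) / 2) * A)
    x = np.zeros(N)
    for _ in range(int(T / dt)):
        x = x + dt * (-x + J @ np.tanh(x)) \
              + sigma * np.sqrt(dt) * rng.standard_normal(N)
    return x.mean()

rng = np.random.default_rng(2)
spread = {}
for N in (50, 400):
    # fresh weights AND noise per run: fluctuations of the averaged law
    ms = [empirical_mean_at_T(N, rng=rng) for _ in range(20)]
    spread[N] = float(np.std(ms))
print(spread)  # realization-to-realization spread shrinks as N grows
```

Drawing fresh weights in every realization mirrors the averaged (with respect to the synaptic weights) law considered in the paper; the shrinking spread is the elementary counterpart of the concentration of the empirical measure around the unique minimizer of the rate function.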

This work is available as [23].

#### Clarification and Complement to "Mean-Field Description and Propagation of Chaos in Networks of Hodgkin-Huxley and FitzHugh-Nagumo Neurons"

Participants : Mireille Bossy, Olivier Faugeras, Denis Talay.

In this work, we clarify the well-posedness of the limit equations of the mean-field N-neuron models proposed in [1] and prove the associated propagation-of-chaos property. We also complete the modeling discussion in [1] by establishing the well-posedness of the stochastic differential equations which govern the behavior of the ion channels and the amount of available neurotransmitters.
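Propagation of chaos can be illustrated numerically. The sketch below uses an assumed mean-field electrical coupling and illustrative parameters, not the full model of [1] (which also includes ion-channel and neurotransmitter dynamics): N FitzHugh-Nagumo neurons are coupled through their mean voltage, and the correlation between two fixed neurons, estimated over independent realizations, is large for N = 2 but nearly vanishes for N = 200, as any finite subset of neurons becomes asymptotically independent.

```python
import numpy as np

def pair_corr(N, R=300, J=2.0, sigma=0.15, T=25.0, dt=0.02, seed=1):
    """Corr(v_1(T), v_2(T)) across R independent realizations of an
    N-neuron FitzHugh-Nagumo network with mean-field electrical coupling."""
    rng = np.random.default_rng(seed)
    a, b, eps, I = 0.7, 0.8, 0.08, 0.0    # excitable (resting) regime
    v = np.full((R, N), -1.2)             # start near the resting state
    w = np.full((R, N), -0.62)
    for _ in range(int(T / dt)):
        vbar = v.mean(axis=1, keepdims=True)
        dv = v - v**3 / 3 - w + I + J * (vbar - v)   # coupling to the mean
        dw = eps * (v + a - b * w)
        v = v + dt * dv + sigma * np.sqrt(dt) * rng.standard_normal((R, N))
        w = w + dt * dw
    # correlation of two fixed neurons over realizations
    return float(np.corrcoef(v[:, 0], v[:, 1])[0, 1])

small, large = pair_corr(N=2), pair_corr(N=200)
print(f"corr(v1, v2), N = 2:   {small:.3f}")
print(f"corr(v1, v2), N = 200: {large:.3f}")
```

For N = 2 the coupling acts directly between the pair and correlates them strongly; for large N each neuron interacts only with the nearly deterministic empirical mean, which is the heuristic content of the propagation-of-chaos property proved in the paper.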

This work is available as [18].