Team NeuroMathComp

Section: New Results

Modeling and simulating assemblies of neurons

Back-engineering of spiking neural networks parameters

Participants : Bruno Cessac, Horacio Rostro-Gonzalez, Thierry Viéville.

We consider the deterministic evolution of a time-discretized spiking neural network with delayed connection weights, modeled as a discretized neural network of the generalized integrate-and-fire (gIF) type. The purpose is to study a class of algorithmic methods for calculating the proper parameters to reproduce exactly a given spike train generated by a hidden (unknown) neural network. This standard problem is known to be NP-hard when delays have to be calculated. We propose a reformulation as a Linear-Programming (LP) problem, which allows an efficient resolution. This lets us "back-engineer" a neural network, i.e., find out which parameters (here, connection weights), given a set of initial conditions, reproduce the network spike dynamics. More precisely, we make explicit the fact that, with a gIF model, back-engineering a spike train is a Linear (L) problem if the membrane potentials are observed and an LP problem if only spike times are observed. Numerical robustness is discussed. We also explain how the use of a generalized IF neuron model, instead of a leaky IF model, allows us to derive this algorithm. Furthermore, we point out that the L or LP adjustment mechanism is local to each unit and has the same structure as a "Hebbian" rule. Going a step further, this paradigm generalizes easily to the design of input-output spike train transformations. This means that we have a practical method to "program" a spiking network, i.e., to find a set of parameters allowing us to exactly reproduce the network output given an input. Numerical verifications and illustrations are provided.
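
To make the LP structure concrete, here is a minimal Python sketch, not the paper's implementation: it assumes a simplified discrete-time gIF-like update with a known leak gamma, a constant external input and no transmission delays, simulates a hidden network, and then recovers each neuron's incoming weights from the observed spike times alone by maximizing the margin to the firing threshold. All names and numerical values are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, T = 8, 200
gamma, theta, I_ext = 0.6, 1.0, 0.45      # leak, threshold, constant input

# Simulate a hidden discrete-time gIF-like network to produce an observed raster Z.
W_hidden = rng.normal(0.0, 0.4, (N, N))
V = np.zeros(N)
Z = np.zeros((T, N))
for t in range(1, T):
    V = gamma * V * (1 - Z[t - 1]) + W_hidden @ Z[t - 1] + I_ext
    Z[t] = (V >= theta).astype(float)

def recover_row(i):
    """Recover the incoming weights of neuron i from spike times only, as an LP."""
    A_ub, b_ub = [], []
    phi = np.zeros(N)   # coefficients of V_i[t] in the unknown weights w_i
    c = 0.0             # w-independent part of V_i[t] (leaked external input)
    for t in range(1, T):
        reset = 1.0 - Z[t - 1, i]            # membrane reset after a spike of i
        phi = gamma * reset * phi + Z[t - 1]
        c = gamma * reset * c + I_ext
        if Z[t, i] == 1:    # spike observed:   phi.w + c >= theta + eps
            A_ub.append(np.append(-phi, 1.0)); b_ub.append(c - theta)
        else:               # silence observed: phi.w + c <= theta - eps
            A_ub.append(np.append(phi, 1.0)); b_ub.append(theta - c)
    # variables [w_i, eps]; maximize the margin eps, i.e. minimize -eps
    cost = np.append(np.zeros(N), -1.0)
    res = linprog(cost, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=[(-5, 5)] * N + [(0, 1)], method="highs")
    return res.x[:N]   # feasible by construction: the hidden weights satisfy it

W_est = np.vstack([recover_row(i) for i in range(N)])
```

Note that the recovered weights need not coincide with the hidden ones: the LP returns some parameter set that reproduces the observed raster exactly, which is precisely the back-engineering notion used above.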

This work has appeared in BMC Neuroscience [22].

This work was supported by the ARC MACACC.

How Gibbs distributions may naturally arise from synaptic adaptation mechanisms. A model-based argumentation

Participants : Bruno Cessac, Horacio Rostro-Gonzalez, Juan-Carlos Vasquez, Thierry Viéville.

This paper addresses two questions in the context of neuronal network dynamics, using methods from dynamical systems theory and statistical physics: (i) how to characterize the statistical properties of sequences of action potentials ("spike trains") produced by neuronal networks, and (ii) what are the effects of synaptic plasticity on these statistics? We introduce a framework in which spike trains are associated with a coding of membrane potential trajectories, and actually constitute a symbolic coding in important explicit examples (the so-called gIF models). On this basis, we use the thermodynamic formalism from ergodic theory to show how Gibbs distributions are natural probability measures for describing the statistics of spike trains, given the empirical averages of prescribed quantities. As a second result, we show that Gibbs distributions naturally arise when considering "slow" synaptic plasticity rules, where the characteristic time for synapse adaptation is much longer than the characteristic time for neuron dynamics.
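
The Gibbs distributions of the paper live on spike trains with memory, via the thermodynamic formalism; the sketch below, written for illustration only, shows the simplest memoryless special case: a maximum-entropy (Ising-like) Gibbs distribution matching prescribed empirical averages (firing rates and pairwise coincidences), fitted by gradient ascent on the log-likelihood over all 2^N patterns. All values are illustrative.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
N = 5
# Toy "observed" raster; the prescribed quantities are firing rates and
# pairwise coincidences, whose empirical averages the Gibbs measure must match.
raster = (rng.random((5000, N)) < 0.2).astype(float)
emp_mean = raster.mean(axis=0)
emp_corr = raster.T @ raster / len(raster)

patterns = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)

def gibbs(h, J):
    """P(x) proportional to exp(h.x + x^T J x) over all 2^N spike patterns."""
    E = patterns @ h + np.einsum("ti,ij,tj->t", patterns, J, patterns)
    p = np.exp(E - E.max())
    return p / p.sum()

h, J = np.zeros(N), np.zeros((N, N))
for _ in range(3000):       # gradient ascent on the log-likelihood
    p = gibbs(h, J)
    h += 0.5 * (emp_mean - p @ patterns)
    J += 0.5 * (emp_corr - np.einsum("t,ti,tj->ij", p, patterns, patterns))
    np.fill_diagonal(J, 0.0)   # diagonal terms are redundant with h
```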

This work has appeared in the Journal of Statistical Physics [14].

This work was supported by the ARC MACACC.

Indisputable facts when implementing spiking neuron networks

Participants : Bruno Cessac, Hélène Paugam-Moisy, Thierry Viéville.

In this article, our wish is to demystify some aspects of coding with spike timing, through a simple review of well-understood technical facts regarding spike coding. The goal is to help better understand to what extent computing and modelling with spiking neuron networks can be biologically plausible and computationally efficient. In this review we intentionally restrict ourselves to deterministic dynamics, and we consider that the dynamics of the network is defined by a non-stochastic mapping. This allows us to stay in a rather simple framework and to propose a review with concrete numerical values, results and formulas on (i) general time constraints, (ii) links between continuous signals and spike trains, and (iii) spiking network parameter adjustment. When implementing spiking neuron networks, for computational or biological simulation purposes, it is important to take the indisputable facts reviewed here into account. This precaution can prevent implementing mechanisms that are meaningless with regard to obvious time constraints, or introducing spikes artificially when continuous calculations would be sufficient and simpler. It is also pointed out that implementing a spiking neuron network is ultimately a simple task, unless complex neural codes are considered.
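
As a concrete illustration of the link between continuous signals and spike trains, here is a small sketch (ours, not the paper's): a continuous signal is encoded by a deterministic leaky integrate-and-fire unit and read back by low-pass filtering the spike train; the membrane and read-out time constants bound the recoverable bandwidth, one of the time constraints discussed. All values are illustrative.

```python
import numpy as np

dt, T = 1e-3, 1.0
t = np.arange(0.0, T, dt)
signal = 1.2 + 0.8 * np.sin(2 * np.pi * 3 * t)   # slowly varying input current

# Encoding: leaky integrate-and-fire, tau dV/dt = -V + I(t), reset at threshold.
tau, theta = 20e-3, 1.0
V, spikes = 0.0, np.zeros_like(t)
for k in range(len(t)):
    V += dt / tau * (signal[k] - V)
    if V >= theta:
        spikes[k], V = 1.0, 0.0

# Decoding: exponential low-pass filter of the spike train (rate read-out).
tau_d, acc = 50e-3, 0.0
rate = np.zeros_like(t)
for k in range(len(t)):
    acc += dt / tau_d * (spikes[k] / dt - acc)
    rate[k] = acc
```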

This work has been accepted by the Journal of Physiology, Paris [13] (in press).

This work was supported by the ARC MACACC.

A view of Neural Networks as dynamical systems

Participant : Bruno Cessac.

We consider neural networks from the point of view of dynamical systems theory. In this spirit we review recent results dealing with the following questions, addressed in the context of specific models: 1. characterizing the collective dynamics; 2. statistical analysis of spike trains; 3. the interplay between dynamics and network structure; 4. the effects of synaptic plasticity.

This work has been accepted by the International Journal of Bifurcation and Chaos [12] (in press).

This work was supported by the ARC MACACC.

A constructive mean-field analysis of multi-population neural networks with random synaptic weights and stochastic inputs

Participants : Bruno Cessac, Olivier Faugeras, Jonathan Touboul.

We deal with the problem of bridging the gap between two scales in neuronal modeling. At the first (microscopic) scale, neurons are considered individually and their behavior is described by stochastic differential equations that govern the time variations of their membrane potentials. They are coupled by synaptic connections acting on their resulting activity, a nonlinear function of their membrane potential. At the second (mesoscopic) scale, interacting populations of neurons are described individually by similar equations. The equations describing the dynamical and the stationary mean-field behaviors are considered as functional equations on a set of stochastic processes. This new point of view allows us to prove that these equations are well-posed on any finite time interval and to provide, by a fixed-point method, a constructive way of effectively computing their unique solution. This method is proved to converge to the unique solution, and we characterize its complexity and convergence rate. We also provide partial results for the stationary problem on infinite time intervals. These results shed new light on neural mass models such as that of Jansen and Rit [44]: their dynamics appears as a coarse approximation of the much richer dynamics that emerges from our analysis. Our numerical experiments confirm that the framework we propose, and the numerical methods we derive from it, provide a new and powerful tool for the exploration of neural behaviors at different scales.
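
The fixed point of the paper acts on the full law of a stochastic process; the sketch below collapses this to a deliberately crude Gaussian-moment caricature of a single self-coupled population, purely to show the Picard-iteration mechanics: the mean trajectory produced at one iteration is fed back into the effective interaction term of the next. All symbols and values are illustrative.

```python
import numpy as np

# Caricature of the mean-field equation for one self-coupled population:
# dX = (-X + J * E[s(X_t)]) dt + sigma dW, with X_t approximated as Gaussian.
dt, T, J, sigma = 1e-2, 5.0, 1.5, 0.5
n = int(T / dt)
s = lambda x: 1.0 / (1.0 + np.exp(-x))

gh_x, gh_w = np.polynomial.hermite_e.hermegauss(20)
def E_s(m, v):
    """E[s(X)] for X ~ N(m, v), by Gauss-Hermite quadrature."""
    return gh_w @ s(m + np.sqrt(max(v, 0.0)) * gh_x) / gh_w.sum()

v = (sigma**2 / 2) * (1 - np.exp(-2 * dt * np.arange(n)))  # OU variance profile
m = np.zeros(n)                       # initial guess for the mean trajectory
for it in range(50):                  # Picard (fixed-point) iterations
    m_new = np.zeros(n)
    for k in range(n - 1):
        m_new[k + 1] = m_new[k] + dt * (-m_new[k] + J * E_s(m[k], v[k]))
    if np.max(np.abs(m_new - m)) < 1e-9:
        break                         # converged on the finite time interval
    m = m_new                         # feed the new trajectory back in
```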

This work has appeared in Frontiers in Neuroscience [18].

This work was partially supported by the EC IP project FP6-015879, FACETS, the ERC advanced grant NerVi number 227747, and the Fondation d'Entreprise EADS.

A Markovian event-based framework for stochastic spiking neural networks

Participants : Olivier Faugeras, Jonathan Touboul.

We introduce and study a mathematical framework for characterizing and simulating networks of noisy integrate-and-fire neurons based on their spike times. We show that the firing times of the neurons in the network constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike intervals of the neurons in the network.

We apply this modeling to different linear integrate-and-fire neuron models, with or without noise in the synaptic integration, with different types of synapses, and possibly with transmission delays and absolute and relative refractory periods, and in this way generalize results previously obtained in certain particular cases [46], [49]. This approach provides a powerful framework for studying properties of the network, and an extremely efficient way to simulate the dynamics of large networks. In particular, it allows a parallel implementation, which we realized on GPUs.
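
The following sketch conveys the event-based flavour (it is not the paper's algorithm): simulation advances from spike to spike with a priority queue rather than clock tick to clock tick. The ISI law is a placeholder exponential, and pending events are not redrawn when inputs change, two simplifications that the exact Markovian framework above does not make. All values are illustrative.

```python
import heapq
import numpy as np

rng = np.random.default_rng(2)
N, T_end = 20, 10.0
W = rng.normal(0.0, 0.3, (N, N))           # synaptic weights
drive = np.ones(N)                         # input drives setting the ISI rate

def next_isi(i):
    """Placeholder ISI law: exponential with rate given by the current drive."""
    return rng.exponential(1.0 / max(drive[i], 1e-6))

queue = [(next_isi(i), i) for i in range(N)]
heapq.heapify(queue)
spikes = []
while queue:
    t, i = heapq.heappop(queue)            # next firing event in the network
    if t > T_end:
        break
    spikes.append((t, i))
    drive += W[:, i]                       # spike of i shifts postsynaptic drives
    np.clip(drive, 0.05, None, out=drive)
    heapq.heappush(queue, (t + next_isi(i), i))
```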

This work has appeared in Neural Computation and is available on arXiv [36].

This work was partially funded by the ERC advanced grant NerVi number 227747.

Bifurcations of cycles, rhythms and epilepsy in neural mass models

Participants : Olivier Faugeras, Jonathan Touboul.

Temporal lobe epilepsy is one of the most common chronic neurological disorders, characterized by the occurrence of spontaneous recurrent seizures, which can be observed at the population level through electroencephalogram (EEG) recordings. The aim of this work is to understand, from a theoretical viewpoint, the occurrence of this type of seizure and the origin of the oscillatory activity in some classical cortical column models. We relate these rhythmic activities to the structure of the set of periodic orbits in the models, and therefore to their bifurcations. We are mainly interested in the Jansen and Rit model, and study the codimension one, two and three bifurcations of equilibria and cycles of this model. We can therefore understand the effect of the different biological parameters of the system on the appearance of epileptiform activity, and observe the emergence of alpha, delta and theta sleep waves in certain parameter ranges.
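
For reference, the Jansen and Rit equations in their standard form can be integrated as follows; this sketch reproduces the oscillatory regimes numerically but is no substitute for the bifurcation analysis itself, which requires continuation techniques. Parameter values are the standard ones from the literature; the external input p is the natural parameter to vary.

```python
import numpy as np

# Standard Jansen-Rit parameter values.
A, B, a, b = 3.25, 22.0, 100.0, 50.0
C = 135.0
C1, C2, C3, C4 = C, 0.8 * C, 0.25 * C, 0.25 * C
v0, e0, r, p = 6.0, 2.5, 0.56, 120.0

def S(v):
    return 2 * e0 / (1 + np.exp(r * (v0 - v)))   # sigmoid rate function

def f(y):
    y0, y1, y2, y3, y4, y5 = y
    return np.array([
        y3, y4, y5,
        A * a * S(y1 - y2) - 2 * a * y3 - a**2 * y0,
        A * a * (p + C2 * S(C1 * y0)) - 2 * a * y4 - a**2 * y1,
        B * b * C4 * S(C3 * y0) - 2 * b * y5 - b**2 * y2,
    ])

dt, T = 1e-4, 5.0
y = np.zeros(6)
eeg = np.empty(int(T / dt))
for k in range(len(eeg)):
    y = y + dt * f(y)        # explicit Euler; dt is small w.r.t. 1/a and 1/b
    eeg[k] = y[1] - y[2]     # the EEG-like observable of the model
```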

This work was partially supported by the EC IP project FP6-015879, FACETS, the ERC advanced grant NerVi number 227747, and the Fondation d'Entreprise EADS.

Neural Fields: stationary states

Keywords : Neural masses, Neural fields, Integro-differential equations, stationary solutions, persistent states, bumps.

Participants : Olivier Faugeras, François Grimbert, Romain Veltz.

Neural continuum networks are an important aspect of the modeling of macroscopic parts of the cortex. Two classes of such networks are considered: voltage- and activity-based. In both cases our networks contain an arbitrary number, n, of interacting neuron populations. Spatial, non-symmetric connectivity functions represent local, cortico-cortical connections; external inputs represent non-local connections. Sigmoidal nonlinearities model the relationship between (average) membrane potential and activity. Departing from most of the previous work in this area, we do not assume the nonlinearity to be singular, i.e., represented by the discontinuous Heaviside function. Another important difference with previous work is our relaxing of the assumption that the domain of definition where we study these networks is infinite, i.e., equal to $\mathbb{R}$ or $\mathbb{R}^2$. We explicitly consider the biologically more relevant case of a bounded subset $\Omega$ of $\mathbb{R}^q$, $q = 1, 2, 3$, a better model of a piece of cortex. The time behaviour of these networks is described by systems of integro-differential equations. Using methods of functional analysis, we study the existence and uniqueness of a stationary, i.e., time-independent, solution of these equations in the case of a stationary input. These solutions can be seen as "persistent"; they are also sometimes called "bumps". We show that under very mild assumptions on the connectivity functions, and because we do not use the Heaviside function for the nonlinearities, such solutions always exist. We also give sufficient conditions on the connectivity functions for the solution to be absolutely stable, that is to say, independent of the initial state of the network. We then study the sensitivity of the solution(s) to variations of such parameters as the connectivity functions, the sigmoids, the external inputs, and, last but not least, the shape of the domain of existence $\Omega$ of the neural continuum networks. These theoretical results are illustrated and corroborated by a large number of numerical experiments in most of the cases $2 \le n \le 3$, $2 \le q \le 3$.
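
A minimal numerical illustration of the fixed-point characterization (not the paper's code): on a bounded one-dimensional domain, with a smooth sigmoid and a connectivity kernel mild enough for a contraction condition to hold, simple iteration of V -> W S(V) + I_ext converges to the persistent state. Kernel, sigmoid and input below are illustrative choices.

```python
import numpy as np

# Discretize Omega = [0, 1]; stationary equation V = W S(V) + I_ext.
n = 200
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
d2 = (x[:, None] - x[None, :])**2
W = (1.5 * np.exp(-d2 / 0.005) - 0.75 * np.exp(-d2 / 0.05)) * dx   # Mexican hat
S = lambda v: 1.0 / (1.0 + np.exp(-4.0 * (v - 0.5)))
I_ext = 0.3 + 0.2 * np.cos(2 * np.pi * x)

V = np.zeros(n)
for it in range(1000):          # simple fixed-point iteration
    V_new = W @ S(V) + I_ext
    if np.max(np.abs(V_new - V)) < 1e-12:
        break
    V = V_new                   # V is the "bump" / persistent state
```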

This work has appeared in Neural Computation [19].

This work was partially supported by the EC IP project FP6-015879, FACETS, the ERC advanced grant NerVi number 227747, and the Fondation d'Entreprise EADS.

Local/global analysis of the stationary solutions of some neural field equations

Keywords : neural field equations, stationary solutions, bifurcation, Leray-Schauder degree, ring model.

Participants : Olivier Faugeras, Romain Veltz.

Neural or cortical fields are continuous assemblies of mesoscopic models, also called neural masses, of neural populations that are fundamental in the modeling of macroscopic parts of the brain. Neural fields are described by nonlinear integro-differential equations. The solutions of these equations represent the state of activity of these populations when submitted to inputs from neighbouring brain areas. Understanding the properties of these solutions is essential in advancing our understanding of the brain. In this paper we study the dependence of the stationary solutions of the neural field equations on the stiffness of the nonlinearity and the contrast of the external inputs. This is done by using degree theory and bifurcation theory in the context of functional, in particular infinite-dimensional, spaces. The joint use of these two theories allows us to make new detailed predictions about the global and local behaviour of the solutions. We also provide a generic finite-dimensional approximation of these equations, which allows us to study two models in great detail. The first model is a neural mass model of a cortical hypercolumn of orientation-sensitive neurons, the ring model [47]. The second model is a general neural field model where the spatial connectivity is described by heterogeneous Gaussian-like functions.
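
The following sketch, ours and purely illustrative, shows the kind of computation the finite-dimensional approximation enables for the ring model: Newton's method combined with a naive ramp of the sigmoid stiffness traces a branch of stationary solutions. Such natural-parameter continuation stalls near bifurcation points, which is exactly where the degree-theoretic and bifurcation analysis of the paper takes over.

```python
import numpy as np

# Ring model: stationary states of V = W S_mu(V) + I on the circle, followed
# as the sigmoid stiffness mu increases (natural-parameter continuation).
n = 128
th = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
W = (-0.5 + 2.0 * np.cos(th[:, None] - th[None, :])) / n   # normalized kernel
I = 0.1 * np.ones(n)

def newton(V, mu, iters=30):
    """Newton iterations on F(V) = V - W S_mu(V) - I."""
    for _ in range(iters):
        Sv = 1.0 / (1.0 + np.exp(-mu * V))
        F = V - W @ Sv - I
        Jac = np.eye(n) - W * (mu * Sv * (1.0 - Sv))[None, :]
        V = V - np.linalg.solve(Jac, F)
    return V

V = np.zeros(n)
branch = []
for mu in np.linspace(0.5, 12.0, 60):   # ramp the stiffness
    V = newton(V, mu)
    branch.append((mu, np.ptp(V)))      # modulation depth along the branch
```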

This work has appeared in the SIAM Journal on Applied Dynamical Systems and is available on arXiv [37].

This work was partially funded by the ERC advanced grant NerVi number 227747.

Some theoretical and numerical results for delay neural field equations

Keywords : Neural fields, nonlinear integro-differential equations, delays, Lyapunov functional, pattern formation, numerical schemes.

Participants : Olivier Faugeras, Grégory Faye.

In this paper we study neural field models with delays, which define a useful framework for modeling macroscopic parts of the cortex involving several populations of neurons. Nonlinear delayed integro-differential equations describe the spatio-temporal behavior of these fields. Using methods from the theory of delay differential equations, we show the existence and uniqueness of a solution of these equations. A Lyapunov analysis gives us sufficient conditions for the solutions to be asymptotically stable. We also present a fairly detailed study of the numerical computation of these solutions. This is, to our knowledge, the first time that a serious analysis of the problem of the existence and uniqueness of a solution of these equations has been performed. Another original contribution of ours is the definition of a Lyapunov functional and the stability result it implies. We illustrate our numerical schemes on a variety of examples that are relevant to modeling in neuroscience.
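
One of the numerical schemes studied can be caricatured as follows (a sketch under simplifying assumptions, not the paper's code): explicit Euler in time with a ring buffer holding the recent history, for a one-dimensional field with space-dependent propagation delays tau(x, y) = |x - y|/c. All kernel and input choices are illustrative.

```python
import numpy as np

# Explicit Euler with a history ring buffer, for
# dV/dt(x,t) = -V + integral of w(x,y) S(V(y, t - |x-y|/c)) dy + I(x).
n, dt, T, c = 100, 1e-3, 2.0, 0.5       # grid size, time step, horizon, speed
x = np.linspace(0.0, 1.0, n)
dist = np.abs(x[:, None] - x[None, :])
W = (2.0 * np.exp(-dist**2 / 0.01) - np.exp(-dist**2 / 0.1)) * (x[1] - x[0])
delay = np.round(dist / c / dt).astype(int)   # per-pair delays, in time steps
depth = delay.max() + 1

S = np.tanh
I_ext = 0.2 * np.exp(-((x - 0.5)**2) / 0.01)
hist = np.zeros((depth, n))                   # zero initial history

for k in range(int(T / dt)):
    V = hist[k % depth]                       # field at the current time
    # entry (i, j) reads V_j at the delayed time t - tau_ij
    delayed = hist[(k - delay) % depth, np.arange(n)[None, :]]
    hist[(k + 1) % depth] = V + dt * (-V + (W * S(delayed)).sum(axis=1) + I_ext)
```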

This work has appeared in Physica D, in a special issue on mathematical neuroscience [20].

This work was partially funded by the ERC advanced grant NerVi number 227747.

