MathNeuro focuses on the application of multi-scale dynamics to neuroscience. This involves the modelling and analysis of systems with multiple time and space scales, as well as stochastic effects. We look at single-cell models, microcircuits and large networks alike. In terms of neuroscience, we are mainly interested in questions related to synaptic plasticity and neuronal excitability, in particular in the context of pathological states such as epileptic seizures and neurodegenerative diseases such as Alzheimer's disease.

Our work is quite mathematical but we make heavy use of computers for numerical experiments and simulations. We have close ties with several top groups in biological neuroscience. We are pursuing the idea that the "unreasonable effectiveness of mathematics" can be brought, as it has been in physics, to bear on neuroscience.

Modeling such assemblies of neurons and simulating their behavior involves putting together a mixture of the most recent results in neurophysiology with such advanced mathematical methods as dynamical systems theory, bifurcation theory, probability theory, stochastic calculus, theoretical physics and statistics, as well as the use of simulation tools.

We conduct research in the following main areas:

Neural network dynamics

Mean-field and stochastic approaches

Neural fields

Slow-fast dynamics in neuronal models

Modeling neuronal excitability

Synaptic plasticity

Visual neuroscience

The study of neural networks is certainly motivated by the long-term goal of understanding how the brain works. But, beyond the comprehension of the brain, or even of simpler neural systems in less evolved animals, there is also the desire to exhibit general mechanisms or principles at work in the nervous system. One possible strategy is to propose mathematical models of neural activity, at different space and time scales, depending on the type of phenomena under consideration. However, beyond the mere proposal of new models, which can rapidly result in a plethora, there is also a need to understand some fundamental keys ruling the behaviour of neural networks and, from this, to extract new ideas that can be tested in real experiments. Therefore, there is a need for a thorough analysis of these models. An efficient approach, developed in our team, consists of analysing neural networks as dynamical systems. This allows us to address several issues. A first, natural issue is to ask about the (generic) dynamics exhibited by the system when control parameters vary. This naturally leads to analysing the bifurcations occurring in the network and identifying which phenomenological parameters control these bifurcations. Another issue concerns the interplay between the neuron dynamics and the synaptic network structure.

Modeling neural activity at scales integrating the effect of thousands of neurons is of central importance for several reasons. First, most imaging techniques are not able to measure individual neuron activity (microscopic scale), but instead measure mesoscopic effects resulting from the activity of several hundred to several hundred thousand neurons. Second, anatomical data recorded in the cortex reveal the existence of structures, such as the cortical columns, with a diameter of about 50

Our group is developing mathematical and numerical methods allowing, on the one hand, to produce dynamic mean-field equations from the physiological characteristics of neural structures (neuron types, synapse types and anatomical connectivity between neuron populations) and, on the other hand, to simulate these equations; see Figure . These methods use tools from advanced probability theory, such as the theory of large deviations and the study of interacting diffusions.

Neural fields are a phenomenological way of describing the activity of populations of neurons by delayed integro-differential equations. This continuous approximation turns out to be very useful for modelling large brain areas such as those involved in visual perception. The mathematical properties of these equations and their solutions are still imperfectly known, in particular in the presence of delays, different time scales and noise.
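For reference, a typical neural field equation of the kind described above, written here in a generic Amari-type form (the kernel $w$, the firing-rate function $S$, the delay $\tau(x,y)$ and the input $I_{\mathrm{ext}}$ are placeholders, not a specific model of the team), reads:

\[
\tau_m \frac{\partial V}{\partial t}(x,t) = -V(x,t) + \int_{\Omega} w(x,y)\, S\bigl(V(y,\, t - \tau(x,y))\bigr)\, dy + I_{\mathrm{ext}}(x,t),
\]

where $V(x,t)$ is the mean activity at cortical position $x$; the space-dependent delay $\tau(x,y)$ accounts for finite propagation speed between positions $x$ and $y$, which is the source of the mathematical difficulties mentioned above.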

Our group is developing mathematical and numerical methods for analysing these equations. These methods are based upon techniques from functional analysis, bifurcation theory, equivariant bifurcation analysis, delay equations, and stochastic partial differential equations. We have been able to characterize the solutions of these neural field equations and their bifurcations, and to apply and expand the theory to account for such perceptual phenomena as edge, texture, and motion perception. We have also developed a theory of the delayed neural field equations, in particular in the case of constant delays and propagation delays that must be taken into account when attempting to model large cortical areas. This theory is based on center manifold and normal form ideas.

Neuronal rhythms typically display many different timescales; it is therefore important to incorporate this slow-fast aspect in models. We are interested in this modeling paradigm, where slow-fast point models, formulated as Ordinary Differential Equations (ODEs), are investigated in terms of their bifurcation structure and the patterns of oscillatory solutions they can produce. To gain insight into the dynamics of such systems, we use a mix of theoretical techniques, such as geometric desingularisation and centre manifold reduction, and numerical methods such as pseudo-arclength continuation. We are interested in families of complex oscillations generated by both mathematical and biophysical models of neurons, in particular so-called *mixed-mode oscillations (MMOs)*, which represent an alternation between subthreshold and spiking behaviour, and *bursting oscillations*, also corresponding to experimentally observed behaviour; see Figure . We are working on extending these results to spatio-temporal neural models.
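As an illustration of the bursting patterns mentioned above, the sketch below integrates the Hindmarsh-Rose model, a standard three-dimensional slow-fast burster. The parameter values are textbook ones chosen for illustration, not those of any paper summarised in this report.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hindmarsh-Rose model: (x, y) are the fast (spiking) variables and z is the
# slow adaptation variable (r << 1 gives the timescale separation).
I, r, s, x_R = 2.0, 0.005, 4.0, -1.6

def hr(t, u):
    x, y, z = u
    return [y - x**3 + 3 * x**2 + I - z,   # fast voltage-like variable
            1 - 5 * x**2 - y,              # fast recovery variable
            r * (s * (x - x_R) - z)]       # slow adaptation, timescale ~ 1/r

sol = solve_ivp(hr, (0.0, 1000.0), [0.0, 0.0, 0.0], max_step=0.05)
x = sol.y[0][sol.t > 200]  # discard the transient
# bursting: clusters of spikes (x > 1) alternate with quiescence (x near -1.6)
```

The slow variable z sweeps the fast (x, y) subsystem back and forth through the bifurcations that start and end each burst, which is precisely the slow-fast dissection viewpoint described above.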

Excitability refers to the all-or-none property of neurons: the ability to respond nonlinearly to an input, with a dramatic change of response from “none” (no response except a small perturbation that returns to equilibrium) to “all” (a large response with the generation of an action potential, or spike, before the neuron returns to equilibrium). The return to equilibrium may also be an oscillatory motion of small amplitude; in this case, one speaks of resonator neurons, as opposed to integrator neurons. The combination of a spike followed by subthreshold oscillations is then often referred to as mixed-mode oscillations (MMOs). Slow-fast ODE models of dimension at least three are well capable of reproducing such complex neural oscillations. Part of our research expertise is to analyse the possible transitions between different complex oscillatory patterns of this sort upon input change; in mathematical terms, this corresponds to understanding the bifurcation structure of the model. Furthermore, the shape of time series with a given oscillatory pattern can be analysed within the mathematical framework of dynamic bifurcations; see the section on slow-fast dynamics in neuronal models. The main example of abnormal neuronal excitability is hyperexcitability, and it is important to understand the biological factors which lead to such excess of excitability and to identify (both in detailed biophysical models and reduced phenomenological ones) the mathematical structures leading to these anomalies. Hyperexcitability is an important trigger of pathological brain states related to various diseases such as chronic migraine, epilepsy or even Alzheimer's disease. A central axis of research within our group is to revisit models of such pathological scenarios, using a combination of advanced mathematical tools and in partnership with biological labs.
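The all-or-none property can be demonstrated in a few lines with the FitzHugh-Nagumo model; this is a minimal sketch with illustrative parameters, not a model taken from the team's work.

```python
import numpy as np
from scipy.integrate import solve_ivp

# FitzHugh-Nagumo in an excitable regime (illustrative parameters):
#   dv/dt = v - v^3/3 - w,   dw/dt = eps * (v + a)
eps, a = 0.08, 1.3
w_rest = -a + a**3 / 3.0  # rest state is (v, w) = (-a, w_rest)

def response(delta, t_max=60.0):
    """Peak membrane potential after kicking v away from rest by delta."""
    sol = solve_ivp(lambda t, y: [y[0] - y[0]**3 / 3 - y[1],
                                  eps * (y[0] + a)],
                    (0.0, t_max), [-a + delta, w_rest], max_step=0.1)
    return sol.y[0].max()

print(response(0.3))  # subthreshold kick: decays straight back to rest
print(response(1.0))  # suprathreshold kick: full action potential
```

Below a critical kick size the trajectory relaxes back to equilibrium; above it, the trajectory makes a large excursion (a spike) before returning, which is exactly the nonlinear threshold behaviour described above.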

Neural networks show amazing abilities to evolve and adapt, and to store and process information. These capabilities are mainly conditioned by plasticity mechanisms, especially synaptic plasticity, which induce a mutual coupling between network structure and neuron dynamics. Synaptic plasticity occurs at many levels of organization and time scales in the nervous system. It is of course involved in memory and learning mechanisms, but it also alters the excitability of brain areas and regulates behavioral states (e.g., the transition between sleep and wakeful activity). Therefore, understanding the effects of synaptic plasticity on neuron dynamics is a crucial challenge.

Our group is developing mathematical and numerical methods to analyse this mutual interaction. In particular, we have shown that plasticity mechanisms, whether Hebbian-like or STDP, have strong effects on neuron dynamics, in interaction with synaptic and propagation delays, including a reduction of dynamical complexity and changes in spike statistics.

Ariane Delrocq received, on 29 November 2019, a prize from Ecole Polytechnique Paris for her research internship co-supervised by E. Deval and R. Veltz.

Metastability refers to the fact that the state of a dynamical system spends a long time in a restricted region of its available phase space before a transition takes place, bringing the system into another state from which it might recur into the previous one. Beim Graben and Hutt (2013) suggested using the recurrence plot (RP) technique introduced by Eckmann et al. (1987) for the segmentation of a system's trajectories into metastable states using recurrence grammars. Here, we apply this recurrence structure analysis (RSA) for the first time to resting-state brain dynamics obtained from functional magnetic resonance imaging (fMRI). Brain regions are defined according to the brain hierarchical atlas (BHA) developed by Diez et al. (2015); as a consequence, regions present high connectivity both in structure (obtained from diffusion tensor imaging) and in function (from the blood-oxygen-level-dependent, or BOLD, signal). Remarkably, the regions observed by Diez et al. were completely time-invariant. Here, in order to compare this static picture with the metastable system dynamics obtained from the RSA segmentation, we determine the number of metastable states as a measure of complexity for all subjects and for region numbers varying from 3 to 100. We find RSA convergence toward an optimal segmentation of 40 metastable states for normalized BOLD signals averaged over BHA modules. Next, we build bistable dynamics at the population level by pooling 30 subjects after Hausdorff clustering. In connection with this finding, we reflect on the different modeling frameworks that can allow for such scenarios: heteroclinic dynamics, dynamics with riddled basins of attraction, and multiple-timescale dynamics. Finally, we characterize the metastable states both functionally and structurally, using templates for resting-state networks (RSNs) and the automated anatomical labeling (AAL) atlas, respectively.

This work has been published in Frontiers in Computational Neuroscience and is available as .
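A minimal sketch of the recurrence-plot construction underlying the RSA method (the threshold and the toy trajectory are illustrative choices, not those used in the study):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot (Eckmann et al., 1987) of a trajectory.

    x   : array of shape (T, d) -- T time points in d dimensions
    eps : neighbourhood radius defining a recurrence
    """
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return (d < eps).astype(int)

# toy trajectory: a circle traversed twice, so every state recurs
# exactly one period later
t = np.linspace(0, 4 * np.pi, 200)
traj = np.stack([np.cos(t), np.sin(t)], axis=1)
R = recurrence_matrix(traj, eps=0.3)
```

The segmentation into metastable states is then built on top of R, grouping time indices that recur with each other into candidate metastable states via the recurrence grammars mentioned above.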

Information transmission in the human brain is a fundamentally dynamic network process. In partial epilepsy, this process is perturbed and highly synchronous seizures originate in a local network, the so-called epileptogenic zone (EZ), before recruiting other close or distant brain regions. We studied patient-specific brain network models of 15 drug-resistant epilepsy patients with implanted stereotactic electroencephalography (SEEG) electrodes. Each personalized brain model was derived from structural data of magnetic resonance imaging (MRI) and diffusion tensor weighted imaging (DTI), comprising 88 nodes equipped with region-specific neural mass models capable of demonstrating a range of epileptiform discharges. Each patient's virtual brain was further personalized through the integration of the clinically hypothesized EZ. Subsequent simulations and connectivity modulations were performed and uncovered a finite repertoire of seizure propagation patterns. Across patients, we found that (i) patient-specific network connectivity is predictive of the subsequent seizure propagation pattern; (ii) seizure propagation is characterized by a systematic sequence of brain states; (iii) propagation can be controlled by an optimal intervention on the connectivity matrix; (iv) the degree of invasiveness can be significantly reduced via the seizure control proposed here, as compared to traditional resective surgery. To stop seizures, neurosurgeons typically resect the EZ completely. We showed that stability analysis of the network dynamics, using graph-theoretical metrics, reliably estimates the spatiotemporal properties of seizure propagation. This suggests novel, less invasive paradigms of surgical intervention to treat and manage partial epilepsy.

This work has been published in PLoS Computational Biology and is available as .

We analyse the possible dynamical states emerging in two symmetrically pulse-coupled populations of leaky integrate-and-fire neurons. In particular, we observe broken-symmetry states in this set-up: namely, breathing chimeras, where one population is fully synchronized and the other is in a state of partial synchronization (PS), as well as generalized chimera states, where both populations are in PS but with different levels of synchronization. Symmetric macroscopic states are also present, ranging from quasi-periodic motions to collective chaos, and from splay states to anti-phase partial synchronization of the populations. We then investigate the influence of disorder, in the form of random link removal or noise, on the dynamics of collective solutions in this model. We observe that broken-symmetry chimera-like states, with both populations partially synchronized, persist up to 80% of broken links and up to noise amplitudes of 8% of the threshold-reset distance. Furthermore, the introduction of disorder on a symmetric chaotic state has a constructive effect, namely to induce the emergence of chimera-like states at intermediate dilution or noise levels.

This work has been published as a chapter in the book Nonlinear Dynamics in Computational Neuroscience (Springer, 2019) and is available as .
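For readers unfamiliar with the single-neuron model used above, here is a minimal leaky integrate-and-fire neuron with the threshold-reset rule; the parameters are illustrative, not those of the chapter.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, forward-Euler integration.
# With constant input I > v_th the neuron fires periodically.
def lif(I=1.5, v_th=1.0, v_reset=0.0, tau=1.0, dt=1e-3, t_max=10.0):
    v, spikes = 0.0, []
    for k in range(int(t_max / dt)):
        v += dt * (I - v) / tau   # leaky integration towards I
        if v >= v_th:             # threshold crossing: emit a spike...
            spikes.append(k * dt)
            v = v_reset           # ...and reset the membrane potential
    return spikes

spikes = lif()
# the inter-spike interval is close to tau * log(I / (I - v_th))
```

In the pulse-coupled setting of the chapter, each spike would additionally kick the membrane potentials of the neurons in the other population; the noise amplitudes quoted there are measured against the threshold-reset distance v_th - v_reset.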

We study the synchronization and stability of power grids within the Kuramoto phase oscillator model with inertia, with a bimodal frequency distribution representing the generators and the loads. We identify critical nodes through solitary frequency deviations and Lyapunov vectors corresponding to unstable Lyapunov exponents. To cure dangerous deviations from synchronization we propose time-delayed feedback control, an efficient control concept in nonlinear dynamical systems. Different control strategies are tested and compared with respect to the minimum number of controlled nodes required to achieve synchronization and Lyapunov stability. As a proof of principle, this fast-acting control method is demonstrated using a model of the German power transmission grid.

This work has been published in Physical Review E and is available as .

In the present study we consider a random network of Kuramoto oscillators with inertia in order to mimic and investigate the dynamics emerging in high-voltage power grids. The corresponding natural frequencies are assumed to be bimodally Gaussian distributed, thus modeling the distribution of both power generators and consumers: for the stable operation of power systems these two quantities must be in balance. Since synchronization has to be ensured for a perfectly working power grid, we investigate the stability of the desired synchronized state. We solve this problem numerically for a population of N rotators regardless of the level of quenched disorder present in the topology. We obtain stable and unstable solutions for different initial phase conditions, and we propose how to control unstable solutions, for sufficiently large coupling strength, such that they are stabilized for any initial phase. Finally, we examine a random Erdős–Rényi network under the impact of white Gaussian noise, which is an essential ingredient for power grids in view of increasing renewable energy sources.

This work has been published in Chaos: An Interdisciplinary Journal of Nonlinear Science and is available as .
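A minimal sketch of the Kuramoto model with inertia and a bimodal frequency distribution, as used in both power-grid studies above. Network size, coupling and damping are illustrative, and the all-to-all coupling below stands in for the random topologies of the papers.

```python
import numpy as np

# Second-order (inertial) Kuramoto model:
#   m * theta_i'' = -gamma * theta_i' + omega_i + (K/N) * sum_j sin(theta_j - theta_i)
# omega is bimodal: half "generators" (+1), half "loads" (-1).
rng = np.random.default_rng(0)
N, K, m, gamma, dt = 10, 10.0, 1.0, 2.0, 0.01
omega = np.concatenate([np.full(N // 2, 1.0), np.full(N // 2, -1.0)])

theta = rng.uniform(0.0, 2.0 * np.pi, N)
dtheta = np.zeros(N)
for _ in range(20000):  # integrate to t = 200 with explicit Euler
    coupling = (K / N) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta = theta + dt * dtheta
    dtheta = dtheta + dt * (omega - gamma * dtheta + coupling) / m

r = abs(np.exp(1j * theta).mean())  # order parameter: r near 1 means synchrony
```

For these illustrative values the coupling is strong enough that generators and loads phase-lock into two nearby clusters and r is large; lowering K below the locking threshold, or adding noise and removing links as in the papers, destroys this synchronized state.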

Gamma rhythm (20–100 Hz) plays a key role in numerous cognitive tasks: working memory, sensory processing and the routing of information across neural circuits. In comparison with lower-frequency oscillations in the brain, gamma-rhythm-associated firing of individual neurons is sparse and the activity is locally distributed in the cortex. Such a “weak” gamma rhythm results from the synchronous firing of pyramidal neurons in an interplay with the local inhibitory interneurons, in a “pyramidal-interneuron gamma” (PING) mechanism. Experimental evidence shows that individual pyramidal neurons during such oscillations tend to fire at rates below gamma, while the population shows clear gamma oscillations and synchrony. One possible explanation of these features is that the gamma oscillation is generated within local synchronous neuronal clusters. The number of such synchronous clusters defines the overall coherence of the rhythm and its spatial structure. The number of clusters in turn depends on the properties of the synaptic coupling and the intrinsic properties of the constituent neurons. We previously showed that a slow spike-frequency adaptation current in the pyramidal neurons can effectively control cluster numbers. These slow adaptation currents are modulated by endogenous brain neuromodulators such as dopamine, whose level is in turn related to cognitive task requirements. Hence we postulate that dopaminergic modulation can effectively control the clustering of weak gamma and its coherence. In this paper we study how dopaminergic modulation of network and cell properties impacts the cluster formation process in a PING network model.

This work has been accepted for publication in Communications in Nonlinear Science and Numerical Simulation and is available as .

In this work, we propose a nonlinear stochastic model of a network of stochastic spiking neurons. We heuristically derive the mean-field limit of this system. We then design a Monte Carlo method for the simulation of the microscopic system, and a finite volume method (based on an implicit upwind scheme) for the mean-field model. The finite volume method respects numerical versions of the two main properties of the mean-field model, conservation and positivity, leading to the existence and uniqueness of a numerical solution. As the size of the network tends to infinity, we numerically observe propagation of chaos and convergence from the individual description to the mean-field description. Numerical evidence is provided for the existence of a Hopf bifurcation (a signature of synchronised activity) for sufficiently high values of the connectivity.

This work has been submitted for publication and is available as .
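A Monte Carlo simulation of such a microscopic system can be sketched as follows. The intensity function f(v) = v², the kick size and all parameters are illustrative placeholders, not the specification used in the paper.

```python
import numpy as np

# N stochastic spiking neurons: neuron i fires with intensity f(v_i) = v_i**2;
# at a spike its potential resets to 0 and every neuron receives a mean-field
# kick J/N. This is a simple thinning/Euler discretisation of the jump system.
rng = np.random.default_rng(1)
N, J, dt, t_max = 500, 1.0, 1e-3, 5.0
v = rng.uniform(0.0, 1.0, N)
for _ in range(int(t_max / dt)):
    fired = rng.random(N) < (v ** 2) * dt   # spike with prob ~ f(v) * dt
    v = v + (J / N) * fired.sum()           # mean-field interaction kick
    v[fired] = 0.0                          # reset of the spiking neurons
```

As N grows, the empirical distribution of the potentials v concentrates on a deterministic density, which is the propagation of chaos and the convergence to the mean-field description observed numerically in the paper; the finite volume scheme evolves that limiting density directly.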

The AM2b model is conventionally represented by a system of differential equations. However, this model is valid only in a large-population context, and our objective is to establish several stochastic models at different scales. At a microscopic scale, we propose a pure-jump stochastic model that can be simulated exactly. As in most situations this exact simulation is not feasible, we propose approximate simulation methods of Poisson type and of diffusive type. The diffusive-type simulation method can be seen as a discretization of a stochastic differential equation. Finally, we formally present a law of large numbers and a functional central limit theorem which demonstrate the convergence of these stochastic models towards the initial deterministic model.

Coupling among neural rhythms is one of the most important mechanisms at the basis of cognitive processes in the brain. In this study we consider a neural mass model, rigorously obtained from the microscopic dynamics of an inhibitory spiking network with exponential synapses, which is able to autonomously generate collective oscillations (COs). These oscillations emerge via a supercritical Hopf bifurcation, and their frequencies are controlled by the synaptic time scale, the synaptic coupling and the excitability of the neural population. Furthermore, we show that two inhibitory populations in a master-slave configuration with different synaptic time scales can display various collective dynamical regimes: damped oscillations towards a stable focus, periodic and quasi-periodic oscillations, and chaos. Finally, when bidirectionally coupled, the two inhibitory populations can exhibit different types of theta-gamma cross-frequency coupling (CFC), namely phase-phase and phase-amplitude CFC. The coupling between theta and gamma COs is enhanced in the presence of an external theta forcing, reminiscent of the type of modulation induced in hippocampal and cortical circuits via optogenetic drive.

This work has been submitted for publication and is available as .

The conductance-based refractory density (CBRD) approach is a parsimonious mathematical-computational framework for modelling interacting populations of regular spiking neurons which, however, has not yet been extended to populations of bursting neurons. The canonical CBRD method describes the firing activity of a statistical ensemble of uncoupled Hodgkin-Huxley-like neurons (differentiated by noise) and has demonstrated its validity against experimental data. The present manuscript generalises CBRD to a population of bursting neurons; however, in this pilot computational study, we consider the simplest setting in which each individual neuron is governed by piecewise-linear bursting dynamics. The resulting population model makes use of slow-fast analysis, which leads to a novel methodology combining CBRD with the theory of multiple-timescale dynamics. The main prospect is that it opens novel avenues for mathematical exploration, as well as the derivation of more sophisticated population activity from Hodgkin-Huxley-like bursting neurons, which will make it possible to capture synchronised bursting activity in hyper-excitable brain states (e.g. at the onset of epilepsy).

This work has been published in Bulletin of Mathematical Biology and is available as .

We study the long-time behavior of the solution to a McKean-Vlasov stochastic differential equation (SDE) driven by a Poisson process. In neuroscience, this SDE models the asymptotic dynamics of the membrane potential of a spiking neuron in a large network. We prove that, for a small enough interaction parameter, any solution converges to the unique (in this case) invariant measure. To this aim, we first obtain global bounds on the jump rate and derive a Volterra-type integral equation satisfied by this rate. We then temporarily replace the interaction part of the equation by a deterministic external quantity (which we call the external current). For constant currents, we obtain convergence to the invariant measure. Using a perturbation method, we extend this result to more general external currents. Finally, we prove the result for the nonlinear McKean-Vlasov equation.

This work has been published in Stochastic Processes and their Applications and is available as .
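For orientation, a generic form of such a Poisson-driven McKean-Vlasov SDE can be sketched as follows; this display illustrates the class of models, not the exact equation of the paper:

\[
X_t = X_0 + \int_0^t b(X_s)\,ds + \alpha \int_0^t \mathbb{E}\bigl[f(X_s)\bigr]\,ds
- \int_0^t \int_0^\infty X_{s^-}\, \mathbf{1}_{\{z \le f(X_{s^-})\}}\, N(ds, dz),
\]

where $N$ is a Poisson random measure, $f$ is the jump (firing) rate, $\alpha$ is the interaction parameter, and the last term resets the membrane potential to 0 at spike times; the nonlinearity is of McKean-Vlasov type because the drift depends on the law of $X_s$ through $\mathbb{E}[f(X_s)]$.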

Low-dimensional dynamics of large networks is the focus of many theoretical works, but controlled laboratory experiments are comparatively few. Here, we discuss experimental observations on a mean-field-coupled network of hundreds of semiconductor lasers, which collectively display effectively low-dimensional mixed-mode oscillations and chaotic spiking typical of slow-fast systems. We demonstrate that this reduced dimensionality originates from the slow-fast nature of the system and from the existence of a critical manifold of the network where most of the dynamics takes place. Experimental measurements of the bifurcation parameter for different network sizes corroborate the theory.

This work has been submitted for publication and is available as .

We study the asymptotic behaviour of asymmetric neuronal dynamics in a network of Hopfield neurons. The randomness in the network is modelled by random couplings which are centered, correlated Gaussian random variables. We prove that the annealed law of the empirical measure satisfies a large deviation principle without any condition on time. We prove that the good rate function of this large deviation principle achieves its minimum value at a unique Gaussian measure which is not Markovian. This implies almost-sure convergence of the empirical measure under the quenched law. We prove that the limit equations are expressed as an infinite countable set of linear non-Markovian SDEs.

This work has been submitted for publication and is available as .

We study the asymptotic behaviour of asymmetric neuronal dynamics in a network of linear Hopfield neurons. The randomness in the network is modelled by random couplings which are centered i.i.d. random variables with finite moments of all orders. We prove that if the initial condition of the network is a set of i.i.d. random variables with finite moments of all orders, independent of the synaptic weights, then each component of the limit system is described as the sum of the corresponding coordinate of the initial condition and a centered Gaussian process whose covariance function can be described in terms of a modified Bessel function. This process is not Markovian. The convergence is in law, almost surely w.r.t. the random weights. Our method is essentially based on the CLT and the method of moments.

This work has been submitted for publication and is available as .

Consider a large number

This work has been accepted for publication in Annales de l'Institut Henri Poincaré (B) Probabilités et Statistiques and is available as .

We study localized patterns in an exact mean-field description of a spatially extended network of quadratic integrate-and-fire (QIF) neurons. We investigate conditions for the existence and stability of localized solutions, so-called bumps, and give an analytic estimate for the region of parameter space where these solutions exist, as one or more microscopic network parameters are varied. We develop Galerkin methods for the model equations, which enable numerical bifurcation analysis of stationary and time-periodic spatially extended solutions. We study the emergence of patterns composed of multiple bumps, which are arranged in a snakes-and-ladders bifurcation structure when a homogeneous or heterogeneous synaptic kernel is suitably chosen. Furthermore, we examine time-periodic, spatially localized solutions (oscillons) in the presence of external forcing, and in autonomous, recurrently coupled excitatory and inhibitory networks. In both cases we observe period-doubling cascades leading to chaotic oscillations.

This work has been submitted for publication and is available as .
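For reference, the exact mean-field limit of a globally coupled population of QIF neurons with Lorentzian-distributed excitabilities (Montbrió, Pazó and Roxin, 2015) consists of two ODEs for the firing rate $r$ and mean membrane potential $v$; the spatially extended model studied above generalizes this description through a synaptic kernel:

\[
\dot r = \frac{\Delta}{\pi} + 2 r v, \qquad
\dot v = v^2 + \bar\eta + J r - \pi^2 r^2,
\]

where $\bar\eta$ and $\Delta$ are the centre and half-width of the excitability distribution and $J$ is the synaptic coupling strength (time is rescaled by the membrane time constant).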

In the context of the Human Brain Project (HBP, see section 5.1.1.1. below), we have recruited Emre Baspinar in December 2018 for a two-year postdoc. Within MathNeuro, Emre is working on analysing slow-fast dynamical behaviours in the mean-field limit of neural networks.

In a first project, he has analysed the slow-fast structure in the mean-field limit of a network of FitzHugh-Nagumo neuron models; this mean-field limit had previously been established, but its slow-fast aspect had not been analysed. In particular, he has proved a persistence result of Fenichel type for slow manifolds in this mean-field limit, thus extending previous work by Berglund *et al.*. A manuscript is in preparation.

In a second project, he has been looking at a network of Wilson-Cowan systems whose mean-field limit is an ODE, and he has studied elliptic bursting dynamics in both the network and the limit: its slow-fast dissection, its singular limits and the role of canards. In passing, he has obtained a new characterisation of elliptic bursting via the construction of periodic limit sets using both the slow and the fast singular limits, and he has unravelled a new singular-limit scenario giving rise to elliptic bursting via a new type of torus canard orbits. A manuscript is in preparation.

Neural field models are commonly used to describe wave propagation and bump attractors at a tissue level in the brain. Although motivated by biology, these models are phenomenological in nature. They are built on the assumption that the neural tissue operates in a near-synchronous regime, and hence cannot account for changes in the underlying synchrony of patterns. It is customary to use spiking neural network models when examining within-population synchronization. Unfortunately, these high-dimensional models are notoriously hard to obtain insight from. In this paper, we consider a network of

This work has been published in Physical Review E and is available as .

The modelling of neural fields in the visual cortex involves geometrical structures which describe in mathematical formalism the functional architecture of this cortical area. The case of contour detection and orientation tuning has been extensively studied and has become a paradigm for the mathematical analysis of image processing by the brain. Ten years ago an attempt was made to extend these models by replacing orientation (an angle) with a second-order tensor built from the gradient of the image intensity, named the structure tensor. This assumption does not follow from biological observations (experimental evidence is still lacking) but from the idea that the effectiveness of texture processing with the structure tensor in computer vision may well be exploited by the brain itself. The drawback is that in this case the geometry is not Euclidean but hyperbolic, which complicates the analysis substantially. The purpose of this review is to present the methodology that was developed in a series of papers to investigate this quite unusual problem, specifically from the point of view of tuning and pattern formation. These methods, which rely on bifurcation theory with symmetry in the hyperbolic context, might be of interest for the modelling of other features such as color vision, or of other brain functions.

This work has been accepted for publication in Journal of Mathematical Neuroscience and is available as .
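For reference, the structure tensor of an image intensity $I$, in its standard computer-vision definition, is the smoothed outer product of the gradient with itself:

\[
\mathcal{T}_\sigma(x) = G_\sigma * \bigl(\nabla I \, \nabla I^{\mathsf T}\bigr)(x)
= G_\sigma * \begin{pmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{pmatrix},
\]

where $G_\sigma$ is a Gaussian smoothing kernel. It is a positive semi-definite second-order tensor whose eigenstructure encodes local orientation and anisotropy, which is why its space of values carries the hyperbolic geometry discussed above.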

We address the question of color-space interactions in the brain, by proposing a neural field model of color perception with spatial context for the visual area V1 of the cortex. Our framework reconciles two opposing perceptual phenomena, known as simultaneous contrast and chromatic assimilation. They have been previously shown to act synergistically, so that at some point in an image, the color seems perceptually more similar to that of adjacent neighbors, while being more dissimilar from that of remote ones. Thus, their combined effects are enhanced in the presence of a spatial pattern, and can be measured as larger shifts in color matching experiments. Our model supposes a hypercolumnar structure coding for colors in V1, and relies on the notion of color opponency introduced by Hering. The connectivity kernel of the neural field exploits the balance between attraction and repulsion in color and physical spaces, so as to reproduce the sign reversal in the influence of neighboring points. The color sensation at a point, defined from a steady state of the neural activities, is then extracted as a nonlinear percept conveyed by an assembly of neurons. It connects the cortical and perceptual levels, because we describe the search for a color match in asymmetric matching experiments as a mathematical projection on color sensations. We validate our color neural field alongside this color matching framework, by performing a multi-parameter regression to data produced by psychophysicists and ourselves. All the results show that we are able to explain the nonlinear behavior of shifts observed along one or two dimensions in color space, which cannot be done using a simple linear model.

This work has been published in PLoS Computational Biology and is available as .

We present a rigorous framework for the local analysis of canards and slow passages through bifurcations in a wide class of infinite-dimensional dynamical systems with time-scale separation. The framework is applicable to models where an infinite-dimensional dynamical system for the fast variables is coupled to a finite-dimensional dynamical system for the slow variables. We prove the existence of centre manifolds for generic models of this type, and study the reduced, finite-dimensional dynamics near bifurcations of (possibly) patterned steady states in the layer problem. Theoretical results are complemented with detailed examples and numerical simulations covering systems of local and nonlocal reaction-diffusion equations, neural field models, and delay differential equations. We provide analytical foundations for numerical observations recently reported in the literature, such as spatio-temporal canards and slow passages through Hopf bifurcations in spatially extended systems subject to slow parameter variations. We also provide a theoretical analysis of slow passage through a Turing bifurcation in local and nonlocal models.

This work has been submitted for publication and is available as .
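The slow-passage effect studied in this work can be illustrated in its simplest finite-dimensional form. The sketch below is an illustrative toy, not one of the paper's infinite-dimensional examples: it integrates the planar Hopf normal form dz/dt = (λ(t) + i)z while λ drifts slowly through the bifurcation value 0, and shows that the amplitude remains exponentially small well past the instantaneous Hopf point (delayed loss of stability). All names and parameter values are chosen for illustration only.

```python
import numpy as np

# Slow passage through a Hopf bifurcation (delayed loss of stability):
# a toy normal form dz/dt = (lambda(t) + i) z, with lambda drifting
# slowly from -1 towards +1. The instantaneous Hopf point is at
# lambda = 0, yet |z| stays small far beyond it.

def slow_passage(eps=0.01, lam0=-1.0, z0=1e-3, dt=1e-3, t_end=200.0):
    n = int(t_end / dt)
    t = np.arange(n) * dt
    lam = lam0 + eps * t          # slowly drifting bifurcation parameter
    z = np.empty(n, dtype=complex)
    z[0] = z0
    for k in range(n - 1):        # explicit Euler step
        z[k + 1] = z[k] + dt * (lam[k] + 1j) * z[k]
    return t, lam, np.abs(z)

t, lam, amp = slow_passage()
k_hopf = np.argmin(np.abs(lam))   # index where lambda crosses 0 (t = 100)
# The memory effect predicts take-off only near t ~ 200, where the
# accumulated integral of lambda(s) returns to zero.
print(amp[k_hopf], amp[-1])
```

The amplitude at the instantaneous bifurcation point is of order exp(-50) times its initial value, while by the end of the integration it has recovered its initial size: the delay is symmetric in the accumulated growth rate, the classical way-in/way-out balance.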

Inner hair cells (IHCs) are excitable sensory cells in the inner ear that encode acoustic information. Before the onset of hearing IHCs fire calcium-based action potentials that trigger transmitter release onto developing spiral ganglion neurones. There is accumulating experimental evidence that these spontaneous firing patterns are associated with maturation of the IHC synapses and hence involved in the development of hearing. The dynamics organising the IHCs' electrical activity are therefore of interest.

Building on our previous modelling work, we propose a three-dimensional, reduced IHC model and carry out its non-dimensionalisation. We show that there is a significant range of parameter values for which the dynamics of the reduced (three-dimensional) model map well onto the dynamics observed in the original biophysical (four-dimensional) IHC model. By estimating the typical time scales of the variables in the reduced IHC model, we demonstrate that this model can be characterised by two fast and one slow or one fast and two slow variables, depending on biophysically relevant parameters that control the dynamics. Specifically, we investigate how changes in the conductance of the voltage-gated calcium channels, as well as in the parameter corresponding to the fraction of free cytosolic calcium concentration, affect the oscillatory behaviour of the model, leading to a transition from pseudo-plateau bursting to mixed-mode oscillations. Hence, using fast-slow analysis, we are able to further our understanding of this model and reveal a path in parameter space connecting pseudo-plateau bursting and mixed-mode oscillations by varying a single parameter in the model.

This work has been accepted for publication in Communications in Nonlinear Science and Numerical Simulation and is available as .

A minimal system for parabolic bursting, whose associated slow flow is integrable, is presented and studied from the viewpoints of slow-fast bifurcation theory, qualitative analysis of its phase portrait, and numerical simulation. We focus the analysis on the spike-adding phenomenon. After a reduction to a periodically forced one-dimensional system, we uncover the link with the dips and slices first discussed by J.E. Littlewood in his famous articles on the periodically forced van der Pol system.

This work has been published in Mathematical Modelling of Natural Phenomena and is available as .
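Spike adding in parabolic bursting can be played with numerically using the classical theta-neuron (Ermentrout-Kopell) caricature, in which a slow sinusoidal drive sweeps the fast phase variable through its excitability threshold. This toy model is not the minimal system studied in the paper, and all parameter values below are illustrative.

```python
import numpy as np

# Theta neuron with a slow sinusoidal drive: a classic caricature of
# parabolic bursting. Spikes (theta crossing pi) occur only while the
# slow drive I(t) is positive; counting spikes per slow cycle is a
# crude way to observe spike adding as the drive amplitude grows.

def count_spikes(amp, eps=0.05, dt=1e-3, n_cycles=1):
    t_end = n_cycles * 2 * np.pi / eps      # whole slow cycles
    n = int(t_end / dt)
    theta, spikes = -np.pi, 0
    for k in range(n):
        I = amp * np.sin(eps * k * dt)      # slow periodic drive
        dtheta = (1 - np.cos(theta)) + (1 + np.cos(theta)) * I
        theta_new = theta + dt * dtheta
        if theta < np.pi <= theta_new:      # spike: crossing theta = pi
            spikes += 1
            theta_new -= 2 * np.pi          # wrap back onto the circle
        theta = theta_new
    return spikes

# A stronger slow drive yields more spikes per burst (spike adding).
n_weak, n_strong = count_spikes(0.5), count_spikes(2.0)
print(n_weak, n_strong)
```

Tracking the spike count as the drive amplitude varies continuously would reveal the staircase structure associated with spike-adding transitions.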

Neurons can anticipate incoming signals by exploiting a physiological mechanism that is not well understood. This article offers a novel explanation of how a receiver neuron can predict the sender's dynamics in a unidirectionally coupled configuration, in which both sender and receiver follow the evolution of a multi-scale excitable system. We present a novel theoretical viewpoint, based on a mathematical object called a canard, to explain anticipation in excitable systems. We provide a numerical approach which allows one to determine the transient effects of canards. To demonstrate the general validity of canard-mediated anticipation in the context of excitable systems, we illustrate our framework in two examples: a multi-scale radio-wave circuit (the van der Pol model), which inspired a caricature neuronal model (the FitzHugh-Nagumo model), and a biophysical neuronal model (a 2-dimensional reduction of the Hodgkin-Huxley model), where canards act as messengers of the sender's prediction. We also propose an experimental paradigm that would enable experimental neuroscientists to validate our predictions. We conclude with an outlook on possible research avenues to further unfold the mechanisms underpinning anticipation. We envisage that our approach can be employed by a wider class of excitable systems with appropriate theoretical extensions.

This work has been published in Chaos: An Interdisciplinary Journal of Nonlinear Science and is available as .
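The sender-receiver configuration can be sketched with two FitzHugh-Nagumo units, the receiver being driven diffusively by the sender's voltage. This sketch only sets up the unidirectionally coupled system and checks that both units spike; it does not reproduce the paper's canard analysis, and the coupling form and all parameter values are illustrative assumptions.

```python
import numpy as np

# Unidirectionally coupled FitzHugh-Nagumo units: a "sender" drives a
# "receiver" through its voltage variable. Both units are in the
# oscillatory regime (constant drive I = 0.5).

def fhn_pair(c=0.1, I=0.5, eps=0.08, dt=1e-3, t_end=300.0):
    n = int(t_end / dt)
    vs, ws = -1.0, -0.5            # sender state (voltage, recovery)
    vr, wr = -1.2, -0.6            # receiver state
    Vs, Vr = np.empty(n), np.empty(n)
    for k in range(n):
        # sender: classical FHN with constant drive I
        dvs = vs - vs**3 / 3 - ws + I
        dws = eps * (vs + 0.7 - 0.8 * ws)
        # receiver: same dynamics plus diffusive coupling to the sender
        dvr = vr - vr**3 / 3 - wr + I + c * (vs - vr)
        dwr = eps * (vr + 0.7 - 0.8 * wr)
        vs, ws = vs + dt * dvs, ws + dt * dws
        vr, wr = vr + dt * dvr, wr + dt * dwr
        Vs[k], Vr[k] = vs, vr
    return Vs, Vr

Vs, Vr = fhn_pair()
# Both units reach the spiking (right) branch of the cubic nullcline.
print(Vs.max(), Vr.max())
```

Comparing the spike times of the two voltage traces as the initial conditions and coupling strength vary is the kind of numerical experiment in which transient, canard-mediated anticipation can be looked for.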

In this work we have revisited a rate model that accounts for the spontaneous activity in the developing spinal cord of the chicken embryo . The dynamics is that of a classical square-wave burster, with alternation of silent and active phases. Tabak et al. have proposed two different three-dimensional (3D) models with variables representing average population activity, fast activity-dependent synaptic depression and two forms of slow activity-dependent depression. In , , various 3D combinations of these four variables have been studied further to reproduce rough experimental observations of spontaneous rhythmic activity. In this work, we have first shown the spike-adding mechanism via canards in one of these 3D models, where the fourth variable was treated as a control parameter. We have then discussed how a canard-mediated slow passage in the 4D model explains the sub-threshold oscillatory behaviour which cannot be reproduced by any of the 3D models, giving rise to mixed-mode bursting oscillations (MMBOs); see . Finally, we have related the canard-mediated slow passage to the durations of the burst and silent phases, which have been linked to the blockade of glutamatergic or GABAergic/glycinergic synapses over a wide range of developmental stages .

This work has been submitted for publication and is available as .

Cortical spreading depression (CSD) is a wave of transient intense neuronal firing leading to a long-lasting depolarizing block of neuronal activity. It is a proposed pathological mechanism of migraine with aura. Some forms of migraine are associated with a genetic mutation of the NaV1.1 sodium channel.

This work has been published in Journal of Computational Neuroscience and is available as .

The extension of this work is the topic of the PhD of Louisiane Lemaire, who started in October 2018. A first part of Louisiane's PhD has been to improve and extend the model published in in a number of ways: replace the GABAergic neuron model used in , namely the Wang-Buzsáki model, by a more recent fast-spiking cortical interneuron model due to Golomb and collaborators; implement the effect of the HM1a toxin used by M. Mantegazza to mimic the genetic mutation of sodium channels responsible for the hyperactivity of the GABAergic neurons; and take into account ionic concentration dynamics (relaxing the hypothesis of constant reversal potentials) for the GABAergic neuron as well, whereas this was previously done only for the pyramidal neuron. This required a great deal of modelling and calibration, and the simulation results are closer to the actual experiments by Mantegazza than in our previous study. A manuscript is in preparation.

We provide a uniqueness result for a class of viscosity solutions to sub-Riemannian mean curvature flows. In the sub-Riemannian setting, uniqueness cannot be deduced from the comparison principle, which is known only for graphs and for radially symmetric surfaces. Here we use a definition of continuous viscosity solutions of sub-Riemannian mean curvature flows motivated by a regularized Riemannian approximation of the flow. With this definition, we prove that any continuous viscosity solution of the equation is a limit of a sequence of solutions of the Riemannian flow, and obtain as a consequence uniqueness and the comparison principle. The results are provided in the settings of both the 3-dimensional rototranslation group

This work has been published in SIAM Journal on Mathematical Analysis and is available as .

In this paper we present a novel model of the primary visual cortex (V1) based on the orientation-, frequency- and phase-selective behavior of V1 simple cells. We start from the first-level mechanisms of visual perception: receptive profiles. The model interprets V1 as a fiber bundle over the 2-dimensional retinal plane by introducing orientation, frequency and phase as intrinsic variables. Each receptive profile on the fiber is mathematically interpreted as a rotated, frequency-modulated and phase-shifted Gabor function. We start from the Gabor function and show that it naturally induces the model geometry and the associated horizontal connectivity modeling the neural connectivity patterns in V1. We provide an image enhancement algorithm employing the model framework. The algorithm is capable of exploiting not only orientation but also frequency and phase information existing intrinsically in a 2-dimensional input image. We provide experimental results for the enhancement algorithm.

This work has been submitted for publication and is available as .
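The building block of the model, a rotated, frequency-modulated and phase-shifted Gabor function, is easy to write down explicitly. The sketch below is a minimal illustration; the parameter names and the isotropic Gaussian envelope are simplifying assumptions, not the exact parametrisation used in the paper.

```python
import numpy as np

# A rotated, frequency-modulated and phase-shifted Gabor function:
# the mathematical interpretation of a V1 simple-cell receptive
# profile, with orientation, frequency and phase as parameters.

def gabor(x, y, theta=0.0, freq=0.2, phase=0.0, sigma=3.0):
    # rotate coordinates by the preferred orientation theta
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))  # Gaussian window
    carrier = np.cos(2 * np.pi * freq * xr + phase)       # modulated wave
    return envelope * carrier

# Sample two profiles on a small symmetric grid.
ax = np.linspace(-10, 10, 41)
X, Y = np.meshgrid(ax, ax)
even = gabor(X, Y, theta=np.pi / 4, phase=0.0)        # even-symmetric cell
odd = gabor(X, Y, theta=np.pi / 4, phase=np.pi / 2)   # odd-symmetric cell
```

The phase parameter switches the profile between even and odd symmetry about the origin, which is the distinction the model exploits when phase is lifted to an intrinsic variable of the fiber bundle.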

Title: The Human Brain Project

Program: FP7

Duration: October 2013 - March 2016 (first part), April 2016 - March 2018 (second part), April 2018 - March 2020 (third part)

Coordinator: EPFL

Partners:

see the webpage of the project.

Olivier Faugeras is leading the task T4.1.3 entitled “Meanfield and population models” of the Workpackage W4.1 “Bridging Scales”.

Inria contact: Olivier Faugeras (first part), then Romain Veltz (second and third parts)

Understanding the human brain is one of the greatest challenges facing 21st century science. If we can rise to the challenge, we can gain profound insights into what makes us human, develop new treatments for brain diseases and build revolutionary new computing technologies. Today, for the first time, modern ICT has brought these goals within sight. The goal of the Human Brain Project, part of the FET Flagship Programme, is to translate this vision into reality, using ICT as a catalyst for a global collaborative effort to understand the human brain and its diseases and ultimately to emulate its computational capabilities. The Human Brain Project will last ten years and will consist of a ramp-up phase (from month 1 to month 36) and subsequent operational phases.

This Grant Agreement covers the ramp-up phase. During this phase the strategic goals of the project will be to: design, develop and deploy the first versions of six ICT platforms dedicated to Neuroinformatics, Brain Simulation, High Performance Computing, Medical Informatics, Neuromorphic Computing and Neurorobotics, and create a user community of research groups from within and outside the HBP; set up a European Institute for Theoretical Neuroscience; complete a set of pilot projects providing a first demonstration of the scientific value of the platforms and the Institute; develop the scientific and technological capabilities required by future versions of the platforms; implement a policy of Responsible Innovation and a programme of transdisciplinary education; and develop a framework for collaboration that links the partners under strong scientific leadership and professional project management, providing a coherent European approach and ensuring effective alignment of regional, national and European research programmes. The project work plan is organized in the form of thirteen subprojects, each dedicated to a specific area of activity.

A significant part of the budget will be used for competitive calls to complement the collective skills of the Consortium with additional expertise.

Title: NeuroTransmitter cycle: A Slow-Fast modeling approach

PI for Inria MathNeuro: Mathieu Desroches

International Partner (Institution - Laboratory - Researcher):

Basque Center for Applied Mathematics (BCAM) (Spain) - Mathematical, Computational and Experimental Neuroscience (MCEN) Team - Serafim Rodrigues

Start year: 2019

See also: https://

This associated team project proposes to deepen the links between two young research groups around strong neuroscience themes. It aims to start from a joint work in which we successfully modelled synaptic transmission delays for both excitatory and inhibitory synapses, matching experimental data, and to extend it in two distinct directions. On the one hand, by modelling endocytosis so as to obtain a complete mathematical formulation of the presynaptic neurotransmitter cycle, which will then be integrated within diverse neuron models (in particular interneurons), hence allowing a refined analysis of their excitability and short-term plasticity properties. On the other hand, by modelling the postsynaptic neurotransmitter cycle in connection with long-term plasticity and memory. We will incorporate these new models of synapse in different types of neuronal networks and then study their excitability, plasticity and synchronisation properties in comparison with classical models. This project will benefit from strong experimental collaborations (UCL, Alicante) and is coupled to the study of brain pathologies linked with synaptic dysfunctions, in particular certain early signs of Alzheimer's disease. Our initiative also contains a training aspect, with two PhD students involved as well as a series of mini-courses which we will propose to the partner institute on this research topic; we will also organise a "wrap-up" workshop in Sophia at the end of the project. Finally, the project is embedded within a strategic tightening of our links with Spain, with the objective of pushing towards the creation of a Southern-Europe network for Mathematical, Computational and Experimental Neuroscience, which will serve as a stepping stone to extend our influence beyond Europe.

VU Amsterdam (Netherlands), Faculty of Science, Mathematics: Daniele Avitabile

ENS Paris, Laboratoire de Neurosciences Cognitives: Boris Gutkin

University of the Balearic Islands (Spain), Dept of Applied Mathematics: Antonio Teruel

Polytechnic University of Catalunya (Spain), Dept of Applied Mathematics: Antoni Guillamon

Invitation of Nikola Popovic, University of Edinburgh (UK), April 2019

Invitation of Tomás Lázaro, Polytechnic University of Catalunya (Spain), May 2019

Ariane Delrocq (student at École Polytechnique, Paris): April - July 2019

Visit of Yuri Rodrigues and Romain Veltz to Cian O'Donnell (University of Bristol, UK) in December 2019

One-month research stay of Mathieu Desroches at BCAM (Bilbao, Spain) on an invited professor scholarship to work with Serafim Rodrigues, June-July 2019

Mathieu Desroches was one of the Program Chairs of the Waves Côte d'Azur conference, held in Nice, 4-7 June, 2019.

Mathieu Desroches was on the Scientific Committee of the Waves Côte d'Azur conference, held in Nice, 4-7 June, 2019.

Olivier Faugeras and Romain Veltz were on the Advisory Board of the 5th International Conference on Mathematical Neuroscience, held in Copenhagen (Denmark), June 24 - 26, 2019.

Olivier Faugeras is the co-editor-in-chief of the open access Journal of Mathematical Neuroscience. This journal has a 2-year Impact Factor of 2.091.

Mathieu Desroches was Guest Editor of a Special Issue on “Excitable Dynamics in Neural and Cardiac Systems” of the journal Communications in Nonlinear Science and Numerical Simulation. This journal has a 5-year Impact Factor of 3.637.

Fabien Campillo acts as a reviewer for Journal of Mathematical Biology.

Mathieu Desroches acts as a reviewer for Physica D, SIAM Journal on Applied Dynamical Systems (SIADS), PLoS Computational Biology, Chaos: An Interdisciplinary Journal of Nonlinear Science, Journal of Mathematical Biology, Journal of Neurophysiology, Journal of Mathematical Neuroscience, and Nonlinear Dynamics.

Olivier Faugeras acts as a reviewer for the Journal of Mathematical Neuroscience, the Journal of Computational Neuroscience, the SIAM Journal on Applied Dynamical Systems (SIADS).

Martin Krupa acts as a reviewer for Nonlinearity, Proceedings of the National Academy of Sciences of the USA (PNAS), and the SIAM Journal on Applied Dynamical Systems (SIADS).

Romain Veltz acts as a reviewer for Neural Computation, eLife, SIADS, PNAS, Journal of the Royal Society Interface, PLoS Computational Biology and Acta Applicandae Mathematicae.

Emre Baspinar, “A geometric model with frequency-phase and its application to image enhancement”, conference “Shape Analysis in Biology”, Sorbonne Université, Paris, November 2019.

Emre Baspinar, “A sub-Riemannian cortical model with frequency-phase and its application to orientation map construction.”, 1st meeting of the NeuroMod Institute, Fréjus (France), July 2019.

Emre Baspinar, “A sub-Riemannian model of the visual cortex based on frequency-phase and its applications” [poster], 5th International Conference on Mathematical Neuroscience, Copenhagen (Denmark), June 2019.

Emre Baspinar, “A sub-Riemannian cortical model with frequency-phase and its application to orientation map construction.”, Waves Côte d'Azur Conference, Nice (France), June 2019.

Mathieu Desroches, “Canards in excitatory networks”, XXXIXth Dynamics Days Europe Conference, Rostock (Germany), September 2019.

Mathieu Desroches, “Canards and spike-adding in neural bursters”, International Congress on Industrial and Applied Mathematics (ICIAM), Valencia (Spain), July 2019.

Mathieu Desroches, “Canards in excitatory networks”, SIAM Conference on Application of Dynamical Systems, Snowbird (USA), May 2019.

Mathieu Desroches, “Slow-fast analysis of bursting oscillations: old and new”, Bilateral International Meeting UK-France, Royal Society, Chicheley Hall, Milton Keynes (UK), February 2019.

Olivier Faugeras, “The meanfield limit of a network of Hopfield neurons with correlated synaptic weights”, Bilateral International Meeting UK-France, Royal Society, Chicheley Hall, Milton Keynes (UK), February 2019.

Louisiane Lemaire, “Modeling the initiation of cortical spreading depression triggered by the hyperactivity of GABAergic neurons” [poster], 1st meeting of the NeuroMod Institute, Fréjus (France), July 2019.

Louisiane Lemaire, “Modeling the initiation of cortical spreading depression triggered by the hyperactivity of GABAergic neurons” [poster], 5th International Conference on Mathematical Neuroscience, Copenhagen (Denmark), June 2019.

Martin Krupa, “Neuronal mechanisms for sequential activation of memory items”, International Congress on Industrial and Applied Mathematics (ICIAM), Valencia (Spain), July 2019.

Martin Krupa, “Modeling cortical spreading depression induced by the hyperactivity of interneurons”, Waves Côte d'Azur Conference, Nice (France), June 2019.

Simona Olmi, “Cross frequency coupling in next generation inhibitory neural mass models”, Eugene Wigner Colloquium (Physics Colloquium), TU Berlin, Berlin (Germany), December 2019.

Simona Olmi, “Cross frequency coupling in next generation inhibitory neural mass models”, XXXIXth Dynamics Days Europe Conference, Rostock (Germany), September 2019.

Simona Olmi, “Influence of network topology on spreading of epileptic seizure”, Waves Côte d'Azur Conference, Nice (France), June 2019.

Simona Olmi, “Influence of network topology on spreading of epileptic seizure”, XXIV Convegno Nazionale di Fisica Statistica e dei Sistemi Complessi, Parma (Italy), June 2019.

Simona Olmi, “Enhancing power grid synchronization and stability through time delayed feedback control”, EPS conference “Statistical Physics of Complex Systems”, Stockholm (Sweden), May 2019.

Simona Olmi, “The Kuramoto model with inertia: from fireflies to power grids”, School and Workshop on Patterns of Synchrony: Chimera States and Beyond, Trieste (Italy), May 2019.

Yuri Rodrigues, “A stochastic model of postsynaptic plasticity based on dendritic spine Ca2+”.

Yuri Rodrigues, “Towards a stochastic model of excitatory synapse”, 1st meeting of the NeuroMod Institute, Fréjus (France), July 2019.

Yuri Rodrigues, “A stochastic model of postsynaptic plasticity based on dendritic spine Ca2+”.

Halgurd Taher, “Heuristic mean-field model for short term synaptic plasticity” [poster], XXXIXth Dynamics Days Europe Conference, Rostock (Germany), September 2019.

Halgurd Taher, “Enhancing power grid synchronization and stability through time delayed feedback control”, XXXIXth Dynamics Days Europe Conference, Rostock (Germany), September 2019.

Halgurd Taher, “Mean-field model for short-term synaptic plasticity” [poster], 1st meeting of the NeuroMod Institute, Fréjus (France), July 2019.

Romain Veltz, “Dynamics of a mean field limit of interacting 2D nonlinear stochastic spiking neurons”, 5th International Conference on Mathematical Neuroscience, Copenhagen (Denmark), June 2019.

Romain Veltz, “Analysis of a mean field of 2d spiking neurons, theory and numerics”, Bilateral International Meeting UK-France, Royal Society, Chicheley Hall, Milton Keynes (UK), February 2019.

Fabien Campillo was a member of the local committee in charge of the scientific selection of visiting scientists (Comité NICE).

Mathieu Desroches was on the Advisory Board of the Complex Systems Academy of the UCA.

Olivier Faugeras was the President of the study group “Intelligence artificielle” of the Académie des Sciences de Paris. As such, he led the audition of experts in this research field, namely, for 2019, Jean Ponce, Stéphane Mallat and Francis Bach. This study group has also produced a report for the 2019 G7 meeting.

Master / Doctorat: Fabien Campillo, Introduction to Piecewise Deterministic Markov Processes and applications to Neuroscience, 10 hours, Basque Center for Applied Mathematics (BCAM), Bilbao, Spain.

Master : Mathieu Desroches, Modèles Mathématiques et Computationnels en Neuroscience (Lectures, example classes and computer labs), 30 hours, M1 (BIM), Sorbonne Université, Paris, France.

Master / Doctorat: Mathieu Desroches, Slow-fast dynamics in bursting neurons, 3-hour tutorial, ICMNS Conference, June 2019.

Master: Romain Veltz, Mathematical Methods for Neurosciences, 20 hours, M2 (MVA), Sorbonne Université, Paris, France.

PhD in progress : Louisiane Lemaire, “Multi-scale mathematical modeling of cortical spreading depression”, started in October 2018, co-supervised by Mathieu Desroches and Martin Krupa.

PhD in progress: Yuri Rodrigues, “Towards a model of post synaptic excitatory synapse”, started in March 2018, co-supervised by Romain Veltz and Hélène Marie (IPMC, Sophia Antipolis).

PhD in progress: Halgurd Taher, “Next generation neural-mass models”, started in November 2018, co-supervised by Simona Olmi and Mathieu Desroches.

PhD in progress: Quentin Cormier, “Biological spiking neural networks", started in September 2017, co-supervised by Romain Veltz and Etienne Tanré (Inria TOSCA).

PhD in progress: Pascal Helson, “Study of plasticity laws with stochastic processes”, started in September 2016, co-supervised by Romain Veltz and Etienne Tanré (Inria TOSCA).

PhD in progress: Samuel Nyobe, “Inférence dans les modèles de Markov cachés : Application en foresterie”, started in October 2017, co-supervised by Fabien Campillo, Serge Moto (University of Yaoundé, Cameroon) and Vivien Rossi (CIRAD).

Fabien Campillo was reviewer and member of the jury of the PhD of Nicolas Thomas (École doctorale des sciences mathématiques de Paris centre (ED 386)), thesis of Sorbonne Université entitled “Stochastic numerical methods for Piecewise Deterministic Markov Processes. Applications in Neuroscience”, 20 June 2019.

Mathieu Desroches was examiner and member of the Jury of the HDR of Arnaud Tonnelier (Inria Tripop, Grenoble) entitled “Piecewise linear dynamical systems and excitability”, Inria Rhône Alpes, 20 June 2019.

Mathieu Desroches was reviewer and member of the Jury of the PhD of Marina Esteban (University of Seville, Spain) entitled “Dynamics and Bifurcations of Nonlinear Systems with Hysteresis”, University of Seville (Spain), 4 March 2019.

Mathieu Desroches was member of the Jury of the PhD of Gabriela Capo Rangel (Basque Center for Applied Mathematics, Bilbao) entitled “Computational predictive modeling of integrated cerebral metabolism, electrophysiology and hemodynamics”, University of the Basque Country (Leioa, Spain), 12 February 2019.