## Section: New Results

### Reservoir Computing

Participants : Hélène Paugam-Moisy, Nicolas Bredèche, Alexandre Devert, Fei Jiang, Cédric Hartland, Miguel Nicolau, Marc Schoenauer, Michèle Sebag.

Reservoir Computing is concerned with large Neural Networks where only some macro-parameters of the topology and the weights are specified, the actual network and weights being generated randomly within the constraints set by those macro-characteristics. The work on Reservoir Computing at TAO – or more precisely, in 2008, on Echo State Networks (ESNs) – mainly involves Fei Jiang's PhD work and Miguel Nicolau's post-doc work within the GENNETEC European project. This on-going work on Neural Network topologies is presently being boosted by the arrival in September 2008 of Hélène Paugam-Moisy, in “délégation” at INRIA from Université Lyon 2 – hence the separation of this SIG from the Complex System SIG to which this work originally belonged.
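To illustrate this setting, a reservoir can be drawn at random from a handful of macro-parameters. The sketch below (a minimal illustration; the parameter names and values are assumptions, not those used at TAO) generates a sparse random internal weight matrix from a size, a connection density and a weight range, then rescales it toward the echo-state regime:

```python
import numpy as np

def make_reservoir(n_units=100, density=0.1, weight_range=1.0, seed=0):
    """Randomly generate a reservoir weight matrix from macro-parameters
    (illustrative names): size, connection density, internal weight range."""
    rng = np.random.default_rng(seed)
    mask = rng.random((n_units, n_units)) < density
    weights = rng.uniform(-weight_range, weight_range, (n_units, n_units))
    w = np.where(mask, weights, 0.0)
    # Rescale so the spectral radius is below 1, a common sufficient
    # condition associated with the echo-state property.
    rho = np.max(np.abs(np.linalg.eigvals(w)))
    return w * (0.9 / rho)

W = make_reservoir()
```

Two reservoirs generated this way share the same macro-characteristics yet differ in their actual topology, which is precisely the source of the performance differences investigated below.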

#### Echo State Networks

Fei Jiang's PhD, co-supervised by Hugues Berry (EPI Alchemy) and Marc Schoenauer, deals with the optimization of the topology of large Neural Networks. His work in 2008 turned toward ESNs and resulted in a paper at the PPSN conference [23], where he investigates the Evolutionary Reinforcement Learning of ESNs, using the CMA-ES algorithm to optimize the out-going weights: in the Reinforcement Learning context, the optimization problem is no longer quadratic. Note that this work was first published as a poster at GECCO [24]. On-going work is concerned with identifying the reasons why some ESNs perform significantly better than others, even though they have been generated with the same macro-characteristics (density of connections, range of internal weights). The goal is to identify other descriptors of the topology that would explain those differences.
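In that Reinforcement Learning setting only the out-going (readout) weights are optimised, and the reward is a black-box, non-quadratic function of those weights, which is what motivates derivative-free search such as CMA-ES. The toy sketch below substitutes a plain (1+1)-ES with a 1/5th-rule-style step-size adaptation for full CMA-ES, purely to keep the example self-contained; the reward function and all names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n_out = 8  # number of readout weights (illustrative)

def reward(w):
    # Hypothetical non-quadratic episodic reward of the readout weights,
    # standing in for the return of a Reinforcement Learning episode.
    return -np.sum(np.abs(w - 0.5)) - 0.1 * np.sin(5 * w).sum()

w = rng.normal(size=n_out)   # initial readout weights
r0 = reward(w)
best, sigma = r0, 0.5
for _ in range(2000):
    cand = w + sigma * rng.normal(size=n_out)  # Gaussian mutation
    r = reward(cand)
    if r > best:              # (1+1) selection: keep the better readout
        w, best = cand, r
        sigma *= 1.1          # widen the search after a success
    else:
        sigma *= 0.98         # shrink it after a failure
```

CMA-ES additionally adapts a full covariance matrix of the mutation distribution, which matters when the readout weights are strongly correlated; the selection-and-adaptation loop, however, has the same shape as above.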

Several other on-going research efforts at TAO are also indirectly concerned with ESNs, which they use, together with CMA-ES for learning the weights, as a basic tool. Cédric Hartland uses ESNs as robot controllers, studying their memory capacities (paper submitted). Alexandre Devert uses ESNs as basic controllers for his Continuous Cellular Automata approaches [14] (PhD to be defended in early January 2009).

#### Genetic Regulatory Network models

Within the GENNETEC project, TAO is studying Genetic Regulatory Networks (GRNs) as possible models for network generation. A first work demonstrated that networks designed using Banzhaf's model for GRNs (W. Banzhaf, Artificial Regulatory Networks and Genetic Programming, in R. Riolo, Ed., *Genetic Programming Theory and Practice 2003*, pp 43–62, Kluwer) are far more evolvable than randomly-generated networks when it comes to evolving them toward specific topological properties, be they scale-free properties [25] (a journal paper is also submitted) or small-world properties [44]. Further work will check whether such a generative process can be used to optimize the computational properties of ESNs, as described above, and whether a bridge can be built toward Genetic Programming, as initially advocated by W. Banzhaf.
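To convey the flavour of such a generative process, the toy sketch below uses a heavily simplified Banzhaf-style scheme (not his actual bit-string encoding; all constants and names are illustrative): each gene carries a "protein" signature and a regulatory "site", and the strength of the edge from gene i to gene j depends on how well protein i matches site j, so that a random genome deterministically induces a network whose topology can then be examined or evolved:

```python
import numpy as np

rng = np.random.default_rng(2)
n_genes, n_bits = 16, 32
# Each gene: a bit-string protein signature and a regulatory site.
proteins = rng.integers(0, 2, (n_genes, n_bits))
sites = rng.integers(0, 2, (n_genes, n_bits))

# Matching score of protein i against site j: number of complementary
# bits (XOR), in the spirit of template matching.
match = (proteins[:, None, :] ^ sites[None, :, :]).sum(axis=2)

# Exponential weighting of above-average matches yields a weighted
# regulatory network; thresholding extracts its topology.
w = np.exp(0.2 * (match - match.mean()))
adjacency = w > np.quantile(w, 0.8)  # keep roughly the strongest 20% of edges
```

Evolving the genome (here, the bit-strings) rather than the network itself is what makes the representation interesting: small genomic mutations induce correlated, structured changes in the resulting topology.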