Section: New Results
Inverse problems
Participants : Nicolas Bredèche, Antoine Cornuéjols, Alexandre Devert, Mohamed Jebalia, Matthieu Pierres, Marc Schoenauer, Michèle Sebag, Vijay Pratap Singh.
Abstract:
Inverse Problems (IP) aim to determine unknown causes based on the observation of their effects. In contrast, direct problems are concerned with computing the effects of (exhaustively described) causes. Inverse problems are often mathematically ill-posed, in the sense that the existence, uniqueness and stability of solutions cannot be guaranteed.
IPs are present in many areas of science and engineering, such as mechanical engineering, meteorology, heat transfer, electromagnetism, materials science, etc. The TAO project has focused on system identification problems: modeling physical (mechanical, chemical, biological, etc.) phenomena from available observations and current theories.
The key issue in Inverse Problems is the choice of the search space, i.e., in Evolutionary Computation terminology, of the representation.
Topological Optimum Design
Several results have been obtained by TAO team members (Marc Schoenauer and his former PhD student Hatem Hamda, who defended in 2003) regarding the Topological Optimum Design of Mechanical Structures using so-called Voronoi representations for structures, which overcome many limitations of the more widely used bit-array representation (H. Hamda, F. Jouve, E. Lutton, M. Schoenauer and M. Sebag. Compact Unstructured Representations in Evolutionary Topological Optimum Design. Applied Intelligence, 16, pp 139–155, 2002). In collaboration with EZCT Architecture, we have worked on the elementary benchmark of architectural schools: the design of a chair. The results of this work were shown in the Innovative Design Techniques section of ArchiLab, an architectural exhibition held in Orléans in Fall 2004, and published in an architectural journal [9]. This cooperation is now being formalized through a research contract with EZCT, and Alexandre Devert, who has just started his PhD under the supervision of Marc Schoenauer and Nicolas Bredèche, began working on the automatic building of construction plans during his Master 2 [48]: compared to previous work, this original approach ensures the constructibility of the resulting structure. The continuation of this work has been accepted for publication at EuroGP'06 [22], and the work will now focus on technologically viable designs, with the mid-term objective of small-scale industrialisation through the EZCT collaboration.
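The core idea of the Voronoi representation can be sketched in a few lines: the genome is a variable-length list of labelled sites, and a structure is obtained by giving each cell of the design domain the label of its nearest site. The sketch below is purely illustrative (the function names and the binary material/void labelling are our assumptions, not the published encoding):

```python
import random

def decode_voronoi(sites, width, height):
    """Decode a Voronoi genome into a material/void bitmap.

    `sites` is a list of (x, y, is_material) tuples; each grid cell takes
    the label of its nearest site, i.e. of the Voronoi cell it falls in.
    Genome length is thus independent of grid resolution, which is the
    main advantage over a direct bit-array representation.
    """
    grid = [[False] * width for _ in range(height)]
    for j in range(height):
        for i in range(width):
            nearest = min(sites, key=lambda s: (s[0] - i) ** 2 + (s[1] - j) ** 2)
            grid[j][i] = nearest[2]
    return grid

# A random genome of 5 labelled sites, decoded on a 16x8 grid.
random.seed(0)
sites = [(random.uniform(0, 16), random.uniform(0, 8), random.random() < 0.5)
         for _ in range(5)]
grid = decode_voronoi(sites, 16, 8)
```

Variation operators then act on the sites themselves (moving, adding or removing them), and the decoded bitmap is handed to the mechanical solver.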
On the other hand, Mohamed Jebalia continues his PhD work under Marc Schoenauer's supervision, investigating other representations, based on Genetic Programming, for the same problem, in pursuit of the holy grail of modularity in the representation. A good example of such modularity is given by crane structures, where the same elements are duplicated to build the complete crane – whereas an evolutionary algorithm must ``discover'' that structure again and again.
Geologically sound representation for seismic inversion
Vijay Pratap Singh's PhD thesis in geophysics, funded by IFP: Représentations géophysiquement fondées pour la reconstruction du profil de vitesses du sous-sol par algorithmes évolutionnaires (geophysically grounded representations for the reconstruction of the underground velocity profile by evolutionary algorithms), is a very good example of why domain knowledge must be included in the representation itself – and of how it can be. This inverse problem of geophysics aims at identifying underground characteristics from recorded seismic data. A previous PhD (F. Mansanné, Université de Pau, 2000) had used the same Voronoi representation proposed for the Structural Design problem, obtaining some satisfactory results ... but many other results, though actually optimizing the target Least Squares objective function, were geophysically absurd (as any 7-year-old geophysicist would have immediately noticed). The proposed representation now evolves an initial state of the layered underground, together with the geological conditions across the geological ages, and tries to fit the resulting underground to the seismic data. Such a representation at least ensures the geophysical relevance of the identified underground. First results, on the purely geological problem, demonstrated the power of this approach [41], [40]. Ongoing work is concerned with the complete geophysical problem.
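To make the idea concrete, here is a deliberately simplified sketch of such a geologically grounded genome: an initial layer stack plus a sequence of geological events, decoded into a velocity-versus-depth column. The event kinds, units and names are illustrative assumptions, not the actual encoding of the thesis:

```python
def decode_underground(layers, events, depth_cells, cell=100.0):
    """Decode a geologically grounded genome into a velocity column.

    `layers` is the initial stack as (thickness_m, velocity_m_per_s)
    pairs, oldest (deepest) first; `events` are geological episodes
    applied in order -- two toy kinds here: ('deposit', thickness,
    velocity) adds a layer on top, and ('erode', thickness) removes
    material from the top.  Evolving such a history, rather than an
    arbitrary velocity map, keeps every candidate geologically plausible.
    """
    stack = list(layers)
    for ev in events:
        if ev[0] == 'deposit':
            stack.append((ev[1], ev[2]))
        elif ev[0] == 'erode':
            budget = ev[1]
            while budget > 0 and stack:
                thickness, velocity = stack.pop()
                if thickness > budget:
                    stack.append((thickness - budget, velocity))
                    break
                budget -= thickness
    # Sample the final stack top-down into fixed-size depth cells,
    # the grid a seismic forward solver would work on.
    column = []
    for thickness, velocity in reversed(stack):
        column.extend([velocity] * int(thickness // cell))
    return column[:depth_cells]

# Two initial layers, then deposition of a younger, slower layer on top.
profile = decode_underground([(300, 2000), (200, 1500)],
                             [('deposit', 100, 1200)], depth_cells=6)
```

The fitness of such a genome is then the misfit between the seismograms simulated on the decoded column and the recorded seismic data.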
Representations for isotherm law in chromatography
In the framework of the ACI NIM Nouvelles Interfaces des Mathématiques, Marc Schoenauer is part of the Chromalgema project, whose aim is the identification of the isotherm function in analytical chromatography. This is an inverse problem for which the direct problem is solved by standard numerical approaches (e.g. the Godunov scheme for Non-Linear Hyperbolic Systems).
When the unknown isotherm function is sought as a rational fraction of the concentrations (e.g. in the so-called ``Langmuir'' model), the inverse problem amounts to parametric optimization. It can then be solved by Evolution Strategies, coupled with standard deterministic gradient-based methods. A recent improvement has been brought by replacing the now-standard ``self-adaptive'' ES with the recent CMA-ES (see section 6.2.1).
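As an illustration of this parametric formulation, the sketch below fits the two parameters of a Langmuir-type isotherm q(c) = qmax·k·c/(1 + k·c) to synthetic measurements by least squares. A toy (1+1)-ES with a crude success-based step-size rule stands in for the self-adaptive ES and CMA-ES actually used in the project; all names and constants are our assumptions:

```python
import random

def langmuir(c, qmax, k):
    """Langmuir isotherm: adsorbed amount as a function of concentration."""
    return qmax * k * c / (1.0 + k * c)

def fit_isotherm(data, iters=2000, seed=1):
    """Least-squares fit of (qmax, k) with a (1+1)-ES.

    `data` is a list of (concentration, measurement) pairs.  The step
    size grows after a successful mutation and shrinks otherwise, a
    crude stand-in for the 1/5th success rule.
    """
    rng = random.Random(seed)
    err = lambda p: sum((langmuir(c, *p) - q) ** 2 for c, q in data)
    parent, sigma = [1.0, 1.0], 0.5
    f_parent = err(parent)
    for _ in range(iters):
        # Mutate both parameters, keeping them strictly positive.
        child = [max(1e-9, x + sigma * rng.gauss(0, 1)) for x in parent]
        f_child = err(child)
        if f_child <= f_parent:   # success: accept and widen the step
            parent, f_parent = child, f_child
            sigma *= 1.22
        else:                     # failure: shrink the step
            sigma *= 0.95
    return parent, f_parent

# Synthetic measurements from a known isotherm (qmax=2, k=0.5).
data = [(c, langmuir(c, 2.0, 0.5)) for c in [0.1, 0.5, 1, 2, 5, 10]]
params, residual = fit_isotherm(data)
```

Each evaluation of `err` would, in the real application, involve a full run of the direct chromatography solver.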
A more complex situation arises when no known model is available for the isotherm function. Two innovative representations have been proposed for that case:

Evolving a set of virtual examples: from those examples, the isotherm is computed by a Support Vector Machine algorithm. This representation makes it possible to take into account actual measurements made by the chemical engineers, by including them as fixed data in the set of examples (another representation allowing the user to specify known points for the target law has been implemented in collaboration with Carlos Kavka, in the framework of fuzzy controllers for robots – see the Evolutionary Robotics section). Moreover, using SVMs should scale up to a large number of components.

Using Genetic Programming: Mohamed Jebalia is now working on using GP to represent the unknown isotherm function, trying to identify both the degree of the model and its parameters.
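The first of these representations can be sketched as follows: the genome is a set of (concentration, value) points, the fixed real measurements are appended to it, and a regressor fitted through the union yields the candidate isotherm. To keep the sketch self-contained, a Nadaraya-Watson kernel smoother stands in for the SVM regression of the actual work; all names and constants are illustrative:

```python
import math

def make_isotherm(virtual_points, measurements, gamma=2.0):
    """Build an isotherm function from an evolved set of virtual points.

    The genome is `virtual_points`, a list of (concentration, value)
    pairs; `measurements` holds the chemist's fixed data points and is
    simply appended to the training set, so the real measurements
    always influence the fit.  A Nadaraya-Watson kernel smoother plays
    the role of the SVM regressor here.
    """
    points = list(virtual_points) + list(measurements)

    def isotherm(c):
        # Weighted average of all training values, with RBF weights.
        weights = [math.exp(-gamma * (c - x) ** 2) for x, _ in points]
        total = sum(weights)
        return sum(w * y for w, (_, y) in zip(weights, points)) / total

    return isotherm

# Two evolved virtual points plus one fixed measurement at c = 1.0.
iso = make_isotherm([(0.0, 0.0), (2.0, 1.6)], [(1.0, 1.0)])
```

Evolution then moves the virtual points so that the resulting isotherm, plugged into the direct solver, best reproduces the chromatograms.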
Variable selection in chemical engineering
Anne Auger and Marc Schoenauer are also members of another project in the ACI NIM framework, Contrôle quantique (quantum control), headed by Claude Le Bris (Cermics, ENPC). Anne Auger addressed the optimization (using Evolution Strategies) of the laser characteristics to better align the molecules in order to control a chemical reaction. But one of the key issues in quantum control is that only a few among the many variables are actually useful for control. Ongoing work is concerned with the identification of those variables – and hence builds a bridge with what the Machine Learning field calls Feature Selection. The particular context of optimization should allow the use of mixed techniques pertaining to both Evolutionary Computation (e.g. monitoring the standard deviations associated with each variable to detect the ``good'' ones) and Machine Learning (e.g. ``learning'' to discriminate between the good and the bad points, and using Data Mining Feature Selection to eliminate non-discriminant variables).
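The first of these mixed techniques, monitoring per-variable step sizes, can be sketched as follows: a self-adaptive (1,λ)-ES carries one step size per variable, and the step sizes of variables the objective actually depends on behave very differently, as the search converges, from those of inert variables. The toy objective and all constants below are our assumptions:

```python
import math
import random

def flag_useful_variables(f, dim, gens=100, lam=20, seed=3):
    """Return per-variable step sizes after a self-adaptive (1,lam)-ES run.

    One step size per variable is self-adapted by log-normal mutation;
    monitoring the final step sizes hints at which variables `f` is
    actually sensitive to, since only those feel selection pressure.
    """
    rng = random.Random(seed)
    x = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
    sigma = [0.3] * dim
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(dim))   # per-coordinate rate
    tau0 = 1.0 / math.sqrt(2.0 * dim)             # global rate
    for _ in range(gens):
        best = None
        for _ in range(lam):
            common = tau0 * rng.gauss(0, 1)
            s = [si * math.exp(common + tau * rng.gauss(0, 1)) for si in sigma]
            y = [xi + sj * rng.gauss(0, 1) for xi, sj in zip(x, s)]
            fy = f(y)
            if best is None or fy < best[0]:
                best = (fy, y, s)
        _, x, sigma = best          # comma selection: best of lam offspring
    return sigma

# Toy objective: only the first two of five variables matter.
sigmas = flag_useful_variables(lambda v: v[0] ** 2 + v[1] ** 2, 5)
```

In the quantum-control setting, the variables flagged this way would be candidates for the useful control parameters.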
The first part of this work program has now been tackled by Mikhail Zaslavskiy during his Stage d'Option at the École Polytechnique, who proposed several ways to identify useful variables, both ex abrupto and within the evolutionary optimization process (CMA-ES in this case) [53].
Automatic design of mesh topologies
Another closely related Inverse Problem is known as Feature Construction: mapping the problem at hand onto a space more amenable to its resolution. The Feature Construction problem, which subsumes Feature Selection, is central to Artificial Intelligence in general, and to Machine Learning and Data Mining in particular.
The problem of Feature Construction is studied on an application concerned with characterizing good meshes for numerical engineering, particularly the design of 3D meshes in the aerospace industry (PhD of Matthieu Pierres, an Airbus CIFRE, co-advised by Marc Schoenauer and Michèle Sebag). This challenging real-world application involves relational issues (a mesh is but a set of finite elements and their relations) and probabilistic issues (as usual for real-world applications, the solution is to be sought as a trade-off between conflicting logical rules).
In order to characterize ``what is a good mesh'', a representation had to be designed that efficiently represents, for the subsequent learning task, very different meshes – in particular, meshes with very different numbers of nodes, edges, surfaces and elements. This has been achieved by working at the level of a block (or element). All blocks are described by a fixed set of features, which makes it possible to use any learning algorithm to discriminate good blocks from bad ones ... provided examples of bad blocks are available. Hence a first task was to build ``plausible'' bad blocks. Standard learning algorithms can then be applied to characterize good blocks. A good mesh is then a mesh that contains only good blocks, or whose blocks are as good as possible. The next step will be to start generating topologies automatically. First, a parametrized script will be used, and only its parameters will be optimized. Later, the complete topology will be designed by the Evolutionary Algorithm, using a variable-length representation – the series of ``cuts'' that are to be made to generate a complete topology.
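The pipeline above (fixed-length block features, artificially degraded ``bad'' blocks, then any discriminating learner) can be sketched in a few lines. A nearest-centroid rule stands in for the learner, and both the two features and the degradation scheme are illustrative assumptions:

```python
import random

def degrade(block, rng):
    """Build a 'plausible' bad block by degrading a good one's features."""
    return [x + rng.uniform(1.0, 3.0) for x in block]

def nearest_centroid(good, bad):
    """Train a minimal good/bad block classifier (nearest-centroid rule).

    Every block, whatever mesh it comes from, is described by the same
    fixed-length feature vector (say, aspect ratio and skewness), so
    the same classifier applies to meshes of any size.
    """
    def centroid(rows):
        return [sum(col) / len(rows) for col in zip(*rows)]
    cg, cb = centroid(good), centroid(bad)

    def is_good(block):
        dg = sum((a - b) ** 2 for a, b in zip(block, cg))
        db = sum((a - b) ** 2 for a, b in zip(block, cb))
        return dg <= db
    return is_good

rng = random.Random(7)
good = [[1.0 + rng.gauss(0, 0.1), 0.5 + rng.gauss(0, 0.1)] for _ in range(20)]
bad = [degrade(b, rng) for b in good]
classify = nearest_centroid(good, bad)
# A mesh's quality can then be summarized as its fraction of good blocks.
quality = sum(classify(b) for b in good) / len(good)
```

The same block-level scoring is what a parametrized script, and later an evolutionary algorithm, would optimize when generating topologies automatically.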