Albert Benveniste has been elected IFAC Fellow.

Hycomes was created as a new team of the Inria Rennes - Bretagne
Atlantique research center in July 2013. The team builds upon
the most promising results of the S4 project-team.

Systems industries today make extensive use of mathematical modeling tools to design computer-controlled physical systems. This class of tools addresses the modeling of physical systems with models that are simpler than usual scientific computing problems, by using only Ordinary Differential Equations (ODEs) and difference equations, not Partial Differential Equations (PDEs). This family of tools first emerged in the 1980s with SystemBuild by MatrixX (now distributed by National Instruments), followed soon after by Simulink by The MathWorks, with an impressive subsequent development.
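By way of illustration, the numerical core of such a tool can be sketched in a few lines. The sketch below uses a deliberately naive fixed-step forward Euler scheme; production tools rely on adaptive, higher-order solvers, so this is only an idealization of what happens under the hood.

```python
# Minimal sketch of what an ODE-based modeling tool does under the hood:
# numerically integrate dx/dt = f(x, t) with a fixed-step forward Euler
# scheme. Real tools (Simulink, MATRIXx) use adaptive, higher-order solvers.

def euler(f, x0, t0, t1, h):
    """Integrate dx/dt = f(x, t) from t0 to t1 with step h."""
    x, t = x0, t0
    trajectory = [(t, x)]
    while t < t1:
        x = x + h * f(x, t)   # forward Euler update
        t = t + h
        trajectory.append((t, x))
    return trajectory

# Example: dx/dt = -x, x(0) = 1; the exact solution is exp(-t).
traj = euler(lambda x, t: -x, 1.0, 0.0, 1.0, 0.001)
print(abs(traj[-1][1] - 0.36788) < 1e-3)  # close to exp(-1)
```

The discretization step `h` is exactly the kind of parameter whose influence on the simulated behavior the rest of this section is concerned with.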

In the early 90's control scientists from the University of Lund
(Sweden) realized that the above approach did not support component
based modeling of physical systems with
reuse

Although these tools are now widely used by many engineers, they raise a number of technical difficulties. The meaning of some programs, their mathematical semantics, can be tainted with uncertainty. A main source of difficulty lies in the failure to properly handle the discrete and continuous parts of systems, and their interaction. How should the propagation of mode changes and resets be handled? How can one avoid artifacts due to the use of a global ODE solver, which causes unwanted coupling between seemingly non-interacting subsystems? Also, the mixed use of an equational style for the continuous dynamics and an imperative style for the mode changes and resets is a source of difficulty when handling parallel composition. It is therefore not uncommon for tools to return complex warnings for programs, with many different suggested hints for fixing them. Yet, these “pathological” programs can still be executed, if so desired, giving surprising results — see for instance the Simulink examples in , , .

Indeed, this area suffers from the same difficulties that led to the development of the theory of synchronous languages in the 1980s, as an effort to fix obscure compilation schemes for discrete-time equation-based languages. Our vision is that hybrid systems modeling tools deserve efforts in theory similar to those synchronous languages received for the programming of embedded systems.

System companies, such as automotive and aeronautic companies, are facing significant difficulties due to the exponentially rising complexity of their products, coupled with increasingly tight demands on functionality, correctness, and time-to-market. The cost of being late to market, or of imperfections in the products, is staggering, as witnessed by the recalls and delivery delays that many major car and airplane manufacturers have had to bear in recent years. The root causes of these design problems are complex and relate to a number of issues, ranging from design processes and relationships between departments of the same company and with suppliers, to incomplete requirement specification and testing.

We believe the most promising means to address the challenges in systems engineering is to employ structured and formal design methodologies that seamlessly and coherently combine the various viewpoints of the design space (behavior, space, time, energy, reliability, ...), that provide the appropriate abstractions to manage the inherent complexity, and that can provide correct-by-construction implementations. The following technology issues must be addressed when developing new approaches to the design of complex systems:

The overall design flows for heterogeneous systems, and the associated use of models across traditional boundaries, are not well developed and understood. Relationships between different teams within the same company, or between different stake-holders in the supply chain, are not well supported by solid technical descriptions of the mutual obligations.

System requirements capture and analysis is in large part a heuristic process, where the informal text and natural language-based techniques in use today are facing significant challenges. Formal requirements engineering is in its infancy: mathematical models, formal analysis techniques and links to system implementation must be developed.

Dealing with variability, uncertainty, and life-cycle issues, such as extensibility of a product family, are not well-addressed using available systems engineering methodologies and tools.

The challenge is to address the entire process and not to consider only local solutions of methodology, tools, and models that ease part of the design.

*Contract-based design* has been proposed as a new approach to
the system design problem that is rigorous and effective in dealing
with the problems and challenges described before, and that, at the
same time, does not require a radical change in the way industrial
designers carry out their tasks, as it cuts across design flows of
different types.
Indeed, contracts can be used almost everywhere and at nearly all
stages of system design, from early requirements capture, to embedded
computing infrastructure and detailed design involving circuits and
other hardware. Contracts explicitly handle pairs of properties,
respectively representing the assumptions on the environment and the
guarantees of the system under these assumptions. Intuitively, a
contract is a pair $(A, G)$, where $A$ is the assumption made on the
environment and $G$ is the guarantee offered in return. Our
contributions in this area are organized along two axes:

Mathematical foundations for interfaces and requirements engineering that enable the design of frameworks and tools;

A system engineering framework and associated methodologies and tool sets that focus on system requirements modeling, contract specification, and verification at multiple abstraction layers.

A detailed bibliography on contract and interface theories for embedded system design can be found in . In a nutshell, contract and interface theories fall into two main categories: assume/guarantee (A/G) contracts, and interface theories.

By explicitly relying on the notions
of assumptions and guarantees, A/G-contracts are intuitive, which
makes them appealing for the engineer. In A/G-contracts, assumptions
and guarantees are just properties regarding the behavior of a
component and of its environment. The typical case is when these
properties are formal languages or sets of traces, which includes
the class of safety
properties , , , , . Contract
theories were initially developed as specification formalisms able
to refuse some inputs from the
environment . A/G-contracts were advocated by
the Speeds project . They were further
experimented with in the framework of the CESAR
project , with the additional consideration of
*weak* and *strong* assumptions. This is still a very
active research topic, with several recent contributions dealing
with the timed and
probabilistic viewpoints in system
design, and even mixed-analog circuit design .
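In the trace-based setting, the basic objects and relations of A/G-contract theories can be sketched as follows. This is a simplified rendering of the common core of the cited theories, not a verbatim account of any single one of them:

```latex
% A contract over an alphabet \Sigma of observable events:
%   C = (A, G), with A, G \subseteq \Sigma^{\omega}
%   (A: assumption on the environment, G: guarantee of the component).
% A component M satisfies C, written M \models C, when
%   M \cap A \subseteq G
% i.e. M meets the guarantee whenever the environment meets the assumption.
% Saturation replaces G by G \cup \neg A, which leaves the set of
% satisfying components unchanged. Refinement C' \preceq C
% (C' can safely replace C) then reads:
%   A \subseteq A'  \quad\text{and}\quad  G' \cup \neg A' \subseteq G \cup \neg A
% (weaker assumptions, stronger saturated guarantees).
```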

Interfaces combine assumptions
and guarantees in a single, automata theoretic specification. Most
interface theories are based on Lynch's Input/Output
Automata , . Interface
Automata , , ,
focus primarily on parallel composition and compatibility: Two
interfaces can be composed and are compatible if there is at least
one environment where they can work together. The idea is that the
resulting composition exposes as an interface the needed information
to ensure that incompatible pairs of states cannot be reached. This
can be achieved by using the possibility, for an Interface
Automaton, to refuse selected inputs from the environment in a given
state, which amounts to the implicit assumption that the environment
will never produce any of the refused inputs, when the interface is
in this state. Modal Interfaces
inherit from both Interface Automata and the originally unrelated
notion of Modal Transition
System , , , . Modal
Interfaces are strictly more expressive than Interface Automata by
decoupling the I/O orientation of an event and its deontic
modalities (mandatory, allowed or forbidden). Informally, a
*must* transition is available in every component that realizes
the modal interface, while a *may* transition need not
be. Research on interface theories is still very active. For
instance,
timed , , , , , ,
probabilistic ,
and energy-aware interface theories have
been proposed recently.

Requirements Engineering is one of the major concerns in large systems industries today, particularly in sectors where certification prevails . DOORS files collecting requirements are poorly structured and cannot be considered a formal modeling framework today: they are nothing more than informal documentation enriched with hyperlinks. As examples, a medium-size sub-system may have a few thousand requirements, and the Rafale fighter aircraft has more than 250,000 of them. For the Boeing 787, requirements were still not stable while subcontractors were developing the fly-by-wire subsystem.

We see Contract-Based Design and Interface Theories as innovative tools in support of Requirements Engineering. The Software Engineering community has extensively covered several aspects of Requirements Engineering, in particular:

the development and use of large and rich *ontologies*; and

the use of Model Driven Engineering technology for the structural aspects of requirements and resulting hyperlinks (to tests, documentation, PLM, architecture, and so on).

Behavioral models and properties, however, are not properly encompassed by the above approaches. This is the cause of a remaining gap between this phase of systems design and later phases, where formal model-based methods involving behavior have become prevalent — see the success of the Matlab/Simulink/Scade technologies. We believe that our work on contract-based design and interface theories is best suited to bridge this gap.

Non-standard analysis plays a central role in our research on hybrid systems modeling , , , . The following text provides a brief summary of this theory and gives some hints on its usefulness in the context of hybrid systems modeling. This presentation is based on our paper , a chapter of Simon Bliudze's PhD thesis , and a recent, non-axiomatic presentation of non-standard analysis due to the mathematician Lindstrøm .

Non-standard numbers allowed us to reconsider the semantics of hybrid
systems and propose a radical alternative to the *super-dense
time semantics* developed by Edward Lee and his team as part of the
Ptolemy II project, where cascades of successive instants can occur in
zero time. Our alternative uses an *infinitesimal* time step
$\partial$, whose inverse $1/\partial$ is an infinite
*non-standard integer*. The resulting *non-standard semantics*
provides a framework that is familiar to the computer
scientist and at the same time efficient as a symbolic
abstraction. This makes it an excellent candidate for the development
of provably correct compilation schemes and type systems for hybrid
systems modeling languages.

Non-standard analysis was proposed by Abraham Robinson in the 1960s to allow the explicit manipulation of “infinitesimals” in analysis , , . Robinson's approach is axiomatic; he proposed adding three new axioms to the basic Zermelo-Fraenkel (ZFC) framework. There has been much debate in the mathematical community as to whether it is worth considering non-standard analysis instead of staying with the traditional one. We do not enter this debate. The important thing for us is that non-standard analysis allows the non-standard discretization of continuous dynamics to be used “as if” it were operational.

Not surprisingly, such an idea is quite ancient. Iwasaki et al. first proposed using non-standard analysis to discuss the nature of time in hybrid systems. Bliudze and Krob have also used non-standard analysis as a mathematical support for defining a system theory for hybrid systems. They discuss in detail the notion of “system” and investigate computability issues. The formalization they propose closely follows that of Turing machines, with a memory tape and a control mechanism.

The introduction to non-standard analysis in is very
pleasant and we take the liberty to borrow it. This presentation was
originally due to Lindstrøm, see . Its interest is that it
does not require any fancy axiomatic material, but only a weak form of
the axiom of choice. The proposed construction bears some resemblance
to the construction of $\mathbb{R}$ as the set of equivalence classes
of Cauchy sequences of rationals.

We begin with an intuitive introduction to the construction of the
non-standard reals. The goal is to augment $\mathbb{R}$ with
infinitesimal and infinite numbers.

A first idea is to represent such additional numbers as convergent
sequences of reals. For example, elements infinitesimally close to the
real number zero are the sequences converging to zero, such as
$(1/n)_{n\in\mathbb{N}}$.

Unfortunately, this way of defining the arithmetic operations and the
order pointwise does not work directly. For instance, comparing two
sequences partitions $\mathbb{N}$ into the three sets of indices where
the first sequence is smaller, equal, or greater; one must then decide
that *exactly one of the above sets is important and the
other two can be neglected*. This is achieved by fixing once and for
all a finitely additive positive measure $\mu$ over $2^{\mathbb{N}}$,
with values in $\{0,1\}$, such that $\mu(\mathbb{N})=1$ and $\mu(X)=0$
for every finite set $X$.

Now, once $\mu$ is fixed, two sequences are identified whenever they
coincide on a set of indices of measure $1$, and a sequence is smaller
than another whenever it is pointwise smaller on such a set.

The family $\mathcal{F}$ of subsets of $\mathbb{N}$ of measure $1$ is a
*filter*: the empty set does not belong to $\mathcal{F}$, and
$\mathcal{F}$ is closed under finite intersection and under superset.

Consequently, since for every $X \subseteq \mathbb{N}$ exactly one of
$X$ and its complement has measure $1$, $\mathcal{F}$ is in fact an
*ultra-filter*. At this point we
recall Zorn's lemma, known to be equivalent to the axiom of choice:

**Lemma 1 (Zorn's lemma)**
*Any partially ordered set $(X,\le )$ such that any chain in $X$
possesses an upper bound has a maximal element.*

The filter of cofinite subsets of $\mathbb{N}$ is a *free*
filter, meaning it contains no finite set. Using Zorn's lemma, it can
thus be extended to a free ultra-filter over $\mathbb{N}$, whence:

**Lemma 2**
*Any infinite set has a free ultra-filter.*

Every free ultra-filter $\mathcal{F}$ defines a measure $\mu$ as above, by setting $\mu(X)=1$ if and only if $X\in\mathcal{F}$. Note that $\mu$ is finitely additive, but it is *not* true that $\mu$ is countably additive: $\mathbb{N}$ is a countable union of singletons, each of measure $0$.

Now, fix an infinite set $\mathbb{X}$ and a free ultra-filter $\mathcal{F}$ over $\mathbb{N}$. The non-standard extension ${}^{*}\mathbb{X}$ is the set of sequences of elements of $\mathbb{X}$, two sequences being identified whenever the set of indices on which they coincide belongs to $\mathcal{F}$.

**Lemma 3 (Transfer Principle)**
*Every first order formula is true over ${}^{*}\mathbb{X}$ iff it is true over $\mathbb{X}$.*

The above general construction can simply be applied to
$\mathbb{X}=\mathbb{R}$, yielding the non-standard reals
${}^{*}\mathbb{R}$. Every finite non-standard real $x$ is
infinitesimally close to a unique standard real, called the
*standard part* of $x$ and written $\mathbf{st}(x)$.

To prove this, let $\mathbf{st}(x)$ be the least upper bound of the
set of standard reals $u$ such that $u \le x$; this set is non-empty
and bounded since $x$ is finite, and one checks that
$x - \mathbf{st}(x)$ is infinitesimal.

It is also of interest to apply the general construction
() to $\mathbb{X}=\mathbb{N}$, yielding the
*non-standard natural numbers* ${}^{*}\mathbb{N}$.
The non-standard set ${}^{*}\mathbb{N}\setminus\mathbb{N}$ consists of
the *infinite natural numbers,* which are
equivalence classes of sequences of integers whose essential limit is
$+\infty$.

Any sequence of functions $f_n : \mathbb{X} \rightarrow \mathbb{X}$
defines, by the same quotient construction, a function
${}^{*}\mathbb{X} \rightarrow {}^{*}\mathbb{X}$.
A function obtained in this way is called *internal*.
Properties of and operations on ordinary
functions extend point-wise to internal functions. In particular,
every standard function $f$ has a *non-standard version* ${}^{*}f$,
defined by the constant sequence $(f,f,\dots)$. An internal set is
*hyperfinite* if it is the class of a sequence of finite sets; its
*cardinal* is the non-standard natural number defined by the sequence
of cardinals.

Now, consider an infinite number $N \in {}^{*}\mathbb{N}$ and the
infinitesimal $\partial = 1/N$. The set
$T = \{ t_n = n\partial \mid n = 0, 1, \dots, N \}$ is hyperfinite, of
cardinal $N+1$, and discretizes the interval $[0,1]$ with
infinitesimal step $\partial$.

By definition, if $f$ is a standard function over $[0,1]$, the
internal function ${}^{*}f$ is defined at every point of $T$;

hence the hyperfinite *sum* of ${}^{*}f$ over $T$, weighted by
$\partial$, is well defined: $S = \partial \sum_{n=0}^{N-1} {}^{*}f(t_n)$.

If $f$ is continuous on $[0,1]$, then $S$ is finite.

Now, the standard part of $S$ is the integral of $f$:
$\mathbf{st}(S) = \int_0^1 f(t)\,dt$.

Under the same assumptions, for any standard $t \in [0,1]$,
$\mathbf{st}\big(\partial \sum_{t_n \le t} {}^{*}f(t_n)\big) = \int_0^t f(s)\,ds$.

Now, consider the following ODE:
$\dot{x}(t) = f(x(t))$, $x(0) = x_0$.

Assume it possesses a solution $t \mapsto x(t)$ over $[0,1]$.

The substitution of the hyperfinite time base $T$ for the interval
$[0,1]$ yields the difference equation
${}^{*}x(t_{n+1}) = {}^{*}x(t_n) + \partial \cdot {}^{*}f({}^{*}x(t_n))$,
with ${}^{*}x(t_0) = x_0$.

At every standard instant $t$, the standard part of the solution of
this difference equation coincides with $x(t)$. The difference
equation thus defines a *non-standard operational semantics* for the
ODE; one which depends on the choice of the infinitesimal step
parameter $\partial$, but whose standard part does not, by the
*standardization principle*.
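The non-standard operational semantics of an ODE can be summarized as follows, with notation as in the construction above ($\mathbf{st}$ denotes the standard part):

```latex
% ODE with initial condition:
%   \dot{x}(t) = f(x(t)), \qquad x(0) = x_0
% Non-standard semantics: forward Euler with infinitesimal step
% \partial = 1/N, N an infinite non-standard integer, over the
% hyperfinite time base T = \{ t_n = n\partial \mid 0 \le n \le N \}:
%   x(t_{n+1}) = x(t_n) + \partial \cdot f(x(t_n))
% Standardization: for every standard instant t and any t_n
% infinitesimally close to t, the ODE solution is recovered as
%   x(t) = \mathbf{st}\big( x(t_n) \big),
% independently of the particular choice of \partial.
```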

Mica is an OCaml library developed by Benoît Caillaud, implementing the Modal Interface algebra published in , . The purpose of Modal Interfaces is to provide formal support to contract-based design methods in the field of system engineering. Modal Interfaces enable compositional reasoning methods on I/O reactive systems.

In Mica, systems and interfaces are represented in extension (by explicit enumeration of states and transitions). However, a careful design of the state and event heap enables the definition, composition and analysis of reasonably large systems and interfaces. The heap stores states and events in a hash table and ensures maximal sharing (there is no duplication of structurally equal values). Therefore, complex data-structures for states and events induce a very low overhead, as checking equality is done in constant time.
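The constant-time equality test rests on hash-consing: structurally equal values are allocated once and shared, so equality reduces to a physical (pointer) comparison. Mica itself is written in OCaml; the following Python sketch, with hypothetical names unrelated to Mica's actual API, only illustrates the idea:

```python
# Sketch of a hash-consing heap: structurally equal terms are stored once,
# so structural equality degenerates to identity (constant-time) comparison.
# Hypothetical illustration, not Mica's actual OCaml implementation.

class Heap:
    def __init__(self):
        self._table = {}          # structural key -> unique shared instance

    def make(self, label, children=()):
        # Children are already hash-consed, so their identities determine
        # structural equality of the new node.
        key = (label, tuple(id(c) for c in children))
        if key not in self._table:
            self._table[key] = ("node", label, children)
        return self._table[key]

heap = Heap()
a = heap.make("state", (heap.make("x"), heap.make("y")))
b = heap.make("state", (heap.make("x"), heap.make("y")))
print(a is b)   # True: same structure, hence the same physical object
```

Building the two states traverses their structure once; afterwards, any number of equality checks between them cost a single pointer comparison.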

Thanks to the Inter module and the mica interactive environment, users can define complex systems and interfaces using OCaml syntax. It is even possible to define parameterized components as OCaml functions.

Mica is available as an open-source distribution, under the CeCILL-C
Free Software License Agreement
(http://

Flipflop is a Test and Flip net synthesis tool implementing a linear
algebraic polynomial time algorithm. Computations are done in the

This software has been designed in the context of the S3PM project (see Section ).

Explicit hybrid systems modelers like Simulink / Stateflow allow for
programming both discrete- and continuous-time behaviors with complex
interactions between them. A key issue in their compilation is the
static detection of algebraic or causality loops. Such loops can cause
simulations to deadlock and prevent the generation of statically
scheduled code. We have addressed this issue for a hybrid modeling
language that combines synchronous Lustre-like data-flow equations
with Ordinary Differential Equations
(ODEs) , . We
introduce the operator
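Static loop detection itself reduces to cycle detection in the graph of instantaneous dependencies between equations. The following sketch, using a hypothetical representation of the dependency graph and unrelated to the actual compiler, illustrates the principle:

```python
# Sketch: static detection of causality (algebraic) loops. Each equation
# defines one variable and instantaneously reads others; a cycle in this
# dependency graph means the model cannot be statically scheduled.

def find_causality_loop(deps):
    """deps: dict mapping a variable to the variables it reads instantaneously.
    Returns a list of variables forming a cycle, or None."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {v: WHITE for v in deps}
    stack = []

    def dfs(v):
        color[v] = GREY
        stack.append(v)
        for w in deps.get(v, ()):
            if color.get(w, WHITE) == GREY:          # back edge: loop found
                return stack[stack.index(w):] + [w]
            if color.get(w, WHITE) == WHITE and w in deps:
                loop = dfs(w)
                if loop:
                    return loop
        stack.pop()
        color[v] = BLACK
        return None

    for v in deps:
        if color[v] == WHITE:
            loop = dfs(v)
            if loop:
                return loop
    return None

# y = f(u) and u = g(y), both instantaneous: an algebraic loop.
print(find_causality_loop({"y": ["u"], "u": ["y"]}))  # ['y', 'u', 'y']
```

Integrators break such cycles: an ODE state is read from the previous step, so its incoming dependency is not instantaneous and simply does not appear as an edge here.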

Hybrid systems modelers exhibit a number of difficulties related to the mix of continuous and discrete dynamics and to the sensitivity to the discretization scheme. Modular modeling, where subsystem models can simply be assembled with no rework, calls for using Differential Algebraic Equations (DAEs). In turn, DAEs are strictly more difficult to handle than ODEs: they require sophisticated pre-processing, using various notions of index, before they can be submitted to a solver. We have studied some fundamental issues raised by the modeling and simulation of hybrid systems involving DAEs . The objective of this work is to serve the evolution and design of future releases of the Modelica language for such systems. We focus on the following questions:

What is the proper notion of index for a hybrid DAE system?

What are the primitive statements needed for a DAE hybrid systems modeler?

The differentiation index of a DAE explicitly relies on everything being differentiable; generalizations to hybrid systems must therefore be done with caution. We propose to rely on non-standard analysis for this purpose. Non-standard analysis formalizes differential equations as discrete-step transition systems with an infinitesimal time basis. We can thus bring hybrid DAE systems to their non-standard form, where the notion of difference index can be firmly used. From this study, general hints for future releases of Modelica can be drawn.
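To illustrate why the index matters, consider a classical example from the DAE literature (the pendulum in Cartesian coordinates; this textbook example is given for illustration and is not specific to our work):

```latex
% Pendulum of length L in Cartesian coordinates: an index-3 DAE.
%   \dot{x} = u, \qquad \dot{y} = v
%   \dot{u} = \lambda x, \qquad \dot{v} = \lambda y - g
%   x^2 + y^2 = L^2        % constraint; \lambda: Lagrange multiplier
% The multiplier \lambda never appears differentiated: the constraint
% must be differentiated three times before \dot{\lambda} shows up,
% hence the differentiation index is 3. Numerical solvers require
% index reduction to index at most 1 before integration.
```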

Surgical process modeling aims at providing an explicit representation
of surgical procedural knowledge. *Surgical process models* are
inferred from a set of surgical procedure recordings, and represent in
a concise manner concurrency, causality and conflict relations between
actions. In the context of the S3PM project
(Section ), we have investigated the use of
*test and flip* nets, a mild extension of flip-flop nets, to
represent surgical process models. A test and flip net synthesis
algorithm, based on linear algebraic methods in the

Ayman Aljarbooh's PhD is partially funded by an ARED grant of the Brittany Regional Council.

Benoît Caillaud is participating in the S3PM project of the
CominLabs excellence laboratory.

Program:« Briques génériques du logiciel embarqué » (Embedded Software Generic Building-Blocks)

Project acronym: Sys2soft

Project title: Physics Aware Software

Duration: June 2012 – April 2016

Coordinator: Dassault Systèmes (France)

Other partners: Thales TGS / TRT / TAS, Alstom Transport, Airbus, DPS, Obeo, Soyatec

Abstract: The Sys2soft project aims at developing methods and tools supporting the design of embedded software interacting with a complex physical environment. The project advocates a methodology where both physics and software are co-modeled and co-simulated early in the design process, and embedded code is generated automatically from the joint physics and software models. Extensions of the Modelica language with synchronous programming features are being investigated, as a unified framework where interacting physical and software artifacts can be modeled.

Program: ITEA2

Project acronym: Modrio

Project title: Model Driven Physical Systems Operation

Duration: September 2012 – November 2015

Coordinator: EDF (France)

Other partners: ABB (Sweden), Ampère Laboratory / CNRS (France), Bielefeld University (Germany), Dassault Systèmes (Sweden), Dassault Aviation (France), DLR (Germany), DPS (France), EADS (France), Equa Simulation (Sweden), IFP (France), ITI (Germany), Ilmenau University (Germany), Katholic University of Leuven (Belgium), Knorr-Bremse (Germany), LMS (France and Belgium), Linköping University (Sweden), MathCore (Sweden), Modelon (Sweden), Pöry (Finland), Qtronic (Germany), SICS (Sweden), Scania (Sweden), Semantum (Finland), Sherpa Engineering (France), Siemens (Germany and Sweden), Simpack (Germany), SKF (Sweden), Supmeca (France), Triphase (Belgium), University of Calabria (Italy), VTT (Finland), Vattenfall (Sweden), Wapice (Finland).

Abstract: Modelling and simulation are efficient and widely used tools for system design, but they are seldom used for system operation. However, most functionalities developed for system design would also benefit system operation, provided that they are enhanced to deal with real operating situations. Through open standards, the benefits of sharing compatible information and data become obvious: improved cooperation between the design and operation communities, and easier adaptation of operation procedures with respect to design evolutions. Open standards also foster general-purpose technology. The objective of the ITEA 2 MODRIO project is to extend modelling and simulation tools based on open standards from system design to system operation.

Beyond the Modrio and Sys2soft collaborative projects, we have an
informal but sustained collaboration with the Dassault Systèmes team
developing the Dymola tool, located in Lund, Sweden, and with the DLR
in Munich, Germany, both prominent actors of the Modelica
association. This collaboration has allowed us to have an impact on
the recent evolution of the Modelica language: version 3.3 of the
language integrates several of our contributions related to the
introduction of language constructs inherited from synchronous
programming languages.

Benoît Caillaud has served in the steering and program committees of the International Conference on Application of Concurrency to System Design (ACSD'13) and of the Applications of Regions Theory (ART'13) satellite workshop. He is serving on the Evaluation Committee of Inria.

Benoît Caillaud has contributed to the training programme for
the computer-science option of the *agrégation* in mathematics,
at ENS Cachan-Ker Lann.

PhD in progress Ayman Aljarbooh, *Scalable Simulation of Hybrid Systems: Language Design and Compilation*, started December 2013, supervised by Benoît Caillaud

Benoît Caillaud served on the jury for the defense of
Florent Avellaneda's PhD thesis, *Verification of stateful
Petri-nets under a partial order semantics*, December 10th 2013,
Aix-Marseille University. He has also served on the junior researcher
hiring committee of Inria Sophia Antipolis - Méditerranée.