Section: Scientific Foundations
Keywords: Compilation, synthesis, formal verification.
Transformations and analysis
The fact that syntactic constructs in synchronous languages are fully defined in terms of corresponding semantic operations immediately pays off: all kinds of compilation/synthesis, analysis and verification, or optimization methods can readily be characterized as formal transformations of mathematical models. In this sense we realized, well in advance, the announced programme of Model-Driven Architecture, in our case with truly meaningful algorithmic transformations. To this end we extensively use a number of well-defined mathematical models, such as (hierarchical) finite-state Mealy machines, synchronous circuits as Boolean gate netlists, or data/control flowgraphs. The various steps of compilation (or static analysis, or dynamic analysis seen as model-checking, or optimization at each model level) are always formally defined, and thus correct by construction, discarding altogether the issues of ``synthesizable subsets'', or of discrepancies between simulation and implementation, that limit the application of syntactic formalisms without a clear, unambiguous semantics in the field.
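To make the role of such mathematical models concrete, here is a minimal sketch (in Python, with illustrative names and an illustrative example machine, not taken from any actual compiler) of a finite-state Mealy machine whose transition and output functions are explicit data, so that compilation and analysis steps can be phrased as transformations of this structure:

```python
# A minimal Mealy machine: transitions and outputs are explicit data,
# so analyses and optimizations become transformations of this table.
# Names and the example machine are illustrative only.

class Mealy:
    def __init__(self, transitions, initial):
        # transitions: {(state, input): (next_state, output)}
        self.transitions = transitions
        self.state = initial

    def react(self, inp):
        """One synchronous reaction: consume an input, emit an output."""
        self.state, out = self.transitions[(self.state, inp)]
        return out

# Example: a machine emitting "O" on every second "I".
m = Mealy({("s0", "I"): ("s1", None),
           ("s1", "I"): ("s0", "O")}, "s0")
outs = [m.react("I") for _ in range(4)]
print(outs)  # [None, 'O', None, 'O']
```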
Synchronous languages raise specific issues regarding the efficient compilation of programs onto various targets (when producing hardware descriptions, one usually speaks of ``synthesis'' rather than compilation). This is due to the strong demand for semantic preservation across models. The case of compilation onto distributed embedded architectures, where compilation requires an architecture model description and sophisticated mapping/scheduling techniques, is further described as part of the AAA methodology.
Concerning Esterel/SyncCharts, compilation was first realized in the 1980s as an expansion into flat global Mealy FSMs; this produces efficient code, but often of unduly large size. Then in the 1990s a translation was defined into Boolean equation systems (BES), with Boolean register memories encoding the active control points. While such models are known in the hardware design community as Boolean gate netlists, they can be used in our context for software code production. Here the code produced is quasi-linear in size (worst-case quadratic in rare cases), but execution consists in a linear evaluation of the whole equation system, so that each reaction requires an execution time proportional to the whole program, even when only a small fragment is truly active. Thus in the early 2000s new implementation frameworks were devised, relying on high-level control/data flowgraphs that select the active parts before execution at each instant. This scheme is both fast and memory-efficient, but cannot cope with all programs, as the full constructive causality analysis underlying the synchronous assumption cannot then be performed at ``compile time'', and this check is of the utmost importance for program correctness. Correctness is in this context ensured by a stronger, more restrictive acyclicity criterion, which provides a static evaluation order for signal propagation.
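The software execution of such a Boolean equation system can be sketched as follows (a hand-written toy system in Python, not generated from any real program): registers encode the active control points, and each reaction evaluates every equation once, in a fixed statically computed order, which is why the cost of a reaction is linear in program size whatever fragment is actually active.

```python
# Sketch of software execution of a Boolean equation system (netlist):
# registers encode active control points; one reaction evaluates every
# equation in a fixed (statically computed) dependency order.
# The tiny system below is illustrative only.

def reaction(regs, inputs):
    w = {}
    # Equations, already in dependency (topological) order:
    w["go"]   = inputs["start"] or regs["active"]
    w["done"] = regs["active"] and inputs["stop"]
    w["out"]  = w["go"] and not w["done"]
    # Next register values (latched for the next instant):
    next_regs = {"active": w["go"] and not w["done"]}
    return w["out"], next_regs

regs = {"active": False}
out1, regs = reaction(regs, {"start": True, "stop": False})
out2, regs = reaction(regs, {"start": False, "stop": False})
out3, regs = reaction(regs, {"start": False, "stop": True})
print(out1, out2, out3)  # True True False
```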
Some of the specific issues of Esterel/SyncCharts compilation are:
- the potential existence of instantaneous loops, diverging so that the instant never reaches completion (violating the ``non-Zeno'' requirement that every reaction terminates);
- schizophrenia problems, where signals and other variables may assume several distinct values, and be instantiated several times, within the same reaction. This calls for reincarnation of program fragments for efficient implementation (using static single assignment techniques);
- constructive causality issues, when a static order for signal propagation cannot be defined, as this order may vary over time. This may call for the symbolic computation of the reachable state space (RSS) covering all potential control configurations, in order to establish the computability of behaviors.
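The static acyclicity criterion mentioned above can be sketched as a topological sort of the signal-dependency graph: when the sort succeeds it yields an evaluation order, and when it fails the program contains a static cycle that only the full constructive analysis could possibly resolve. The following Python fragment (with illustrative dependency graphs) is a sketch under that reading, not an excerpt from a real compiler:

```python
# Sketch of the static acyclicity criterion: order signal equations by
# dependency; a cycle means no static evaluation order exists, and the
# program would need the stronger constructive causality analysis.

def evaluation_order(deps):
    """deps: {signal: set of signals it reads}. Returns a topological
    order, or None when the dependency graph is cyclic."""
    order, seen, visiting = [], set(), set()

    def visit(s):
        if s in seen:
            return True
        if s in visiting:          # back edge: static cycle
            return False
        visiting.add(s)
        for d in deps.get(s, ()):
            if not visit(d):
                return False
        visiting.discard(s)
        seen.add(s)
        order.append(s)
        return True

    for s in deps:
        if not visit(s):
            return None
    return order

print(evaluation_order({"b": {"a"}, "c": {"a", "b"}, "a": set()}))
print(evaluation_order({"a": {"b"}, "b": {"a"}}))  # cyclic -> None
```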
In fact these issues are sometimes less specific than one might think at first glance, and the interplay between our specialized techniques and more general compilation methods is a subject of current studies. This is useful in helping synchronous languages shed their false ``niche topic'' image. In particular, static analysis techniques are gaining momentum in the justification of several compilation steps for Esterel, allowing compiler specifications to be derived smoothly from (extended) formal semantics, thus paving the way to true mathematical compiler certification.
One advantage of compilation schemes that transform programs from formal models to formal models is that one can benefit from (and sometimes contribute to) a rich body of optimization and analysis techniques developed for these models. Circuit and FSM optimization/minimization, equivalence-checking, and temporal logic model-checking are instances of this.
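As a small illustration of one such model-level technique, equivalence-checking of two Mealy machines can be sketched as a reachability sweep over their product, verifying that outputs agree in every reachable joint state. This explicit-state Python sketch (with made-up toy machines) is for illustration only; practical checkers work symbolically on much larger state spaces:

```python
# Sketch of FSM equivalence-checking: explore the product of two Mealy
# machines (given as {(state, input): (next, output)} tables) and check
# that outputs agree on every reachable joint state.

def equivalent(t1, s1, t2, s2, inputs):
    frontier, seen = [(s1, s2)], {(s1, s2)}
    while frontier:
        p, q = frontier.pop()
        for i in inputs:
            np, op = t1[(p, i)]
            nq, oq = t2[(q, i)]
            if op != oq:               # counterexample found
                return False
            if (np, nq) not in seen:
                seen.add((np, nq))
                frontier.append((np, nq))
    return True

# Two encodings of "emit 1 on every second tick":
A = {("s0", "t"): ("s1", 0), ("s1", "t"): ("s0", 1)}
B = {("q0", "t"): ("q1", 0), ("q1", "t"): ("q0", 1)}
C = {("r0", "t"): ("r0", 1)}          # always emits 1: not equivalent
print(equivalent(A, "s0", B, "q0", ["t"]))  # True
print(equivalent(A, "s0", C, "r0", ["t"]))  # False
```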
Dynamic analysis, automatic verification and model-checking
Formal finite state semantics paves the way to model-checking, which has reached its largest success in synchronous hardware verification. Here the most famous tool is SMV, a symbolic BDD-based model-checker developed at CMU (E. Clarke, K. McMillan), although a number of other similar tools exist, both in the public domain and as industrial proprietary versions. Recently a standardization of temporal property syntax, named Sugar, was achieved at the international level. In the past, dedicated model-checkers were developed for all synchronous languages: Xeve for Esterel, Lesar/NBAC for Lustre, and Sigali for Signal. In all cases they are symbolic BDD-based model-checkers that avoid temporal logic formalisms by stating generic properties and adding synchronous parallel processes to test and monitor the observed systems, so that large classes of properties can be reduced to generic ones on the combined systems. In the last decade, model-checking techniques, mostly based on symbolic state space constructions, were extended to cover problems other than the mere satisfaction of properties. In Esterel they were used to enforce ``state-conscious'' constructive causality, and to design more efficient (but costly) optimization techniques on circuit descriptions. In Lustre, and then Esterel, they were used for automatic test pattern generation with the aim of covering specified portions of the state space (those techniques, using explicit state spaces, actually originated in the context of asynchronous languages, for instance in the work of the INRIA Pampa team). In Signal they were extended to the problem of controller synthesis, in which a synchronous parallel supervisor is algorithmically built to keep the observed system from reaching pathological or forbidden states.
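The state space constructions underlying all these techniques share one core operation: building the reachable state space as a fixpoint of the transition relation, then checking that no forbidden state appears in it. The following Python sketch uses a made-up toy system and represents state sets explicitly; the symbolic tools named above represent them as BDDs instead:

```python
# Sketch of reachable-state-space (RSS) construction as a fixpoint:
# iterate the transition relation from the initial states until no new
# state appears, then check that no forbidden state is reachable.
# The transition relation below is an illustrative toy.

def reachable(init, step):
    rss = set(init)
    frontier = set(init)
    while frontier:                      # fixpoint iteration
        frontier = {t for s in frontier for t in step(s)} - rss
        rss |= frontier
    return rss

# Toy system: a 3-bit counter that wraps at 6 (states 6 and 7 unused).
step = lambda s: {(s + 1) % 6}
rss = reachable({0}, step)
forbidden = {6, 7}
print(sorted(rss))                  # [0, 1, 2, 3, 4, 5]
print(rss.isdisjoint(forbidden))    # True: the property holds
```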
Recently, new techniques for ``bounded'' model-checking using efficient SAT-solvers were introduced (SAT being the well-known SATisfiability problem for propositional logic). They sacrifice completeness (one must know ``up to what depth'' the property is meant to hold, or provide, non-automatically, some sort of recursive invariant assumption to be proved) for the sake of efficiency. In fact, the economic pressure of hardware verification led to new advances in the algorithmic heuristics involved in SAT-solving methods, with the so-called Stålmarck method in Sweden (by Prover Technology), and the iterative learning methods implemented in Chaff (Malik, Princeton University), BerkMin, and their predecessors.
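The bounded scheme can be sketched as follows: unroll the transition relation up to a depth k and ask whether some input sequence reaches a bad state within the bound. Real bounded model-checkers encode this question as a single propositional formula handed to a SAT solver; this illustrative Python version (with a made-up toy counter system) simply enumerates the input sequences instead, which conveys the unrolling but not the solver technology:

```python
# Sketch of bounded model-checking: unroll the transition relation up
# to k steps and search for an input sequence reaching a bad state.
# Real BMC builds one SAT formula per depth; this toy enumerates inputs.

from itertools import product

def bmc(init, step, bad, inputs, k):
    """Return a counterexample input trace of length <= k, or None."""
    for depth in range(k + 1):
        for trace in product(inputs, repeat=depth):
            s = init
            for i in trace:
                s = step(s, i)
            if bad(s):
                return list(trace)   # witness: property fails
    return None                      # no violation up to the bound

# Toy system: counter that increments on 1, resets on 0; bad if >= 3.
step = lambda s, i: s + 1 if i else 0
print(bmc(0, step, lambda s: s >= 3, [0, 1], k=4))  # [1, 1, 1]
```

Note how the verdict is only valid ``up to the bound'': with k=2 the same property would be reported as holding, which is exactly the incompleteness traded for efficiency.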