## Section: New Results

### Improvement of theoretical foundations

#### Term and graph strategic rewriting

Participants : Emilie Balland, Tony Bourdier, Horatiu Cirstea, Hélène Kirchner.

From our previous work on biochemical applications, port graphs and a rewriting calculus have proved to be well-suited formalisms for modeling interactions between proteins. Port graphs are graphs with multiple edges and loops, whose nodes have explicit connection points, called ports, to which edges attach. In [36], port graphs have been proposed as a formal model for distributed resources and grid infrastructures, where each resource is modeled by a node with ports. The lack of global information and the autonomous and distributed behavior of components are modeled by a multiset of port graphs and rewrite rules which are applied locally, concurrently, and non-deterministically. Some computations take place wherever possible and in parallel, while others may be controlled by strategies. In [18], we then define an abstract biochemical calculus that instantiates to a rewrite calculus on these graphs. Rules and strategies are themselves port graphs, i.e. first-class objects of the calculus. As a consequence, they can be rewritten as well, and rules can create new rules, providing a way of modeling adaptive systems. This approach also provides a formal framework to reason about computations and to verify useful properties. We show how structural properties of a modeled system can be expressed as strategies and checked for satisfiability at each step of the computation. This provides a way to ensure invariant properties of a system. This work is a contribution to the formal specification and verification of adaptive systems and to the theoretical foundations of autonomic computing.
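To make the port-graph model concrete, the following is a minimal sketch in Python; the class names, port names, and the `bind` rule are invented for illustration and are not those of [36] or [18]. Nodes carry named ports, edges attach to (node, port) pairs, and a rule is applied locally at one matching position.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    ident: int
    label: str
    ports: tuple                                # e.g. ("x",): named connection points

@dataclass
class PortGraph:
    nodes: dict = field(default_factory=dict)   # ident -> Node
    edges: set = field(default_factory=set)     # {frozenset({(ident, port), (ident, port)})}

    def add_node(self, n):
        self.nodes[n.ident] = n

    def connect(self, a, b):                    # a, b are (ident, port) pairs
        self.edges.add(frozenset((a, b)))

# Two "proteins" with one binding site each.
g = PortGraph()
g.add_node(Node(1, "A", ("x",)))
g.add_node(Node(2, "B", ("y",)))

# A local rewrite rule "bind": if an A.x port and a B.y port are both free,
# connect them -- applied at one position only, modeling local application.
def bind(graph):
    used = {p for e in graph.edges for p in e}
    for n in graph.nodes.values():
        for m in graph.nodes.values():
            if n.label == "A" and m.label == "B":
                a, b = (n.ident, "x"), (m.ident, "y")
                if a not in used and b not in used:
                    graph.connect(a, b)
                    return True                 # rule applied once, locally
    return False

print(bind(g), len(g.edges))  # True 1
```

A multiset of such graphs with rules applied concurrently and non-deterministically then gives the distributed behavior described above; a strategy would restrict which of the matching positions may be used.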

Term-graph rewriting is an extension of term rewriting that deals with terms containing cycles and shared subterms. Based on a formalization of paths and referenced terms, we defined referenced term rewriting [10] as a simulation of term-graph rewriting. Since this simulation is based entirely on standard first-order terms, the main interest of this approach is to provide a safe and efficient way to represent and transform term-graphs in a purely term-rewriting-based language.

In [57], we introduce the notion of abstract strategies for abstract reduction systems. Adequate properties of termination, confluence and normalization under strategy can then be defined. Thanks to this abstract concept, we draw a parallel between strategies for computation and strategies for deduction. We define deduction rules as rewrite rules, a deduction step as a rewriting step and a proof construction step as a narrowing step in an adequate abstract reduction system. Computation, deduction and proof search are thus captured in the uniform foundational concept of abstract reduction system, in which abstract strategies have a clear formalization.

In the continuation of this work, we proposed in [20] an alternative definition of abstract reduction systems and introduced a general definition of abstract strategies which is extensional, in the sense that a strategy is defined explicitly as a set of derivations of an abstract reduction system. We also introduce a more intensional definition, supporting the abstract view but more operational in the sense that it describes a means of determining such a set. We characterize the class of extensional strategies that can be defined intensionally. We also give some hints towards a logical characterization of intensional strategies.
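The extensional/intensional distinction can be illustrated on a toy abstract reduction system; the relation and the strategy below are invented for illustration and do not come from [20].

```python
# A toy ARS on {a, b, c, d}, given as a step relation.
steps = {("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")}

def derivations(start, depth):
    """All derivations (sequences of objects) from `start`, up to `depth` steps."""
    out = [[start]]
    for d in out:                       # appended items are processed too
        if len(d) <= depth:
            for (x, y) in steps:
                if x == d[-1]:
                    out.append(d + [y])
    return out

# Extensional view: a strategy is explicitly a set of derivations.
smallest_first = {("a", "b", "d")}

# Intensional view: a choice function that *generates* that set step by step.
def pick(obj):
    succ = sorted(y for (x, y) in steps if x == obj)
    return succ[0] if succ else None    # always take the smallest successor

d, x = ["a"], "a"
while pick(x) is not None:
    x = pick(x)
    d.append(x)
print(tuple(d))  # ('a', 'b', 'd')
```

Here the intensional strategy `pick` determines exactly the extensional set `smallest_first`; the characterization result mentioned above concerns which sets of derivations admit such an operational description.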

#### Algebraic and topological properties of rewriting systems

Participant : Yves Guiraud.

The property of *finite derivation type* is a homotopical property of rewriting systems [70]. Intuitively, when a rewriting system has finite derivation type, there are only finitely many non-trivial choices one can make in any given computation. A family of such elementary choices is a *homotopy basis* of the rewriting system. In a joint work with Philippe Malbos (Université Lyon 1), we have studied this property for presentations of n-categories by polygraphs.

We have recovered Squier's results on presentations of monoids by word rewriting systems. We have proved, with the counter-example given in Figure 3, that the existence of a finite convergent presentation is not sufficient to guarantee that an n-category has finite derivation type, starting with n = 2.

However, we have identified an extra condition, *finite indexation*, and proved that a 2-category with a presentation by a finite, convergent and finitely indexed 3-polygraph has finite derivation type. Usual 3-polygraphs have this property: in particular, the canonical translation of a left-linear term rewriting system into a 3-polygraph is always finitely indexed; as a consequence, an equational theory must have finite derivation type to admit a presentation by a first-order functional program. Finally, this work gives a general setting to express and prove coherence theorems, such as Mac Lane's theorem for monoidal categories [62], by using simple rewriting methods. This work has been published in [15].

We have then generalized the notion of *identities among relations* to polygraphs, giving an algebraic interpretation of the previous results. We have proved that the elements of a homotopy basis of a polygraph yield a generating set for its natural system of identities among relations. As a consequence, if a polygraph has finite derivation type, then its identities among relations are finitely generated. This work is contained in [30] and has been presented at the Workshop on Computer Algebra Methods and Commutativity of Algebraic Diagrams, held in Toulouse.

We are currently working on a generalization of Fox differential calculus for n -categories. The objective is to use polygraphs with good computational properties to build small resolutions of n -categories, allowing one to concretely compute the (co)homology groups of a given n -category. This work fits into the project of studying complexity classes of programs, characterized in terms of derivations of 2-categories, by means of cohomological methods, since the first cohomology group of the canonical resolution of any usual algebraic object classifies its derivations.

#### Mechanized deduction

Participants : Paul Brauner, Guillaume Burel, Clément Houtmann, Claude Kirchner, Hélène Kirchner, Cody Roux.

Higher-order rewrite systems are useful abstractions for modeling both the operational semantics of programming languages and equational reasoning in certain theories. The termination of these systems is very useful for proving correctness of programs, finding decision procedures, and proving the consistency of certain theorem proving systems based on type theory.

A well-studied method of proving termination is *size-based termination*, which allows comparison of the sizes of arguments in recursive calls by typing. However, the semantics of such systems is unclear, as is their relation to other termination techniques. In [19], together with Frederic Blanqui, we show that a certain kind of algebraic semantics can be given to such typing systems, and that termination can indeed be shown as a special case of *higher-order semantic labelling*, combined with a simple precedence argument. This gives a uniform termination proof for a large number of possible size-based termination systems.
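The intuition behind size-based termination can be seen on the standard minus/div example, shown here in plain Python with the size reasoning written as comments; the systems studied in [19] track this information in types, and the sized-type annotations mentioned below follow the usual convention of the literature rather than any specific system of that paper.

```python
# minus on Peano-style naturals: a sized type would record that the result
# is no larger than the first argument (roughly minus : Nat^a -> Nat -> Nat^a).
def minus(x, y):
    if y == 0:
        return x
    if x == 0:
        return 0
    return minus(x - 1, y - 1)

# div(x, y) computes the ceiling of x / (y + 1) on naturals. The recursive
# call is on minus(x - 1, y), which is NOT a syntactic subterm of x, so a
# purely structural guard rejects it; a size-based system accepts it because
# the size bound on minus gives size(minus(x - 1, y)) <= x - 1 < x.
def div(x, y):
    if x == 0:
        return 0
    return 1 + div(minus(x - 1, y), y)

print(div(7, 2))  # 3 (= ceiling of 7/3)
```

The semantic-labelling view of [19] justifies exactly this kind of reasoning: the size information becomes a semantic label on the recursive calls, and a simple precedence then orients them decreasingly.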

We have investigated how, starting from an axiomatic presentation of a theory, it is possible to present it by means of rewrite rules so that it can be used in deduction modulo. To ensure good proof-theoretical properties as well as the completeness of proof-search procedures based on deduction modulo, the resulting rewrite system must be such that the cut rule is admissible in deduction modulo. In classical logic, this can always be done, first by transforming the axioms into a rewrite system and then by completing this rewrite system using a Knuth-Bendix-like procedure to recover cut admissibility [13]. However, in intuitionistic logic, there are theories which cannot be transformed into a rewrite system with cut admissibility. We show that it is even undecidable to know whether this is possible. Nonetheless, by interleaving the transformation of axioms into rewrite rules with the cut-admissibility-recovering completion, we can propose a (non-terminating, possibly failing) procedure that works on a large class of theories [22].
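The first step, orienting an axiom into a proposition rewrite rule used modulo, can be sketched as follows. The formula encoding and the rule (unfolding an "even" predicate into its definition) are invented for illustration and are not taken from [13] or [22].

```python
# Formulas as nested tuples: ("and", f, g), ("even", n), etc.
# One oriented axiom:  even(n)  -->  exists k. n = double(k)
def rewrite(f):
    if isinstance(f, tuple) and f[0] == "even":
        return ("exists", "k", ("eq", f[1], ("double", "k")))
    if isinstance(f, tuple):
        # Recurse into subformulas, leaving atoms (strings) unchanged.
        return tuple(rewrite(a) if isinstance(a, tuple) else a for a in f)
    return f

goal = ("and", ("even", "n"), ("p", "n"))
print(rewrite(goal))
# ('and', ('exists', 'k', ('eq', 'n', ('double', 'k'))), ('p', 'n'))
```

In deduction modulo, such rewriting is performed silently during proof search instead of invoking the axiom; the completion discussed above adds further rules until cuts on rewritten propositions become admissible.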

These results show how deduction modulo can be used to obtain better automation in proofs. More generally, we investigated how deduction modulo and superdeduction provide better proofs, by studying three simplicity criteria: cut admissibility, proof length, and expressiveness [11].

In [17], we present an original narrowing-based proof-search method for inductive theorems in equational rewrite theories given by a rewrite system and a set E of equalities. Its specificity is to be grounded in deduction modulo and to rely on narrowing to provide both induction variables and instantiation schemas. Whenever the equational rewrite system has good properties of termination and sufficient completeness, and when E is constructor- and variable-preserving, narrowing at defined-innermost positions leads to considering only unifiers which are constructor substitutions. This is especially interesting for associative and associative-commutative theories, for which the general proof-search system is refined. The method is shown to be sound and refutationally correct and complete. A major feature of our approach is to provide a constructive proof in deduction modulo for each successful instance of the proof-search procedure.
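How narrowing provides both induction variables and instantiation schemas can be sketched on Peano addition. This is a toy root-narrowing step only; the actual procedure of [17] narrows at defined-innermost positions and works modulo E.

```python
# Terms: bare strings are variables; applications are ("f", arg, ...).
# Rule variables are assumed disjoint from goal variables.

def walk(t, sub):
    while isinstance(t, str) and t in sub:
        t = sub[t]
    return t

def unify(s, t, sub):
    """Syntactic unification (no occurs check; enough for this sketch)."""
    s, t = walk(s, sub), walk(t, sub)
    if s == t:
        return sub
    if isinstance(s, str):
        return {**sub, s: t}
    if isinstance(t, str):
        return {**sub, t: s}
    if s[0] != t[0] or len(s) != len(t):
        return None
    for a, b in zip(s[1:], t[1:]):
        sub = unify(a, b, sub)
        if sub is None:
            return None
    return sub

def apply(t, sub):
    t = walk(t, sub)
    return t if isinstance(t, str) else tuple([t[0]] + [apply(a, sub) for a in t[1:]])

rules = [(("plus", ("0",), "Y"), "Y"),
         (("plus", ("s", "X"), "Y"), ("s", ("plus", "X", "Y")))]

def narrow(goal):
    """All one-step root narrowings of `goal`: pairs (substitution, result)."""
    out = []
    for lhs, rhs in rules:
        sub = unify(goal, lhs, {})
        if sub is not None:
            out.append((sub, apply(rhs, sub)))
    return out

# Narrowing plus(N, s(0)) splits on the induction variable N:
#   N = 0     gives s(0)
#   N = s(X)  gives s(plus(X, s(0)))
for sub, res in narrow(("plus", "N", ("s", ("0",)))):
    print(res)
```

The two unifiers are exactly the constructor substitutions instantiating N, i.e. the case analysis of an induction proof, with the narrowed terms as the corresponding proof obligations.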

In [31], we describe a presentation of sequents in a two-dimensional space, as well as a presentation of proofnets and sequent calculus derivations in a three-dimensional space. These renderings have interesting geometrical properties: sequent occurrences appear as parallel segments in the case of three-dimensional sequent calculus derivations, and the De Morgan duality is expressed by the fact that negation corresponds to a ninety-degree rotation in the case of two-dimensional sequents and three-dimensional proofnets.