Section: New Results
Implicit Computational Complexity
Participants: Jean-Yves Marion, Guillaume Bonfante, Romain Péchoux, Walid Gomaa, Emmanuel Hainry.
The goal of implicit computational complexity is to give ontogenetic models of computational complexity. We follow two lines of research. The first, more theoretical, line relates to the ramified recursion theory initiated by Leivant and Marion and to Girard's light linear logic. The second, more practical, line relates to interpretation methods, quasi-interpretations and sup-interpretations, which provide upper bounds on the computational resources needed to execute a program. This approach appears to have practical interest, and we are developing a tool, Crocus, that automatically infers complexity upper bounds for functional programs.
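As a minimal illustration of the interpretation approach (the program, the candidate interpretation, and the size measure below are invented for this sketch and are not taken from Crocus):

```python
# First-order functional program on unary naturals:
#   double(0)   = 0
#   double(S x) = S (S (double x))
def double(n):
    return 0 if n == 0 else 2 + double(n - 1)

# Candidate interpretation of the symbol `double`: the monotone
# polynomial [double](X) = 2*X.  With [0] = 0 and [S](X) = X + 1,
# the interpretation of a term bounds the size of its value.
def qi_double(x):
    return 2 * x

# Check the bound on a sample of inputs (size of a unary natural = its value).
assert all(double(n) <= qi_double(n) for n in range(100))
```

A tool such as Crocus searches for such monotone polynomial assignments automatically; here the bound is merely checked on sample inputs.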
ICC Core
In [14], Guillaume Bonfante and Yves Guiraud study the computational model of polygraphs. They consider polygraphic programs, a subclass of these objects, as a formal description of first-order functional programs. They define their semantics and prove that polygraphic programs form a Turing-complete computational model. Their algebraic structure is exploited by analysis tools, called polygraphic interpretations, for complexity analysis. In particular, they delineate a subclass of polygraphic programs that compute exactly the functions computable in polynomial time.
In [17], Jean-Yves Marion refines predicative analysis, on which the foundations of ICC rest, by means of a ramified variant of Ackermann's construction of a non-primitive recursive function. This yields a hierarchy of functions that characterizes exactly the functions computable in time O(n^k) on a register machine model of computation. For this, a strict ramification principle is introduced. It is then shown how to diagonalize in order to obtain an exponential function and to jump outside the hierarchy. Lastly, a dependently typed lambda-calculus is suggested to represent this construction.
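The ramification discipline can be sketched in ordinary code by annotating arguments with tiers (a toy rendering of tiered recursion in the spirit of Leivant and Marion; the tier annotations are only comments, not an actual type system):

```python
# add : N(1) x N(0) -> N(0) -- recursion descends on the tier-1 argument.
def add(n1, m0):
    return m0 if n1 == 0 else 1 + add(n1 - 1, m0)

# mul : N(1) x N(1) -> N(0) -- the recursive result mul(n1 - 1, m1)
# lives at tier 0 and is only passed to `add` in its tier-0 position;
# it is never itself recursed on, which keeps the definition polynomial.
def mul(n1, m1):
    return 0 if n1 == 0 else add(m1, mul(n1 - 1, m1))

# By contrast, exp(S x) = double(exp(x)) would place the tier-0 result
# exp(x) in a recursion-driving (tier-1) position, which the
# ramification discipline forbids -- exactly the kind of definition a
# strict ramification principle rules out.
```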
Interpretation methods
In [22], Guillaume Bonfante, together with Florian Deloup and Antoine Henrot, reconsiders the use of real numbers in the interpretation of programs. The main issue is that the ordering over the reals is not well-founded; consequently, bounds on the length of computations, and in fact on the size of terms as well, are lost. The contribution is to show that these bounds can be recovered when interpretations are defined by polynomials and the function max. This relies on the Positivstellensatz, a deep result of real algebraic geometry.
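The loss and recovery of bounds can be illustrated with a toy calculation (a hedged illustration of the underlying phenomenon only, not the construction of [22]):

```python
# Over the reals, "strictly decreasing" alone gives no bound: the
# sequence x, x/2, x/4, ... decreases forever yet never reaches 0.
# If instead every rewrite step decreases the interpretation by at
# least some fixed delta > 0, the number of steps is bounded by
# start / delta.
def steps_with_min_decrease(start, delta):
    count, x = 0, start
    while x >= delta:
        x -= delta
        count += 1
    return count

assert steps_with_min_decrease(10.0, 1.0) == 10
```

Roughly speaking, interpretations built from polynomials and max can be shown to enforce such a uniform decrease, and this is where the Positivstellensatz enters.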
Jean-Yves Marion and Romain Péchoux propose the sup-interpretation method as a new tool to control, by static analysis, the memory resources of first-order functional programs with pattern matching [18]. Basically, a sup-interpretation provides an upper bound on the size of function outputs. A criterion, applicable to terminating as well as non-terminating programs, is developed in order to bound the stack-frame size polynomially. Sup-interpretations may also be used in other programming settings, such as object-oriented languages [91].
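A minimal example of what a sup-interpretation asserts (the program and the candidate sup-interpretation are invented for this sketch):

```python
# First-order program with pattern matching:
#   minus(x, 0)     = x
#   minus(0, y)     = 0
#   minus(S x, S y) = minus(x, y)
def minus(n, m):
    return max(n - m, 0)

# Candidate sup-interpretation theta(minus)(X, Y) = X: the output of a
# subtraction is never larger than its first argument.  Unlike a
# classical interpretation, theta need not decrease along recursive
# calls, which is why it also makes sense for non-terminating programs.
def theta_minus(x, y):
    return x

assert all(minus(n, m) <= theta_minus(n, m)
           for n in range(30) for m in range(30))
```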
Recursive analysis
Olivier Bournez, Walid Gomaa and Emmanuel Hainry investigated the notion of implicit complexity in the framework of recursive analysis. In [35], they presented a characterization of the polynomial-time computable real functions, together with a framework for extending classical complexity results to the real field. This is the first implicit characterization of this class of functions, and as such it opens the field of implicit complexity in recursive analysis.
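In recursive analysis, a real function is computed by a machine that turns approximations of the argument into approximations of the result; a hedged sketch for the (hypothetical) example x -> x^2 on [0, 1]:

```python
from fractions import Fraction

# x_approx(m) must return a rational within 2^-m of the real x.
# Since |a^2 - x^2| = |a - x| * |a + x| <= 2^-(n+2) * 3 < 2^-n for
# x in [0, 1] and a within 1 of x, querying the oracle at precision
# n + 2 suffices to answer at precision n.
def square_approx(x_approx, n):
    a = x_approx(n + 2)
    return a * a

# Usage: an oracle for x = 1/3 by dyadic rounding.
oracle = lambda m: Fraction(round(Fraction(1, 3) * 2**m), 2**m)
assert abs(square_approx(oracle, 10) - Fraction(1, 9)) <= Fraction(1, 2**10)
```

Polynomial-time computability then additionally requires that the 2^-n approximation be produced in time polynomial in n.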