The general orientation of our team is described by the short name given to it:
*Special Functions*, that is, particular mathematical functions that have
established names due to their importance in mathematical analysis, physics, and
other application domains. Indeed, our ambition is to study special functions
with the computer, by the combined means of computer algebra and formal methods.

Computer-algebra systems have been advertised for decades as software
for “doing mathematics by computer”. For
instance, computer-algebra libraries can uniformly generate a corpus
of mathematical properties about special functions, so as to display
them on an interactive website. This possibility was recently demonstrated by the
computer-algebra component of the team. Such
automated generation significantly increases the reliability of the
mathematical corpus, in comparison to the content of existing static
authoritative handbooks. The importance of the validity of these
contents can be measured by the very wide audience that such handbooks
have had, to the point that one such book remains among the most cited
mathematical publications ever and has motivated the 10-year-long
project of writing its successor.
However, can the mathematics produced “by computer” be considered as
*true* mathematics? More specifically, whereas it is nowadays
well established that the computer helps in discovering and observing
new mathematical phenomena, can the mathematical statements produced
with the aid of the computer and the mathematical results computed by
it be accepted as valid mathematics, that is, as having the status of
mathematical *proofs*?
Beyond the reported weaknesses or
controversial design choices of mainstream computer-algebra systems,
the issue is more of an epistemological nature. It would not be solved
even by the advent of the ultimate computer-algebra system:
the social process of peer-reviewing just falls short of evaluating
the results produced by computers, as reported by
Th. Hales after the publication of his proof of
the Kepler Conjecture about sphere packing.

A natural answer to this deadlock is to move to an alternative kind of
mathematical software and to use a proof assistant to check the
correctness of the desired properties or formulas. The success
of large-scale formalization projects, like the Four-Color Theorem of
graph theory, the above-mentioned Kepler Conjecture, and the Odd Order
Theorem of group theory, demonstrates that this approach is viable even
for deep and voluminous mathematical developments.

The Dynamic Dictionary of Mathematical Functions (DDMF) is such an interactive website: it presents tables of mathematical formulas on elementary and special functions that are generated automatically by computer-algebra routines.

The formal-proofs component of the team emanates from another project
of the MSR–Inria Joint Centre, namely the Mathematical Components
project (MathComp), which developed broadly reusable Coq libraries in the
course of formalizing the Odd Order Theorem.

The present team benefits from these recent advances to explore the formal certification of the results collected in DDMF. The aim of this project is to concentrate the formalization effort on this delimited area, building on DDMF and the Algolib library, as well as on the Coq system and on the libraries developed by the MathComp project.

The following few opinions on computer algebra are, we believe, typical of computer-algebra users' doubts and difficulties when using computer-algebra systems:

Fredrik Johansson, expert in the multi-precision numerical evaluation
of special functions and in fast computer-algebra algorithms, writes
on his blog: “Mathematica is great for
cross-checking numerical values, but it's not unusual to run into
bugs, so *triple checking is a good habit*.” One answer in the
discussion is: “We can claim that Mathematica has [...] *an
impossible to understand semantics*: If Mathematica's output is
wrong then change the input. If you don't like the answer, change the
question. That seems to be the philosophy behind.”

Jacques Carette, former head of the maths group at Maplesoft, about a
bug when asking Maple to take the limit
`limit(f(n) * exp(-n), n = infinity)` for an undetermined
function `f`: “The problem is that there is an *implicit
assumption in the implementation* that unknown functions do not
‘grow too fast’.”

As explained by the expert views above, complaints by computer-algebra users often stem from a misunderstanding of what a computer-algebra system is, namely a purely syntactic tool for calculations, which the user must complement with a semantics. Still, the robustness and consistency of computer-algebra systems are not ensured as of today, and, whatever Zeilberger may provocatively say in his Opinion 94, a firmer logical foundation is necessary. Indeed, many “bugs” in a computer-algebra system cannot be fixed by the usual debugging method of tracking down the faulty lines in the code. This is so “by design”: assumptions that too often remain implicit are really needed by the design of symbolic algorithms, and cannot easily be expressed in the programming languages used in computer algebra. A similar certification initiative has already been undertaken, successfully, in the domain of numerical computing. It is natural to undertake the same approach for computer algebra.
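As a generic illustration of such an implicit assumption (a textbook example, not tied to any particular system), consider the eager rewriting

$$\frac{x^2 - 1}{x - 1} \;=\; x + 1 .$$

It is unconditionally valid as an identity between rational functions, but as an identity between *functions* it silently assumes $x \neq 1$; making such side conditions explicit is precisely what a logical foundation forces upon us.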

Some of the mathematical objects that interest our team are still totally untouched by formalization. When implementing them and their theory inside a proof assistant, we have to deal with the pervasive discrepancy between the published literature and the actual implementation of computer-algebra algorithms. Interestingly, this forces us to clarify our computer-algebraic view on them, and may even reveal holes lurking in published (human) proofs. We are therefore convinced that the close interaction of researchers from both fields, which is what we strive to maintain in this team, is a strong asset.

For a concrete example, the core of Zeilberger's creative telescoping manipulates rational functions up to simplifications. In summation applications, checking that these simplifications do not hide problematic divisions by 0 is most often left to the reader. In the same vein, in the case of integrals, the published algorithms do not check the convergence of all integrals, especially in intermediate calculations. Such checks are again left to the reader. In general, we expect to revisit the existing algorithms to ensure that they are meaningful for genuine mathematical sequences or functions, and not only for algebraic idealizations.
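As a worked toy case of this issue, take $F(n,k) = \binom{n}{k}$. Creative telescoping produces the telescoper–certificate pair

$$F(n+1,k) - 2\,F(n,k) \;=\; G(n,k+1) - G(n,k), \qquad G(n,k) = -\frac{k}{n-k+1}\,F(n,k),$$

and summing over $k$ yields $f(n+1) = 2 f(n)$ for $f(n) = \sum_k \binom{n}{k}$, whence $f(n) = 2^n$. Note, however, that the denominator $n-k+1$ of the certificate vanishes at $k = n+1$, right at the edge of the summation range; checking that the telescoping identity can nevertheless be summed safely is exactly the kind of verification that published proofs leave to the reader.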

Another big challenge in this project originates in
the scientific difference between computer algebra and formal proofs.
Computer algebra seeks speed of calculation on *concrete
instances* of algebraic data structures (polynomials, matrices,
etc.). For their part, formal proofs manipulate
symbolic expressions in terms of *abstract variables*
understood to represent generic elements of algebraic data
structures. In view of this, a continuous challenge is
to develop the right, hybrid thinking attitude that is able to
effectively manage concrete and abstract values simultaneously,
alternately computing and proving with them.

Applications in combinatorics and mathematical physics frequently involve equations of such high order and such large size that computing or even storing all their coefficients is impossible on existing computers. Making this tractable is an extraordinary challenge. The approach we believe in is to design algorithms of good (ideally quasi-optimal) complexity in order to extract precisely the required data from the equations, while avoiding the computationally intractable task of completely expanding them into an explicit representation.

Typical applications with expected high impact are the automatic discovery and algorithmic proof of results in combinatorics and mathematical physics for which human proofs are currently unattainable.

The implementation of certified symbolic computations on special functions in the Coq proof assistant requires both investigating new formalization techniques and renewing the traditional computer-algebra viewpoint on these standard objects. Large mathematical objects typical of computer algebra occur during formalization, which also requires us to improve the efficiency and ergonomics of Coq. In order to feed this interdisciplinary activity with new motivating problems, we additionally pursue a research activity oriented towards experimental mathematics in application domains that involve special functions. We expect these applications to pose new algorithmic challenges to computer algebra, which in turn will deserve a formal-certification effort. Finally, DDMF is the motivation and the showcase of our progress on the certification of these computations. While striving to provide a formal guarantee of the correctness of the information it displays, we remain keen on enriching its mathematical content by developing new computer-algebra algorithms.

Our formalization effort consists in organizing a cooperation between a computer-algebra system and a proof assistant. The computer-algebra system is used to efficiently produce algebraic data, which are later processed by the proof assistant. The success of this cooperation relies on the design of appropriate libraries of formalized mathematics, including certified implementations of certain computer-algebra algorithms. Conversely, we expect that scrutinizing the implementation and the output of computer-algebra algorithms will shed new light on their semantics and on their correctness proofs, and help clarify their documentation.

The appropriate framework for the study of efficient algorithms for
special functions is *algebraic*.
Representing algebraic theories as Coq formal libraries
benefits from the methodology that emerged from the success of
ambitious projects like the formal proof of a major classification
result in finite-group theory (the Odd Order Theorem).

Yet, a number of the objects we need to formalize in the present context have never been investigated using any interactive proof assistant, despite being considered commonplace in computer algebra. For instance, to our knowledge there is no available formalization of the theory of non-commutative rings, of the algorithmic theory of special-function closures, or of the asymptotic study of special functions. We expect our future formal libraries to prove broadly reusable in later formalizations of seemingly unrelated theories.

Another peculiarity of the mathematical objects we are going to manipulate
with the Coq system is their size. In order to provide a formal guarantee
on the data displayed by DDMF, two related axes of research have to be
pursued.
First, efficient algorithms dealing with these large objects have
to be programmed and run in Coq.
Recent evolutions of the Coq system to improve the efficiency of
its internal computations make this objective
reachable. Still, how to combine the aforementioned formalization
methodology with these cutting-edge evolutions of Coq remains
one of the prospective aspects of our project.
A second need is to help users *interactively*
manipulate large expressions occurring in their conjectures, an objective
for which little has been done so far. To address this need,
we work on improving the ergonomics of the system
in two ways: first, improving the responsiveness of Coq in its interaction
with the user; second, designing and implementing extensions of its
interface to ease our formalization activity. We expect the outcome of
these lines of research to be useful to a wider audience, interested in
manipulating large formulas on topics possibly unrelated to special functions.

Our algorithm certifications inside Coq intend to simulate
well-identified components of our Maple packages, possibly by
reproducing them in Coq. It would however not be judicious to
re-implement them inside Coq in a systematic way. Indeed, for a number of these
components, the output of the algorithm is more easily checked than
found; solving a linear system is a typical instance.
In such cases, we rather delegate the discovery of the solutions to an
external, untrusted oracle like Maple. Trusted computations inside
Coq then formally validate the correctness of the a priori
untrusted output. More often than not, this validation consists in
implementing and executing normalization procedures *inside*
Coq. A challenge of this automation is to make sure that these procedures
scale while remaining efficient, which requires a Coq version of
non-trivial computer-algebra algorithms. A first, archetypal example we expect to
work on is a non-commutative generalization of the normalization
procedure for ring elements.
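The following Coq fragment is a deliberately minimal sketch of this pattern on the linear-system instance, not an excerpt from our actual development; all names (`dot`, `checks`, the concrete system) are illustrative.

```coq
(* Minimal sketch of the oracle/checker pattern: an untrusted oracle *)
(* (e.g. Maple) proposes a solution to a linear system; the checker  *)
(* below, run inside Coq, validates the claim by mere computation.   *)
Require Import ZArith List.
Import ListNotations.
Open Scope Z_scope.

(* Dot product of two integer vectors (extra entries are ignored).   *)
Fixpoint dot (u v : list Z) : Z :=
  match u, v with
  | x :: u', y :: v' => x * y + dot u' v'
  | _, _ => 0
  end.

(* The checker: does the candidate x satisfy A x = b, row by row?    *)
Definition checks (A : list (list Z)) (b x : list Z) : bool :=
  forallb (fun row_rhs => Z.eqb (dot (fst row_rhs) x) (snd row_rhs))
          (combine A b).

(* The oracle claims x = (1, 2) solves  x1 + x2 = 3,  2 x1 - x2 = 0. *)
Lemma oracle_validated :
  checks [[1; 1]; [2; -1]] [3; 0] [1; 2] = true.
Proof. reflexivity. Qed.
```

The final lemma is closed by `reflexivity` alone, that is, by computation inside the logic, so no trust in the oracle is required.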

Generally speaking, we design algorithms for manipulating special functions symbolically, whether univariate or with parameters, and for extracting algorithmically any kind of algebraic and analytic information from them, notably asymptotic properties. Beyond this, the heart of our research is concerned with parametrised definite summations and integrations. These very expressive operations have far-ranging applications, for instance, to the computation of integral transforms (Laplace, Fourier) or to the solution of combinatorial problems expressed via integrals (coefficient extractions, diagonals). The algorithms that we design for them need to operate at the level of linear functional systems, both differential and of recurrence type. In all cases, we strive to design our algorithms with the constant goal of good theoretical complexity, and we observe that our algorithms are also fast in practice.

Our long-term goal is to design fast algorithms for a general method
for special-function integration (*creative telescoping*), and
make them applicable to general special-function inputs. Still, our
strategy is to proceed with simpler, more specific classes first
(rational functions, then algebraic functions, hyperexponential
functions, D-finite functions, non-D-finite functions; two variables,
then many variables); in addition, we isolate analytic questions by
first considering types of integration with a more purely algebraic
flavor (constant terms, algebraic residues, diagonals arising in
combinatorics). In particular, we expect to extend our recent
approach to more general classes
(algebraic functions with nested radicals, for example): the idea is to speed up
calculations by making use of an analogue of Hermite reduction that avoids
considering certificates.
Homologous problems for summation will be addressed as well.

As a consequence of our complexity-driven approach to algorithm design, the algorithms mentioned in the previous paragraph have good complexity. Therefore, they naturally help us deal with applications that involve equations of high order and large size.

With regard to combinatorics, we expect to advance the algorithmic classification of combinatorial classes like walks and urns. Here, the goal is to determine whether enumerative generating functions are rational, algebraic, or D-finite, for example. Physical problems whose modelling involves special-function integrals include the study of models of statistical mechanics, like the Ising model of ferromagnetism, or questions related to Hamiltonian systems.

Number theory is another promising domain of applications. Here, we attempt an experimental approach to the automated certification of integrality of the coefficients of mirror maps for Calabi–Yau manifolds. This could also involve the discovery of new Calabi–Yau operators and the certification of the existing ones. We also plan to algorithmically discover and certify new recurrences yielding good approximants needed in irrationality proofs.

It is to be noted that in all of these application domains, we have so far used general algorithms, as was done in earlier works of ours. To push the scale of applications further, we plan to consider in each case the specifics of the application domain to tailor our algorithms.

In continuation of our past project of an encyclopedia at http://, we plan to extend DDMF with:

the algorithmic discussion of equations with parameters, leading to certified automatic case analysis based on arithmetic properties of the parameters;

lists of summation and integral formulas involving special functions, including validity conditions on the parameters;

guaranteed large-precision numerical evaluations.

Computer algebra manipulates symbolic representations of exact mathematical objects in a computer, in order to perform computations and operations like simplifying expressions and solving equations for “closed-form expressions”. The manipulations are often fundamentally of an algebraic nature, even when the ultimate goal is analytic. Efficiency is a particularly acute issue in computer algebra, owing to the extreme swell of intermediate values during calculations.

Our view on the domain is that research on the algorithmic manipulation of special functions is anchored between two paradigms:

adopting linear differential equations as the right data structure for special functions,

designing efficient algorithms in a complexity-driven way.

It aims at four kinds of algorithmic goals:

algorithms combining functions,

the solving of functional equations,

multi-precision numerical evaluations,

guessing heuristics.

This interacts with three domains of research:

computer algebra, meant as the search for quasi-optimal algorithms for exact algebraic objects,

symbolic analysis/algebraic analysis,

experimental mathematics (combinatorics, mathematical physics, ...).

This view is made explicit in the present section.

Numerous special functions satisfy linear differential and/or
recurrence equations. Under a mild technical condition, the existence
of such equations induces a finiteness property that makes the main
properties of the functions decidable. We thus speak of
*D-finite functions*. For example, 60 % of the chapters in the
handbook describe D-finite functions.
In addition, the class is closed under a rich set of algebraic operations.
This makes linear functional equations just the right data structure
to encode and manipulate special functions. The power of this
representation was observed in the early
1990s, leading to the design of many
algorithms in computer algebra.
Both on the theoretical and algorithmic sides, the study of D-finite
functions shares much with neighbouring mathematical domains:
differential algebra,
D-module theory,
differential Galois theory,
as well as their counterparts for recurrence equations.
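As a minimal worked instance of these closure properties: $f(x) = e^x$ satisfies the first-order equation $f' = f$, and $g(x) = \cos x$ the second-order equation $g'' + g = 0$; the sum $h = f + g$ is then guaranteed to satisfy a linear ODE of order at most $1 + 2 = 3$, here concretely

$$h''' - h'' + h' - h = 0,$$

the equation associated with the operator $(D - 1)(D^2 + 1)$. The algorithms alluded to here compute such an equation for $h$ directly from the equations for $f$ and $g$, without solving any of them.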

Differential/recurrence equations that define special functions can be
recombined to define: additions and
products of special functions; compositions of special functions;
integrals and sums involving special functions. Zeilberger's fast
algorithm for obtaining recurrences satisfied by parametrised binomial
sums was developed as early as the beginning of the 1990s.
It is the basis of all modern definite summation and integration
algorithms. The theory was made fully rigorous and algorithmic in
later works, mostly by a group at RISC (Linz, Austria) and by members
of the team.
The past ÉPI Algorithms contributed several implementations
(*gfun*, *Mgfun*).

Encoding special functions by their defining linear functional equations postpones some of the difficulty of the problems to a later solving of equations. At the same time, solving (for special classes of functions) is a sub-task of many algorithms on special functions, especially so when solving in terms of polynomial or rational functions. Much work was done in this direction in the 1990s; since the 2000s, solving differential and recurrence equations in terms of special functions has also been investigated more intensively.

A major conceptual and algorithmic difference exists for numerical
calculations between data structures that fit on a machine word and
data structures of arbitrary length, that is, *multi-precision*
arithmetic. When multi-precision floating-point numbers became
available, early works on the evaluation of special functions merely
promised that “most” digits in the output were correct, proceeding
by heuristically increasing the precision during intermediate
calculations, without intended rigour. The original theory
has evolved in a
twofold way since the 1990s:
by making computable all constants hidden in asymptotic
approximations, it became possible to guarantee a *prescribed*
absolute precision; by employing state-of-the-art algorithms on
polynomials, matrices, etc, it became possible to have evaluation
algorithms in a time complexity that is linear in the output size, with a
constant that is not more than a few units.
On the implementation side, several original works
exist, one of which (*NumGfun*) is
used in our DDMF.
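As a toy instance of the first point, truncating the Taylor series of $e = \exp(1)$ comes with a fully explicit tail bound,

$$\Bigl|\, e - \sum_{k=0}^{N} \frac{1}{k!} \,\Bigr| \;\le\; \frac{2}{(N+1)!},$$

so a *prescribed* absolute error $2^{-p}$ is guaranteed as soon as $(N+1)! \ge 2^{p+1}$. The substance of the modern theory is to make such explicit bounds computable for arbitrary D-finite functions, whose asymptotic behaviour is far less immediate.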

“Differential approximation”, or “guessing”, is an operation that produces an ODE likely to be satisfied by a given approximate series expansion of an unknown function. It has been used at least since the 1970s and is a keystone of spectacular applications in experimental mathematics. All this is based on subtle algorithms for Hermite–Padé approximants. Moreover, guessing can at times be complemented by proven quantitative results that turn the heuristics into an algorithm. This is a promising algorithmic approach that deserves more attention than it has received so far.
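To make the operation concrete on a toy example: given only the initial terms $1, 1, 2, 3, 5, 8, 13, \dots$ of an unknown sequence $(u_n)$, guessing finds the smallest candidate recurrence consistent with the data,

$$u_{n+2} = u_{n+1} + u_n,$$

equivalently the candidate relation $(1 - x - x^2)\,f(x) = 1$ for the generating function $f$. At this stage the relation is only a plausible conjecture, matching the available data; the quantitative results mentioned above are what can turn such a guess into a theorem.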

The main concern of computer algebra has long been to prove the feasibility of a given problem, that is, to show the existence of an algorithmic solution for it. However, with the advent of faster and faster computers, complexity results have ceased to be of theoretical interest only. Nowadays, a large body of work in computer algebra aims at developing fast algorithms, with time complexity as close as possible to linear in their output size. Now that most of the more pervasive objects, like integers, polynomials, and matrices, have been endowed with fast algorithms for the main operations on them, the community, including ourselves, has turned its attention to differential and recurrence objects, starting in the 2000s. The subject is still not as developed as in the commutative case, and a major challenge remains to understand the combinatorics behind summation and integration. On the methodological side, several paradigms occur repeatedly in fast algorithms: “divide and conquer” to balance calculations, “evaluation and interpolation” to avoid intermediate swell of data, etc.
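Schematically, the “divide and conquer” paradigm splits a size-$n$ problem into two half-size subproblems recombined by fast multiplication, leading to cost recurrences of the shape

$$T(n) = 2\,T(n/2) + O(M(n)),$$

which solve to $O(M(n)\log n)$ when the multiplication cost $M$ is quasi-linear; computing a product of $n$ small factors by a balanced binary tree rather than one factor at a time is the archetypal instance of this balancing.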

Handbooks collecting mathematical properties aim at serving as
reference, therefore trusted, documents. The decision of
several authors or maintainers of such knowledge bases to move from paper
books to websites and wikis reflects both the need to keep these references
up to date and the opportunity to enrich them with dynamic content.
Several attempts have been made to extend existing computer-algebra systems with symbolic manipulations of logical formulas. Yet, these works are more about extending the expressivity of computer-algebra systems than about improving the standards of correctness and semantics of the systems. Conversely, several projects have addressed the communication of a proof system with a computer-algebra system, resulting in increased automation available in the proof system, at the price of uncertainty about the computations performed by this oracle.

More ambitious projects have tried to design a new computer-algebra system providing an environment where the user can both program efficiently and elaborate formal, machine-checked proofs of correctness, by calling a general-purpose proof assistant like the Coq system. This approach requires huge manpower and a daunting effort to re-implement a complete computer-algebra system, as well as the libraries of formal mathematics required by such formal proofs.

The move to machine-checked proofs of the mathematical correctness of the output of computer-algebra implementations demands a prior clarification of the often implicit assumptions on which the presumably correctly implemented algorithms rely. Interestingly, this preliminary work, which could be considered independent of any formal-certification project, is seldom made precise or even available in the literature.

A number of authors have investigated ways to organize the communication of a chosen computer-algebra system with a chosen proof assistant in order to certify specific components of the computer-algebra systems, experimenting with various combinations of systems and various formats for mathematical exchanges. Another line of research consists in the implementation and certification of computer-algebra algorithms inside the logic, or as a proof-automation strategy. Normalization algorithms are of special interest when they make it possible to check results obtained by an external computer-algebra oracle. A discussion about the systematic separation of the search for a solution and the checking of that solution has already been clearly outlined in the literature.

Significant progress has been made in the certification of numerical applications by formal proofs. Libraries formalizing and implementing floating-point arithmetic as well as large numbers and arbitrary-precision arithmetic are available. These libraries are used to certify floating-point programs, implementations of mathematical functions, and applications like hybrid systems.

To be checked by a machine, a proof needs to be expressed in a constrained, relatively simple formal language. Proof assistants provide facilities to write proofs in such languages. But, since a text written in a formal language does not per se constitute a formal proof, proof assistants also provide a proof checker: a small and well-understood piece of software in charge of verifying the correctness of arbitrarily large proofs. The gap between the low-level formal language a machine can check and the sophistication of an average page of mathematics is conspicuous and unavoidable. Proof assistants try to bridge this gap by offering facilities, like notations or automation, that support convenient formalization methodologies. Indeed, many aspects, from the logical foundation to the user interface, play an important role in the feasibility of formalized mathematics inside a proof assistant.

While many logical foundations for mathematics have been proposed, studied, and implemented, type theory is the one that has been most successfully employed to formalize mathematics, with the notable exception of the Mizar system, which is based on set theory. In particular, the calculus of constructions (CoC) and its extension with inductive types (CIC) have been studied for more than 20 years and implemented by several independent tools (like Lego, Matita, and Agda). Its reference implementation, Coq, has been used for several large-scale formalization projects (formal certification of a compiler back-end; the Four-Color Theorem). Improving the type theory underlying the Coq system remains an active area of research. Other systems based on different type theories do exist and, whilst being more oriented toward software verification, have also been used to verify results of mainstream mathematics (the prime-number theorem; the Kepler Conjecture).

The most distinguishing feature of CoC is that computation is promoted to the status of rigorous logical argument. Moreover, in its extension CIC, we can recognize the key ingredients of a functional programming language like inductive types, pattern matching, and recursive functions. Indeed, one can program effectively inside tools based on CIC like Coq. This possibility has paved the way to many effective formalization techniques that were essential to the most impressive formalizations made in CIC.

Another milestone in the promotion of the computations-as-proofs feature of Coq has been the integration of compilation techniques in the system to speed up evaluation. Coq can now run realistic programs in the logic, and hence easily incorporates calculations into proofs that demand heavy computational steps.
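A toy illustration of this feature, assuming only the standard library: the proof obligation below is discharged by actually running the computation with Coq's compiled evaluator.

```coq
(* The equality below is proved by running the computation inside    *)
(* the logic: vm_compute evaluates the goal with the bytecode VM.    *)
Require Import ZArith.
Open Scope Z_scope.

Goal 2 ^ 64 = 18446744073709551616.
Proof. vm_compute. reflexivity. Qed.
```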

Because of their different choice of underlying logic, other proof assistants have to simulate computations outside the formal system, and indeed fewer attempts to formalize mathematical proofs involving heavy calculations have been made in these tools. The only notable exception, the Kepler Conjecture, completed in 2014, required significant work to optimize the rewriting engine that simulates evaluation in Isabelle/HOL.

Programs run and proved correct inside the logic are especially useful for the conception of automated decision procedures. To this end, inductive types are used as an internal language for the description of mathematical objects by their syntax, thus enabling programs to reason and compute by case analysis and recursion on symbolic expressions.
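A minimal sketch of this idea, with no pretence of matching our actual libraries: an inductive type reflects the syntax of expressions, and programs compute on that syntax by recursion, with a once-and-for-all correctness proof.

```coq
(* Syntax of a tiny expression language, as an inductive type.       *)
Inductive expr : Type :=
  | Cst : nat -> expr
  | Add : expr -> expr -> expr.

(* Interpretation of the syntax back into actual numbers.            *)
Fixpoint denote (e : expr) : nat :=
  match e with
  | Cst n => n
  | Add a b => denote a + denote b
  end.

(* A (deliberately trivial) normalization procedure on the syntax... *)
Definition norm (e : expr) : expr := Cst (denote e).

(* ...proved correct once and for all; real decision procedures      *)
(* follow the same pattern with richer syntax and normal forms.      *)
Lemma norm_sound : forall e : expr, denote (norm e) = denote e.
Proof. intros e. reflexivity. Qed.
```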

The output of complex and optimized programs external
to the proof assistant can also be stamped with a formal proof of
correctness when their result is easier to *check* than to
*find*. In that case, one can benefit from their efficiency
without compromising the level of confidence in their output, at the
price of writing and certifying a
checker inside the logic. This approach, which has been successfully
used in various contexts,
is very relevant to the present research project.

Representing abstract algebra in a proof assistant has long been studied. The libraries developed by the MathComp project for the proof of the Odd Order Theorem provide a rather comprehensive hierarchy of structures; however, they feature from the start a large number of instances of structures that need to be organized. On the methodological side, this hierarchy is an incarnation of an original approach based on various mechanisms, primarily type inference, typically employed in the area of programming languages. A large amount of information that is implicit in handwritten proofs, and that must become explicit at formalization time, can be systematically recovered by following this methodology.

The MathComp library was consistently designed after uniform principles of software engineering. These principles range from simple ones, like naming conventions, to more advanced ones, like generic programming, resulting in a robust and reusable collection of formal mathematical components. This large body of formalized mathematics covers a broad range of algebraic theories, including of course advanced topics of finite group theory, but also linear algebra, commutative algebra, Galois theory, and representation theory. We refer the interested reader to the online documentation of these libraries, which represent about 150,000 lines of code and include roughly 4,000 definitions and 13,000 theorems.

Topics not addressed by these libraries that might be relevant to the present project include real analysis and differential equations. The most advanced formalization work in these domains is available in the HOL-Light system, although some developments of interest are also available for Coq. Another aspect of the MathComp libraries that needs improvement, owing to the size of the data we manipulate, is the connection with efficient data structures and implementations, which is only starting to be explored.
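The following fragment is a drastically simplified caricature of the “structures as records, instances recovered by type inference” methodology mentioned above; the actual MathComp hierarchy is vastly richer, and all names here are illustrative.

```coq
(* A toy algebraic structure packaged as a dependent record.         *)
Require Import ZArith.

Record monoid := Monoid {
  carrier : Type;
  mop     : carrier -> carrier -> carrier;
  mone    : carrier
}.

(* Registering (Z, +, 0) lets unification recover the structure      *)
(* from the bare type Z.                                             *)
Canonical Structure Z_add_monoid := Monoid Z Z.add 0%Z.

(* A generic operation, with the structure argument left implicit.   *)
Definition twice {M : monoid} (x : carrier M) : carrier M := mop M x x.

(* Type inference infers M := Z_add_monoid from the type of 21%Z.    *)
Compute twice 21%Z.  (* evaluates to 42 *)
```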

The user of a proof assistant describes the proof they want to formalize in the system using a textual language. Depending on the peculiarities of the formal system and the application domain, different proof languages have been developed. Some proof assistants promote the use of a declarative language, whereas the Coq and Matita systems are more oriented toward a procedural style.

The development of the large, consistent body of MathComp libraries has prompted the design of an alternative and coherent language extension for the Coq proof assistant, enforcing the robustness of proof scripts against the numerous changes induced by code refactoring and enhancing the support for the methodology of small-scale reflection.

The development of large libraries is quite a novelty for the Coq system. In particular, any long-term development process requires the iteration of many refactoring steps, for which very little support is provided by most proof assistants, with the notable exception of Mizar. For the Coq system, this is an active area of research.

Our expertise in computer algebra and complexity-driven design of algebraic algorithms has applications in various domains, including:

combinatorics, especially the study of combinatorial walks,

theoretical computer science, for instance through the study of automatic sequences,

number theory, by the analysis of the nature of so-called periods.

*Dynamic Mathematics on the Web*

Functional Description: Programming tool for controlling the generation of mathematical websites that embed dynamic mathematical content generated by computer-algebra calculations. Implemented in OCaml.

Participants: Alexis Darrasse, Frédéric Chyzak and Maxence Guesdon

Contact: Frédéric Chyzak

*Encyclopedia of Combinatorial Structures*

Functional Description: On-line mathematical encyclopedia with an emphasis on sequences that arise in the context of decomposable combinatorial structures, with the possibility to search by the first terms in the sequence, keyword, generating function, or closed form.

Participants: Alexis Darrasse, Frédéric Chyzak, Maxence Guesdon and Stéphanie Petit

Contact: Frédéric Chyzak

URL: http://

*Dynamic Dictionary of Mathematical Functions*

Functional Description: Web site consisting of interactive tables of mathematical formulas on elementary and special functions. The formulas are automatically generated by OCaml and computer-algebra routines. Users can ask for more terms of the expansions, more digits of the numerical values, proofs of some of the formulas, etc.

Participants: Alexandre Benoit, Alexis Darrasse, Bruno Salvy, Christoph Koutschan, Frédéric Chyzak, Marc Mezzarobba, Maxence Guesdon, Stefan Gerhold and Thomas Gregoire

Contact: Frédéric Chyzak

*multivariate generating functions package*

Functional Description: The Mgfun Project is a collection of packages for the computer-algebra system Maple, intended for the symbolic manipulation of a large class of special functions and combinatorial sequences (in one or several variables and indices) that appear in many branches of mathematics, mathematical physics, and engineering sciences. Members of the class satisfy a crucial finiteness property which makes the class amenable to computer-algebra methods; the class also enjoys numerous algorithmic closure properties, including algorithmic closures under integration and summation.

Contact: Frédéric Chyzak

Keyword: Proof assistant

Scientific Description: Ssreflect is a tactic language that helps write concise and uniform tactic-based proof scripts for the Coq system. It was designed during the proofs of the Four-Color Theorem and the Feit–Thompson (Odd Order) Theorem.

Functional Description: Ssreflect is a tactic language extension to the Coq system, developed by the Mathematical Components team.

News Of The Year: In 2019, we extended the intro-pattern functionality of Ssreflect and added support for working under binders using the "under" tactical.

Participants: Assia Mahboubi, Cyril Cohen, Enrico Tassi, Georges Gonthier, Laurence Rideau, Laurent Théry and Yves Bertot

Contact: Yves Bertot

*Mathematical Components library*

Keyword: Proof assistant

Functional Description: The Mathematical Components library is a set of Coq libraries that cover the prerequisites for the mechanization of the proof of the Odd Order Theorem.

Release Functional Description: This release is compatible with Coq 8.9 and Coq 8.10. It adds many theorems about finite functions, prime numbers, sequences, finite types, big operators, natural numbers, and cycles in graphs.

Participants: Alexey Solovyev, Andrea Asperti, Assia Mahboubi, Cyril Cohen, Enrico Tassi, François Garillot, Georges Gonthier, Ioana Pasca, Jeremy Avigad, Laurence Rideau, Laurent Théry, Russell O'Connor, Sidi Ould Biha, Stéphane Le Roux and Yves Bertot

Contact: Assia Mahboubi

In 1994, Becker conjectured that if

A previous article described explicit expressions for the coefficients of the
subresultants under study; the present work shows that they can be computed in
*linear* arithmetic complexity, which is faster than for arbitrary polynomials.
The result is obtained as a consequence of the amazing though seemingly
unnoticed fact that these subresultants are scalar multiples of Jacobi
polynomials up to an affine change of variables.

Alin Bostan, together with Jordan Tirrell (Washington College, USA), Bruce W. Westbury (University of Texas at Dallas, USA), and Yi Zhang (Xi'an Jiaotong-Liverpool University, Suzhou, China), studied two families of sequences, listed in the On-Line Encyclopedia of Integer Sequences (OEIS), which are associated to the invariant theory of Lie algebras. For the first family, they proved combinatorially that the sequences A059710 and A108307 are related by a binomial transform. Based on this, they presented two independent proofs of a recurrence equation for A059710, which was conjectured by Mihailovs. They also gave a direct proof of Mihailovs’ conjecture by the method of algebraic residues. As a consequence, closed formulae for the generating function of sequence A059710 were obtained in terms of classical Gaussian hypergeometric functions.

If a linear differential operator with rational function coefficients is
reducible, its factors may have coefficients with numerators and denominators
of very high degree. When the base field is

Alin Bostan contributed an appendix to an article by F. Chapoton,
which allowed the author to complete his article. The
theme of the article is the study of simplicial complexes in
algebraic combinatorics. A basic invariant is the

Alin Bostan contributed to an article by Cédric Boutillier and
Kilian Raschel, devoted to the study of random walks on
isoradial graphs. Contrary to the lattice case, isoradial graphs are not
translation invariant, do not admit any group structure and are spatially
non-homogeneous. However, Boutillier and Raschel have been able to obtain
analogues of a celebrated result by Ney and Spitzer (1966) on the so-called
*Martin kernel* (ratio of Green functions started at different points).
Alin Bostan provided in the Appendix two different proofs of the fact that
some algebraic power series arising in this context have non-negative
coefficients.

In the second edition of the book,
original methods were proposed to determine the invariant measure of random walks in the quarter plane with small jumps (size 1),
the general solution being obtained via reduction to boundary value problems.
Among other things, an important quantity, the so-called *group of the walk*,
allows one to deduce theoretical features about the nature of the solutions. In particular,
when the order of the group is finite and the underlying algebraic curve is of genus 0 or 1,
necessary and sufficient conditions have been given for the solution to be rational or algebraic.
Applications include queueing theory (e.g., *Jackson networks*)
and explicit solutions of functional equations for counting lattice walks.

How many operations do we need on the average to compute an approximate root of
a random Gaussian polynomial system? Beyond Smale's 17th problem, which asked
whether a polynomial bound is possible,
Pierre Lairez has proved
a quasi-optimal bound by introducing *rigid continuation paths*. The central idea is to consider rigid motions of the equations rather than line segments in the linear space of all polynomial systems. This leads to a better average condition number and allows for bigger steps.
He showed that on the average,
one approximate root of a random Gaussian polynomial system of
In 2019, the article was accepted for publication in the Journal of the AMS.

Let

Their algorithm relies on the relationship between volumes of semi-algebraic sets and periods of rational integrals. It makes use of algorithms computing the Picard-Fuchs differential equation of appropriate periods, properties of critical points, and high-precision numerical integration of differential equations.

The algorithm runs in essentially linear time with respect to

A small subset of combinatorial sequences have coefficients that can be
represented as the moments of a nonnegative measure on the half-line; such
sequences are known as *Stieltjes moment sequences*. They have a number
of useful properties, such as log-convexity, which in turn enables one to
rigorously bound their growth constant from below.
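To spell out the last point: log-convexity, $u_n u_{n+2} \ge u_{n+1}^2$, follows from the moment representation by the Cauchy–Schwarz inequality; it implies that the ratios $u_{n+1}/u_n$ are nondecreasing, so that every computed ratio is a rigorous lower bound on the growth constant,

$$\mu \;=\; \lim_{n \to \infty} u_n^{1/n} \;\ge\; \frac{u_{n+1}}{u_n} \quad \text{for every fixed } n.$$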

They showed that the densities for

As a bonus, they studied the challenging case of the

*De rerum natura*.
This project, set up by the team, was accepted this year and will be funded until 2023.
It gathers over 20 experts from four fields:
computer algebra;
the Galois theories of linear functional equations;
number theory;
combinatorics and probability.
Our goal is to obtain classification algorithms for number theory and combinatorics,
particularly so for deciding irrationality and transcendence.

Alin Bostan, together with Marc Mezzarobba (CNRS, Sorbonne Université) and Tanguy Rivoal (CNRS, Université Grenoble-Alpes), held a “research in pairs” session on the fast computation of values of D-finite functions, from December 2 to 6, 2019, at CIRM (Luminy, France). The aim of the joint project was to investigate the implications of arithmetic properties of linear differential equations for the computational complexity of their numerical solutions. They focused on E- and G-functions, which are power series solutions of differential equations that additionally satisfy strong arithmetic conditions and play a major role in Diophantine approximation. The main goal of this research session was to understand several remarks, given without proof by Chudnovsky and Chudnovsky in the late 1980s, stating that number-theoretic properties could lead to slightly better complexity bounds for E- and G-functions than in the general case.

Pierre Lairez supervised Abhijit Balachandra, an M1-level student from the Indian Institute of Science (Bangalore), for two months. They studied some new aspects of the numerical computation of the topology of complex algebraic surfaces.

Alin Bostan is a member of the scientific advisory board of the conference
series *Effective Methods in Algebraic Geometry* (MEGA).

Alin Bostan is part of the scientific committee of the GDR EFI (“Functional Equations and Interactions”), attached to the mathematical institute (INSMI) of the CNRS. The goal of this GDR is to bring together various research communities in France working on functional equations, in computer science as well as in mathematics.

Frédéric Chyzak is member of the steering committee of the
*Journées Nationales de Calcul Formel* (JNCF),
the annual meeting of the French computer algebra community.

Frédéric Chyzak was until July 2019
elected member (and chair) of the steering committee
of the *International Symposium on Symbolic and Algebraic Computation*
(ISSAC, 3-year term).

Georges Gonthier is a member of the steering committee of
the *Certified Programs and Proofs* Conference (CPP).

Alin Bostan co-organizes, with Lucia Di Vizio, the
*Séminaire Différentiel* between U. Versailles and Inria Saclay, with a
bi-annual frequency.

Alin Bostan co-organizes, with Lucia Di Vizio, the working group
*Marches dans le quart de plan* (“Walks in the quarter plane”),
at Institut Henri Poincaré (Paris), with a bi-monthly frequency.

Alin Bostan and Frédéric Chyzak have served as conference program committee members
for the first *Maple Conference*.

Georges Gonthier has served as a conference program committee member for the first
*Workshop on Formal Methods for Blockchains* (FMBC).

Frédéric Chyzak has served as reviewer for the selection of the international conferences CICM 2019, ISSAC 2019, and Maple Conference 2019.

Alin Bostan has served as reviewer for the selection of the international conferences FPSAC 2019 and Maple Conference 2019.

Alin Bostan is on the editorial board of the *Journal of Symbolic Computation*.

Alin Bostan is on the editorial board of the *Annals of Combinatorics*.

Guy Fayolle is associate editor of the journal *Markov Processes and Related Fields*.

Georges Gonthier is on the editorial board of the *Journal of Formalized Reasoning*.

Alin Bostan has served as a reviewer for the journals:
*Journal of Symbolic Computation*,
*Journal of Combinatorial Theory, Series A*,
*Applicable Algebra in Engineering Communications and Computing*,
*Journal of Combinatorial Algebra*,
*Annales Henri Lebesgue*,
*Annali dell'Università di Ferrara*,
*Mathematics of Computation*,
*Séminaire Lotharingien de Combinatoire*.

Guy Fayolle has been a reviewer for *Advances in Applied Probability*, *Markov Processes and Related Fields*, *Probability Theory and Related Fields*, *Queueing Systems: Theory and Applications*, *European Journal of Combinatorics*, *Journal of Statistical Physics*, *Physica A*, *Springer Science*.

Frédéric Chyzak has been invited to give a talk on his joint work with Alin Bostan about the enumeration of walks with small steps in the quarter plane at the international conference Transient Transcendence in Transylvania (Brașov, Romania).

Frédéric Chyzak has been invited to give talks on his joint work with Philippe Dumas about Becker's conjecture on Mahler functions: at the conference Équations Fonctionnelles et Interactions (Anglet), during a Seminar on Symbolic Computation at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences (Beijing, China), and at the conference Differential Galois Theory in Strasbourg (Strasbourg).

Frédéric Chyzak has been invited to give a talk on his joint work with Alin Bostan, Pierre Lairez, and Bruno Salvy (AriC) at the 6th Summer School in Symbolic Computation (Chongqing, China).

Alin Bostan has been invited to give a talk at
*Algebraic Marvels in Differential Equations*,
Universidade de Lisboa, Lisbon, Portugal, February 2019.

Alin Bostan has been invited to give a talk at the Combinatorics Seminar, LaBRI, Bordeaux, March 2019.

Alin Bostan has been a plenary speaker at the international conference AofA 2019, Luminy (France), June 2019.

Alin Bostan has been a plenary speaker at the international conference FPSAC 2019, Ljubljana (Slovenia), July 2019.

Alin Bostan has been invited to give a series of five lectures at the Vienna Summer School of Mathematics, Weissensee, Austria, Sept. 2019.

The team organizes a regular seminar, with roughly 10 talks a year. The topics reflect the team's interests: computer algebra, combinatorics, number theory, formal proofs, and related domains.

In 2018, we set up a working group
*Marches dans le quart de plan* around the study of walks in the quarter
plane, a very active research topic in probability theory and enumerative
combinatorics in recent years. The working group meets at Institut Henri
Poincaré, at a rate of two sessions per month. The original purpose was to read
the article “On the Nature of the Generating Series of Walks in the Quarter
Plane” by T. Dreyfus, C. Hardouin, J. Roques, and M. Singer, published in Invent.
Math. this year. But the reality exceeded expectations: the working group
attracted a dozen people, working either in computer science or in pure
mathematics, who began to interact, and a very good dynamic was created.
Altogether, 15 sessions took place in 2019.

Together with Kilian Raschel (CNRS, U. Tours), Alin Bostan co-organized an international conference, Transient Transcendence in Transylvania, held in Romania from May 13 to 17, 2019. Together they took care of all the organization of this conference: program, invitations, web page, etc. This conference was a unique event in Romania, with a truly exceptional list of speakers from several countries and continents: South Africa, Germany, Austria, Canada, the United States, France, the Netherlands, Poland, and, of course, Romania. As a natural continuation of the conference, a volume will be published in the Springer collection PROMS (Proceedings in Mathematics & Statistics), with Bostan and Raschel as editors.

Guy Fayolle is scientific advisor and associate researcher at the *Robotics Laboratory of Mines ParisTech*.

Georges Gonthier is taking part in an interministerial survey on the technological roadblocks for blockchains, which was jointly commissioned to Inria, CEA, and IMT by the Ministère de l'Economie, the Ministère de l'Enseignement supérieur et de la Recherche, and the Secrétariat d'Etat au Numérique. He also participates in the Blockchain Taskforce set up by the French government.

Frédéric Chyzak is project coordinator of the ANR project
*De rerum natura*.

Guy Fayolle is a member of the working group for *Computer System Modeling* of the *International Federation for Information Processing* (IFIP WG 7.3).

Georges Gonthier serves on the Conseil de l'École Doctorale de Mathématiques Hadamard.

**Master**:

Alin Bostan, *Algorithmes efficaces en calcul
formel*, 36h, M2, MPRI, France.

Alin Bostan, *Modern Algorithms for Symbolic
Summation and Integration*, 21h, M2, Master d'Informatique
Fondamentale de l'ENS de Lyon, France.

Frédéric Chyzak,
*Algorithmes efficaces en calcul formel*,
22.5h, M2, MPRI, France.

Pierre Lairez,
*Algorithmique avancée (INF550)*, TD,
18h, M2, École polytechnique, France.

Pierre Lairez,
*Les bases de la programmation et de l'algorithmique
(INF411)*, TD,
40h, M1, École polytechnique, France.

Alin Bostan has served as an examiner in the PhD jury of Robin Larrieu,
*Arithmétique rapide pour des corps finis*, École polytechnique, December 10, 2019.

Alin Bostan has served as a member of the monitoring PhD committee of Youssef Abdelaziz, Univ. Paris 6.

Alin Bostan has served as a member of the monitoring PhD committee of Manon Bertin, Univ. Rouen.

Frédéric Chyzak has served as a reviewer in the PhD jury of Joelle Saade,
*Méthodes symboliques pour les systèmes différentiels linéaires à singularité irrégulière*, Université de Limoges, November 5, 2019.

Frédéric Chyzak has served as a reviewer in the PhD jury of Amélie Trotignon,
*Marches sur des réseaux dans des cônes : aspects combinatoires et probabilistes*, Université de Tours, December 6, 2019.

Pierre Lairez has served as a reviewer in the PhD jury of Josué Tonelli-Cueto, *Condition and Homology in Semialgebraic Geometry*, TU Berlin, November 28, 2019.

Georges Gonthier has served in the PhD jury of Armaël Guéneau,
*Mechanized Verification of the Correctness and Asymptotic Complexity of Programs*, Université de Paris, December 16, 2019.

Georges Gonthier published an interview article *Blockchain: ce que c'est, comment ça marche* in La Recherche, **545**, March 2019.

Georges Gonthier co-wrote with Ivan Odonnat (Banque de France) *L'avenir du bitcoin et de la blockchain*, in Les Carnets de l'Institut Diderot (2019).