## Section: New Results

### Homogenization

Participants: Arnaud Anantharaman, Sébastien Boyaval, Ronan Costaouec, Claude Le Bris, Frédéric Legoll, Florian Thomines.

In collaboration with X. Blanc (Paris 6), C. Le Bris has studied the applicability of filtering ideas to the homogenization of elliptic partial differential equations. The idea is to modify the corrector problem by introducing a filtering function, in order to improve the efficiency of the method. Some popular methods, such as the oversampling method, can indeed be considered as special instances of such a general strategy. Encouraging numerical results, supported by a rigorous theoretical analysis, have been reported in [11] for periodic and quasi-periodic settings.
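For reference, in the periodic setting the (unmodified) corrector problem and the resulting homogenized tensor take the following standard form (textbook material, with notation introduced here for illustration; this is not a formula taken from [11]):

```latex
% Corrector problem in direction p (A periodic, Q the unit cell):
-\,\mathrm{div}\big( A(y)\,(p + \nabla w_p(y)) \big) = 0
\quad \text{in } \mathbb{R}^d, \qquad w_p \ \text{periodic},
% Homogenized tensor:
\qquad
A^\star p = \int_Q A(y)\,\big(p + \nabla w_p(y)\big)\, dy .
```

The filtering strategy mentioned above modifies this cell problem (for instance by weighting the averaging step with a filter function), which is the step whose efficiency the cited work aims to improve.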

The project-team has also pursued its efforts in the field of
stochastic homogenization of elliptic equations.
An interesting case in that context is when
the randomness comes as a *small* perturbation of the deterministic
case. This situation can indeed be handled with a dedicated approach,
which turns out to be far more efficient than the standard approach of
stochastic homogenization, as explained in [47].

This case has been studied by C. Le Bris, in collaboration with P.-L. Lions (Collège de France) and X. Blanc (Paris 6). The analysis naturally gives rise to a numerical strategy, which has been studied and implemented by R. Costaouec, C. Le Bris and F. Legoll [25].
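Schematically, with notation introduced here for illustration (not taken from the cited works), the coefficient is a small random perturbation of a periodic one, and the homogenized tensor inherits an expansion in the small parameter:

```latex
A_\eta(x,\omega) = A_{\mathrm{per}}(x) + \eta\, A_1(x,\omega), \quad \eta \ll 1,
\qquad
A^\star_\eta = A^\star_{\mathrm{per}} + \eta\, \bar{A}_1 + O(\eta^2),
```

where the first-order correction $\bar{A}_1$ is deterministic and can be computed from periodic-type problems only; this is the source of the computational gain over the general stochastic approach.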

In the work mentioned above, the perturbation to the deterministic case
is assumed to be small in the L^∞ norm (that is, it is almost
surely small). In [9], A. Anantharaman and
C. Le Bris have extended this study to the case when the perturbation is
small in a weaker norm (in particular, the
case when only the *expectation* of the perturbation is assumed to
be small, rather than the perturbation itself, is covered by that framework).
The approach proves to be very efficient from a computational
viewpoint. It is rigorously founded in a certain class of settings
and has been successfully numerically tested for more general
settings. A. Anantharaman and C. Le Bris, in collaboration with E. Cancès, have
started to address the theoretical issues related to these general
settings.

The team has also addressed, from a numerical viewpoint, the case when
the randomness is not small. In that case, standard
homogenization theory shows that the homogenized tensor, which is a
deterministic matrix, depends on the solution of a stochastic equation,
the so-called corrector problem, which is posed on the *whole*
space. This equation is therefore delicate and expensive to solve. In
practice, the space is truncated to some bounded domain, on
which the corrector problem is numerically solved. In turn, this yields a converging
approximation of the homogenized tensor, which happens to be a *random* matrix. For a given truncation of the space, R. Costaouec, C. Le
Bris and F. Legoll, in collaboration with X. Blanc (Paris 6), have
studied how to reduce the variance of this approximation. Several strategies
have been tested. Definite conclusions on the efficiency of the methods,
as well as on their range of applicability, are yet to be
obtained. Nonetheless, very encouraging numerical results have already
been obtained.
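The report does not specify which variance reduction strategies were tested. As one illustration of the generic idea, the following minimal sketch applies antithetic variates to a toy 1D analogue, where the truncated approximation of the homogenized coefficient reduces to a harmonic mean of i.i.d. per-cell coefficients; the model a_i = 1 + u_i with u_i ~ U(0,1) is a hypothetical choice made here for illustration only.

```python
import random
from statistics import mean, pvariance

def approx_Astar(us):
    """Truncated 1D homogenized coefficient: in 1D the homogenized
    coefficient is a harmonic mean, so its truncated approximation over
    n cells with coefficients a_i = 1 + u_i is the discrete harmonic mean.
    (The per-cell law u_i ~ U(0,1) is a hypothetical toy choice.)"""
    return len(us) / sum(1.0 / (1.0 + u) for u in us)

def mc_estimates(n_cells, n_samples, rng):
    plain, antithetic = [], []
    for _ in range(n_samples):
        us = [rng.random() for _ in range(n_cells)]
        plain.append(approx_Astar(us))
        # Antithetic pair: reuse the same draws, reflected u -> 1 - u.
        # Since the harmonic mean is monotone in each u_i, the two
        # evaluations are negatively correlated, so their average has a
        # smaller variance than an independent sample.
        antithetic.append(0.5 * (plain[-1] + approx_Astar([1.0 - u for u in us])))
    return plain, antithetic

rng = random.Random(0)
plain, anti = mc_estimates(10, 2000, rng)
# Both estimators target the same expectation; the antithetic one has a
# smaller empirical variance (at the price of two evaluations per sample).
print(mean(plain), pvariance(plain))
print(mean(anti), pvariance(anti))
```

In the actual setting each "evaluation" is a corrector-problem solve on the truncated domain, so the relevant comparison is variance per solve; the sketch only illustrates the mechanism.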

From a numerical perspective, the Multiscale Finite Element Method (MsFEM) is a classical strategy to address situations where the homogenized problem is not known (e.g. in nonlinear cases), or where the scale of the heterogeneities, although small, cannot be considered as zero (so that the homogenized problem is not an accurate enough model). The extension of this strategy to the stochastic case is currently being studied by F. Thomines, as the first stage of his PhD thesis, with X. Blanc (Paris 6), C. Le Bris and F. Legoll.
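In its simplest variant (textbook material, with notation introduced here), MsFEM replaces the standard P1 basis functions $\phi_i$ by oscillatory ones computed element by element:

```latex
% On each mesh element K, solve the local problem
-\,\mathrm{div}\big( A^\varepsilon(x)\, \nabla \phi_i^\varepsilon \big) = 0
\quad \text{in } K,
\qquad
\phi_i^\varepsilon = \phi_i \quad \text{on } \partial K,
% then compute the Galerkin approximation in span{ phi_i^eps }.
```

The local problems are independent of one another and inexpensive; in the stochastic case they must in addition be sampled over the realizations of $A^\varepsilon$, which is one motivation for studying the stochastic extension.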

Furthermore, still in the context of elliptic homogenization, S. Boyaval and C. Le Bris have studied the applicability of reduced-basis ideas to variational problems with stochastic parameters, in collaboration with Y. Maday (CNRS/UPMC/Brown), N.C. Nguyen and A.T. Patera (MIT). The motivation stems from the need to take into account many different random microstructures in the context of stochastic homogenization. One of the bottlenecks is that the solutions to a given partial differential equation for different stochastic parameters form a high-dimensional space. To address this difficulty, different approaches have been recently suggested in the literature on uncertainty quantification for stochastic partial differential equations. The combination of these approaches with the reduced-basis method has been tested and analyzed for a scalar (linear) elliptic problem with stochastic boundary conditions [15]. A state-of-the-art review article [14] on reduced basis techniques applied to stochastic problems has also been written.
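The reduced-basis method itself is not detailed in the report. The following self-contained sketch illustrates the generic offline greedy selection on a hypothetical parametrized 1D diffusion problem -(a(x;μ) u')' = 1 with homogeneous Dirichlet conditions; the coefficient a(x;μ) = 1 + μx and all numerical choices are illustrative assumptions, not the setting of [15].

```python
def solve(mu, n=50):
    """Finite-difference solve of -(a(x;mu) u')' = 1 on (0,1), u(0)=u(1)=0,
    with the hypothetical coefficient a(x;mu) = 1 + mu*x (Thomas algorithm)."""
    h = 1.0 / (n + 1)
    a = [1.0 + mu * ((k + 0.5) * h) for k in range(n + 1)]  # interface values
    diag = [(a[j] + a[j + 1]) / h**2 for j in range(n)]
    off = [-a[j + 1] / h**2 for j in range(n - 1)]  # symmetric off-diagonal
    rhs = [1.0] * n
    for i in range(1, n):                      # forward elimination
        w = off[i - 1] / diag[i - 1]
        diag[i] -= w * off[i - 1]
        rhs[i] -= w * rhs[i - 1]
    u = [0.0] * n
    u[-1] = rhs[-1] / diag[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        u[i] = (rhs[i] - off[i] * u[i + 1]) / diag[i]
    return u

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def proj_residual(u, basis):
    """Residual of u after projection onto an orthonormal basis."""
    r = list(u)
    for b in basis:
        c = dot(r, b)
        r = [x - c * y for x, y in zip(r, b)]
    return r

def greedy(params, nmax=3):
    """Offline greedy stage: repeatedly add the snapshot worst
    approximated by the current reduced basis."""
    snaps = {mu: solve(mu) for mu in params}
    basis, chosen = [], []
    for _ in range(nmax):
        best_mu, best_err = None, -1.0
        for m, u in snaps.items():
            r = proj_residual(u, basis)
            e = dot(r, r) ** 0.5
            if e > best_err:
                best_mu, best_err = m, e
        if best_err < 1e-12:
            break
        r = proj_residual(snaps[best_mu], basis)
        nr = dot(r, r) ** 0.5
        basis.append([x / nr for x in r])
        chosen.append(best_mu)
    return basis, chosen
```

In practice the greedy step uses a cheap a posteriori error estimator instead of computing every snapshot, and the online stage solves only a small dense system in the reduced space; the stochastic extension studied in the cited work must in addition handle parameters that are random fields.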

In the context of parabolic homogenization, A. Anantharaman has pursued the study of boundary layers in time (close to the initial time t = 0) and in space (close to the domain boundaries), in collaboration with G. Allaire (CMAP) and E. Cancès. The idea is to add space, time and space-time boundary layer terms to the usual approximate solution (which is computed by solving the homogenized problem and the corrector problems), so that the difference between the exact solution and the approximate solution can be estimated, and more precisely controlled in interesting functional spaces. The main difficulty that has yet to be overcome is that the classical space boundary layers (usually defined in the stationary case) and the classical time boundary layers (usually defined in an infinite domain) are not compatible with one another. Nonetheless, progress has been made in the understanding of this issue through the use of Bloch waves.
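Schematically (with notation introduced here for illustration; the precise construction is that of the ongoing work), the enriched approximation discussed above takes the form:

```latex
u^\varepsilon(t,x) \;\approx\;
u^0(t,x) \;+\; \varepsilon\, u^1\!\big(t,x,\tfrac{x}{\varepsilon}\big)
\;+\; u^{\mathrm{bl},t} \;+\; u^{\mathrm{bl},x} \;+\; u^{\mathrm{bl},tx},
```

where $u^0$ solves the homogenized problem, $u^1$ is built from the correctors, and the boundary-layer terms correct the mismatch near $t = 0$, near the domain boundary, and near their intersection, respectively; the incompatibility mentioned above concerns the region where the time and space layers overlap.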