Section: New Results
Quantifying Uncertainty
Sensitivity analysis for West African monsoon
Participants : Anestis Antoniadis, Clémentine Prieur, Laurence Viry.
Geophysical context
The West African monsoon is the major atmospheric phenomenon driving the rainfall regime in Western Africa. It is therefore the main driver of water resources over the African continent from the equatorial zone to the sub-Saharan one, with an obvious major impact on agricultural activities and thus on the population itself. The causes of the interannual spatio-temporal variability of monsoon rainfall have not yet been univocally determined. A considerable body of evidence identifies spatio-temporal changes in the sea surface temperature (SST) within the Gulf of Guinea and in the Saharan and sub-Saharan albedo as major explanatory factors.
The aim of this study is to simulate the rainfall with a regional atmospheric model (RAM) and to analyze its sensitivity to the variability of these input parameters. A comparison of the RAM precipitation with several precipitation data sets shows that the RAM simulates the West African monsoon reasonably well.
Statistical methodology
As mentioned in the previous paragraph, our main goal is to perform a sensitivity analysis for the West African monsoon. Each simulation of the regional atmospheric model (RAM) is time consuming, so we first have to consider a simplified model. We deal here with spatio-temporal dynamics, for which efficient functional statistical tools have to be developed. In our context, both inputs (albedo, SST) and outputs (precipitation) are considered as time- and space-indexed stochastic processes. Conditionally on the space coordinates, we will perform a functional PCA (principal component analysis), that is, we will consider a Karhunen-Loève decomposition. The Karhunen-Loève representation of a stochastic process is based on the spectral decomposition of its covariance function. Such a representation requires solving an eigenvalue problem to determine the eigenfunctions and eigenvalues of the covariance function. When this problem cannot be solved analytically, the eigenfunctions are approximated numerically. Other orthogonal bases can be considered; orthogonality guarantees uniqueness of the decomposition. The regression between inputs and outputs can then be approximated by a linear functional regression, extending existing results in [50]. The spatial dependence observed in the data will be transferred to the coefficients of the decompositions. In some cases, additional spatial smoothing, together with local stationarity assumptions, will allow information from nearby locations to be aggregated.
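The discrete version of the Karhunen-Loève procedure described above can be sketched as follows. This is only an illustrative outline on synthetic curves (all data, sample sizes and grids below are invented for the example); in the study the curves would be, e.g., SST or albedo time series at a fixed spatial location.

```python
# Sketch of a discrete Karhunen-Loève (functional PCA) decomposition.
# Synthetic example: the curves and dimensions are invented, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
n_curves, n_times = 200, 100            # hypothetical sample size / time grid
t = np.linspace(0.0, 1.0, n_times)

# Synthetic realizations of a stochastic process X(t) with two random modes
X = (rng.standard_normal((n_curves, 1)) * np.sin(2 * np.pi * t)
     + rng.standard_normal((n_curves, 1)) * 0.5 * np.cos(2 * np.pi * t))

Xc = X - X.mean(axis=0)                 # center the process
C = (Xc.T @ Xc) / n_curves              # empirical covariance on the time grid

# Spectral decomposition of the covariance: eigenfunctions and eigenvalues
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]       # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Karhunen-Loève coefficients: projections of each curve on the eigenfunctions
k = 2                                   # truncation order
scores = Xc @ eigvecs[:, :k]

# Rank-k reconstruction of the curves from the truncated decomposition
X_hat = X.mean(axis=0) + scores @ eigvecs[:, :k].T
print("variance explained:", eigvals[:k].sum() / eigvals.sum())
```

A regression between inputs and outputs can then operate on the low-dimensional `scores` rather than on the full curves, which is the point of the decomposition.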
Distributed Interactive Engineering Toolbox
The study described above still requires substantial computing resources, and will be run in a grid computing environment that handles the scheduling of a very large number of computation requests and the links with data management, all of this as automatically as possible.
This work also involves partners from the INRIA project-team GRAAL for the computational approach, and from the Laboratory of Glaciology and Geophysical Environment (LGGE) for the use and interpretation of the regional atmospheric model (RAM).
Sensitivity analysis for forecasting ocean models
Participants : Éric Blayo, Maëlle Nodet, Clémentine Prieur, Alexandre Janon, Jean-Yves Tissot.
Scientific context
Forecasting ocean systems requires complex models, which sometimes need to be coupled, and which make use of data assimilation. The objective of this project is, for a given output of such a system, to identify the most influential parameters and to evaluate the effect of uncertainty in input parameters on the model output. Existing stochastic tools are not well suited to high-dimensional problems (in particular time-dependent ones), while deterministic tools are fully applicable but only provide limited information. The challenge is therefore to gather expertise, on the one hand in numerical approximation and control of partial differential equations, and on the other hand in stochastic methods for sensitivity analysis, in order to design innovative stochastic solutions for studying high-dimensional models and to propose new hybrid approaches combining stochastic and deterministic methods.
Preliminary results
A first task was to review the literature on both deterministic and stochastic methods for sensitivity analysis, in order to clarify the main advantages and drawbacks of each tool. This task was initiated by Jean-Yves Tissot during his internship (4 months during spring 2009). Jean-Yves also implemented various methods on a linearized one-dimensional Burgers model. On this very simple model, he could confirm (or, in some cases, contradict) the conclusions of the literature review. However, further implementations have to be carried out, first on this model, but also on more general models such as shallow-water models or on a much more complex system derived from the NEMO ocean model.
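Among the standard stochastic tools covered by such a review are variance-based (Sobol) sensitivity indices. As a minimal illustration of the kind of Monte Carlo ("pick-freeze") estimator involved, here is a sketch on the classical Ishigami test function; the Burgers model used in the internship is not reproduced here, and the sample sizes are arbitrary.

```python
# Minimal pick-freeze Monte Carlo estimator of first-order Sobol indices,
# illustrated on the Ishigami function (a standard sensitivity-analysis
# benchmark), not on the Burgers model of the internship.
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2
            + b * x[:, 2]**4 * np.sin(x[:, 0]))

rng = np.random.default_rng(1)
n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, size=(n, d))   # first independent sample
B = rng.uniform(-np.pi, np.pi, size=(n, d))   # second independent sample

yA = ishigami(A)
var_y = yA.var()

S = np.empty(d)
for i in range(d):
    Ci = B.copy()
    Ci[:, i] = A[:, i]                 # "freeze" input i, resample the others
    yCi = ishigami(Ci)
    # First-order index S_i = Cov(yA, yCi) / Var(y)
    S[i] = (np.mean(yA * yCi) - yA.mean() * yCi.mean()) / var_y

print("first-order Sobol indices:", S)
```

For this function the analytical first-order indices are roughly 0.31, 0.44 and 0, so the third input contributes to the variance only through interactions, which is exactly the kind of information these indices provide.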
Perspectives
Another approach we would like to focus on is model reduction. More precisely, the aim is to reduce the number of unknown variables (to be computed by the model) using a well-chosen basis. Instead of discretizing the model over a huge grid (with millions of points), the state vector of the model is projected onto the subspace spanned by this basis (of far smaller dimension). The choice of the basis is of course crucial and determines the success or failure of the reduced model. Various model reduction methods offer various choices of basis functions. A well-known method is called “proper orthogonal decomposition” or “principal component analysis”. More recent and sophisticated methods also exist and may be studied, depending on the needs raised by the theoretical study. Model reduction is a natural way to overcome the huge computational times induced by discretizations on fine grids. A PhD student, Alexandre Janon, is currently working in this direction, first on very simple one-dimensional shallow-water models. The generalization to models based on NEMO is an interesting but complex perspective, which should first be explored through implementations.
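The proper orthogonal decomposition mentioned above can be sketched in a few lines: a basis is extracted from snapshots of the full model state, and the state is then represented by a handful of coefficients in that basis. The snapshots below are synthetic (the grid size, snapshot count and state shapes are invented for the example); in practice they would come from runs of the full model.

```python
# Sketch of proper orthogonal decomposition (POD): build a reduced basis
# from model-state snapshots and project a state onto it.
# All snapshots here are synthetic stand-ins for full-model outputs.
import numpy as np

n_state, n_snap = 1000, 50             # hypothetical grid size / snapshot count
x = np.linspace(0.0, 1.0, n_state)

# Synthetic snapshot matrix: each column is one state of the "model"
snapshots = np.column_stack([
    np.sin(np.pi * x) * np.cos(0.1 * j)
    + 0.3 * np.sin(2 * np.pi * x) * np.sin(0.1 * j)
    for j in range(n_snap)
])

# POD basis = leading left singular vectors of the snapshot matrix
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 2                                  # reduced dimension
basis = U[:, :r]                       # (n_state, r) orthonormal POD modes

# Project a full state onto the reduced subspace and lift it back
state = snapshots[:, 0]
reduced = basis.T @ state              # r coefficients instead of n_state values
lifted = basis @ reduced
print("relative projection error:",
      np.linalg.norm(state - lifted) / np.linalg.norm(state))
```

The reduced model then evolves the `r` coefficients instead of the full state vector, which is where the computational saving comes from; the quality of the reduction hinges on how well the snapshot set samples the model's behavior.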
Propagation of uncertainties
Participants : François-Xavier Le Dimet, Victor Shutyaev.
Geophysical models basically suffer from two types of errors:

errors in the model itself, due to the approximation of physical processes and their subgrid parametrization, and also errors linked to the necessary numerical discretization;

errors in the observations, due to the measurements themselves and also to sampling. For instance, many remote sensing instruments observe only radiances, which are transformed into the state variables through complex processes such as the resolution of an inverse problem. This is, of course, a source of errors.
Estimating the propagation of errors is an important and costly (in terms of computing resources) task, for two reasons:

the quality of the forecast must be estimated;

the statistics of the errors have to be included in the analysis, so as to define an adequate norm, based on these statistics, on both the forecast and the observations.
In the variational framework, models, observations and statistics are linked in the optimality system, which can be considered as a “generalized” model containing all the available information. In [49], [45], [44], error covariances are estimated both from the second-order analysis and from the Hessian of the cost function. Numerical experiments have been carried out on a nonlinear 1D model; we expect to extend them to a semi-operational model in cooperation with ECMWF.
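The link between the Hessian of the cost function and the error covariances can be made concrete in the linear-Gaussian case, where the analysis-error covariance is exactly the inverse Hessian of the variational cost. The following toy sketch illustrates this identity only; the dimensions, matrices and observation operator are invented, and the cited works [49], [45], [44] treat the far harder nonlinear setting.

```python
# Toy illustration of the identity between the Hessian of a variational
# cost function and the analysis-error covariance, in the linear-Gaussian
# case. Dimensions and matrices below are invented for the example.
import numpy as np

rng = np.random.default_rng(3)
n, m = 4, 6                            # state and observation dimensions

B = np.eye(n) * 2.0                    # background-error covariance (assumed)
R = np.eye(m) * 0.5                    # observation-error covariance (assumed)
H = rng.standard_normal((m, n))        # linear observation operator (assumed)

# Cost: J(x) = 1/2 (x - xb)^T B^-1 (x - xb) + 1/2 (y - Hx)^T R^-1 (y - Hx)
# For linear H its Hessian is constant:
hessian = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H

# Analysis-error covariance = inverse Hessian of J
A = np.linalg.inv(hessian)
print("analysis-error variances:", np.diag(A))
```

Note that the analysis variances are never larger than the background variances, reflecting the information added by the observations; in the nonlinear case this picture only holds approximately, which is why the second-order analysis of the cited works is needed.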