## Section: New Results

### Assessments of models by means of experimental data and assimilation

#### Metamodeling corrected by observational data

Members: V. Mallet, J. Hammond

An air quality model at urban scale computes the air pollutant concentrations at street resolution, based on various emissions, meteorology, imported pollution and the city geometry. Because of the computational cost of such a model, we previously designed a metamodel using dimension reduction and statistical emulation, and then corrected this metamodel with observational data. New work was dedicated to the error modeling, for a more balanced integration of the observations. The work was also applied to air quality simulation over Paris using several months of data.
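The error model is not detailed here; as a minimal sketch, the correction can be thought of as an inverse-variance blend of the metamodel output with an observation (the function name, variances and concentration values below are illustrative, not the actual method):

```python
def correct_metamodel(prediction, observation, var_model, var_obs):
    """Blend a metamodel prediction with an observation, weighting each
    term by the inverse of its error variance."""
    w = var_obs / (var_model + var_obs)  # weight kept by the metamodel
    return w * prediction + (1.0 - w) * observation

# Where the metamodel error variance dominates, the corrected
# concentration moves close to the observation.
corrected = correct_metamodel(prediction=40.0, observation=30.0,
                              var_model=9.0, var_obs=1.0)
```

A balanced integration then amounts to choosing the two error variances consistently with the discrepancies actually observed.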

#### Metamodeling of a complete air quality simulation chain

Members: A. Lesieur, V. Mallet

*Coll.: Ruiwei Chen*

With the objective of uncertainty quantification, we worked on the generation of a metamodel for the simulation of urban air quality, using a complete simulation chain that includes dynamic traffic assignment, the computation of air pollutant emissions and the dispersion of the pollutants in the city. The traffic model and the dispersion model are computationally costly and operate in high dimension. We employed dimension reduction and coupled it with Kriging in order to build a metamodel for the complete simulation chain.
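As an illustration of this coupling, a minimal sketch (with hypothetical sizes, inputs and kernel parameters) that reduces the simulated outputs with a principal-component projection and emulates the reduced coordinate with simple Kriging:

```python
import numpy as np

def rbf_kernel(A, B, length=0.3):
    """Squared-exponential covariance between two sets of input points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length ** 2)

def krige(X_train, y_train, X_new, nugget=1e-6):
    """Simple Kriging predictor of the reduced output coordinates."""
    K = rbf_kernel(X_train, X_train) + nugget * np.eye(len(X_train))
    return rbf_kernel(X_new, X_train) @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(0)
X = rng.uniform(size=(12, 2))          # reduced inputs (e.g., traffic, wind)
Y = np.sin(X @ np.array([3.0, 1.0]))[:, None] * np.ones((1, 50))  # full outputs
Y_mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - Y_mean, full_matrices=False)
coords = (Y - Y_mean) @ Vt[:1].T       # first principal coordinate
Y_pred = Y_mean + krige(X, coords, X[:1]) @ Vt[:1]  # back to full dimension
```

The emulator thus never manipulates the full-dimensional fields directly: it only predicts a few principal coordinates, which are lifted back to the full output space.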

#### Artificial neural networks for the modeling of air pollution

Member: V. Mallet

Air quality simulations at national, continental or global scales are subject to large uncertainties, which are typically mitigated by data assimilation techniques. Another approach to improve the forecasts is to design an error model that learns from historical discrepancies between simulations and observations. Such a model was built using an artificial neural network trained on a large set of meteorological and geographical data. Further studies showed that the technique could successfully generate not only an error model (to improve pre-existing simulations), but also a complete model (without the need for pre-existing simulations) whose forecasts are more accurate than those of traditional models.
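The operational network is not described here; as a sketch of the idea, a one-hidden-layer network trained by full-batch gradient descent to predict a synthetic simulation error from a few meteorological features (architecture, features and target are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
features = rng.normal(size=(200, 3))   # e.g., temperature, wind, cloud cover
error = np.tanh(features @ np.array([1.0, -0.5, 0.2]))  # synthetic target

W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(features @ W1 + b1)            # hidden layer
    pred = (h @ W2 + b2).ravel()               # predicted error
    g = 2.0 * (pred - error) / len(error)      # gradient of the MSE loss
    gh = (g[:, None] @ W2.T) * (1.0 - h ** 2)  # backpropagate through tanh
    W2 -= lr * h.T @ g[:, None]; b2 -= lr * g.sum(keepdims=True)
    W1 -= lr * features.T @ gh;  b1 -= lr * gh.sum(axis=0)

mse = np.mean((pred - error) ** 2)             # training error
```

Once trained, such a network can be applied as a post-processing of a simulation (error model) or, fed with enough inputs, used directly as a forecast model.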

#### Uncertainty quantification in atmospheric dispersion of radionuclides

Member: V. Mallet

*Coll.: Irène Korsakissok*

In collaboration with IRSN (Institute of Radiation Protection and Nuclear Safety), we investigated the uncertainties of the atmospheric-dispersion forecasts that are used during an accidental release of radionuclides, such as in the Fukushima disaster. These forecasts are subject to considerable uncertainties, which originate from inaccurate weather forecasts, a poorly known source term and modeling shortcomings. In order to quantify the uncertainties, we designed a metamodel and calibrated the metamodel input distributions using Markov chain Monte Carlo.
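The calibration step can be sketched with a Metropolis-Hastings loop; the emulator below is a trivial stand-in for the actual dispersion metamodel, and the observation, error level and proposal scale are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
metamodel = lambda theta: 2.0 * theta   # stand-in for the dispersion emulator
obs, sigma = 6.0, 0.5                   # one observation and its error std

def log_post(theta):
    # Gaussian likelihood of the observation, flat prior on theta.
    return -0.5 * ((metamodel(theta) - obs) / sigma) ** 2

theta, chain = 0.0, []
for _ in range(5000):
    proposal = theta + rng.normal(scale=0.5)
    if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
        theta = proposal                # Metropolis acceptance
    chain.append(theta)
posterior_mean = np.mean(chain[1000:])  # discard burn-in
```

The cheap metamodel is what makes the many likelihood evaluations of the chain affordable.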

#### Meta-modeling for urban noise mapping

Members: A. Lesieur, V. Mallet

*Coll.: Pierre Aumond, Arnaud Can*

Noise computing software can require several hours to produce a map over an urban center for a given set of input data. This computational cost makes such models unsuitable for applications like uncertainty quantification or data assimilation, where thousands of simulations, or more, can be required. One solution is to replace the physical model with a meta-model which is very fast and yet faithfully reproduces the results of the physical model. The strategy is first to reduce the dimension of both the inputs and outputs of the physical model, which leads to a reduced model. This reduced model is then replaced by a statistical emulator, trained with calls to the reduced model for a set of chosen inputs. The emulator relies on interpolation between the training output values.
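The interpolation step can be sketched with inverse-distance weights over the training outputs (the actual emulator may differ; the reduced inputs and noise levels below are illustrative):

```python
import numpy as np

def idw_emulator(X_train, Y_train, x, power=2.0):
    """Emulate the reduced model by interpolating between the training
    outputs with inverse-distance weights."""
    d = np.linalg.norm(X_train - x, axis=1)
    if np.any(d < 1e-12):              # exact training input: return it
        return Y_train[np.argmin(d)]
    w = 1.0 / d ** power
    return (w[:, None] * Y_train).sum(axis=0) / w.sum()

# Reduced inputs (e.g., traffic rate, mean speed) and reduced noise levels.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
Y = np.array([[60.0], [70.0], [65.0]])  # dB, illustrative
level = idw_emulator(X, Y, np.array([0.5, 0.0]))
```

One call to such an emulator costs microseconds, against hours for the full noise computation.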

#### Data assimilation for urban noise maps generated with a meta-model

Members: A. Lesieur, V. Mallet

*Coll.: Pierre Aumond, Arnaud Can*

In an urban area, it is increasingly common to have access to both a simulated noise map and a sensor network. We developed a data assimilation algorithm that combines data from a noise map simulator and a network of acoustic sensors. One-hour noise maps are generated with a meta-model fed with hourly traffic and weather data. The data assimilation algorithm merges the simulated map with the sound level measurements into an improved noise map. The performance of this method relies on the accuracy of the meta-model, the selection of the input parameters and the model of the error covariance, which describes how the errors of the simulated sound levels are correlated in space. The performance of the data assimilation is assessed with a leave-one-out cross-validation method.
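The merging step can be sketched as a best linear unbiased estimate, with an exponential spatial covariance model for the simulation errors (all positions, sound levels and covariance parameters below are illustrative):

```python
import numpy as np

def blue_update(x_b, H, y, B, R):
    """Analysis x_a = x_b + K (y - H x_b), K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_b + K @ (y - H @ x_b)

coords = np.array([0.0, 1.0, 2.0, 3.0])  # receiver positions (km)
B = 4.0 * np.exp(-np.abs(coords[:, None] - coords[None, :]) / 2.0)
H = np.array([[1.0, 0.0, 0.0, 0.0]])     # one sensor, at the first receiver
R = np.array([[1.0]])                    # measurement error variance
x_b = np.array([60.0, 62.0, 64.0, 66.0]) # simulated sound levels (dB)
y = np.array([58.0])                     # measured sound level (dB)
x_a = blue_update(x_b, H, y, B, R)
```

Thanks to the spatial correlations in `B`, the correction applied at the sensor spreads to the neighboring receivers, which is what the error covariance model controls.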

#### Uncertainty quantification in wildland fire propagation

Members: F. Allaire, V. Mallet

*Coll.: Jean-Baptiste Filippi*

We worked further on the Monte Carlo simulation of wildland fires. We calibrated the input distributions that represent the uncertainties in the inputs of our fire spread predictions, using the observations of the final contours for a number of fire cases. We used a new metric, based on the Wasserstein distance, to measure the dissimilarity between two burned surfaces. We designed a metamodel and carried out the calibration of the model input distributions using Markov chain Monte Carlo.
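For small equal-weight point clouds, the 2-Wasserstein distance reduces to an optimal assignment; a brute-force sketch (the point sets standing for burned surfaces are illustrative, and the actual metric may be defined differently):

```python
import itertools
import numpy as np

def wasserstein2(A, B):
    """Exact 2-Wasserstein distance between two small, equal-weight point
    clouds, by brute-force search over point assignments."""
    best = np.inf
    for perm in itertools.permutations(range(len(B))):
        cost = np.mean([np.sum((A[i] - B[j]) ** 2)
                        for i, j in enumerate(perm)])
        best = min(best, cost)
    return np.sqrt(best)

# Translating a toy "burned surface" by 1 km yields a distance of 1 km.
A = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
B = A + np.array([1.0, 0.0])
d = wasserstein2(A, B)
```

Unlike overlap-based scores, such a distance still discriminates between two contours that do not intersect at all.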

#### A non-intrusive reduced order data assimilation method applied to the monitoring of urban flows

Member: J. Hammond

*Coll.: R. Chakir*

In [13], we investigate a variational data assimilation method to rapidly estimate the urban pollutant concentration around an area of interest, using measurement data and CFD-based models in a non-intrusive and computationally efficient manner. In the case studies presented here, we used a sample of solutions from a dispersion model, with varying meteorological conditions and pollution emissions, to build a Reduced Basis approximation space, and combined it with concentration observations. The method makes it possible to correct for unmodeled physics while significantly reducing the online computational time.
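The approach can be sketched in two steps: an offline reduced basis extracted from model snapshots, and an online least-squares fit of the reduced coefficients to sparse observations (the toy plume model, grid and sensor locations below are illustrative, not the case studies of [13]):

```python
import numpy as np

rng = np.random.default_rng(3)
grid = np.linspace(0.0, 1.0, 100)
# Snapshots of a dispersion model under varying conditions (toy plumes).
snapshots = np.stack([np.exp(-((grid - c) ** 2) / 0.02)
                      for c in rng.uniform(0.2, 0.8, size=20)])
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
basis = Vt[:5]                              # 5 reduced-basis functions

truth = np.exp(-((grid - 0.5) ** 2) / 0.02) # unseen state to reconstruct
sensors = [10, 30, 50, 70, 90]              # observation locations
y = truth[sensors]                          # concentration observations
coeff, *_ = np.linalg.lstsq(basis[:, sensors].T, y, rcond=None)
estimate = coeff @ basis                    # assimilated field on the grid
```

The online step only solves a tiny least-squares problem, which is why the method is fast and never requires modifying the CFD solver itself.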