Section: New Results
Monitoring network design
Design of a monitoring network over France in case of a radiological accidental release
Participants: Marc Bocquet, Olivier Saunier [École des Ponts ParisTech/IRSN].
Launched in March 2006, the network design activity aims at developing new methodologies and applying them to the optimal design of monitoring networks for air pollution. Our efforts are devoted, on the one hand, to the design of atmospheric accidental surveillance networks and, on the other hand, to the design of air quality (for instance ozone) monitoring networks. This activity has been supported by the IRSN and by the Région Île-de-France (R2DS research network). It has also generated discussions with INERIS, ADEME and AIRPARIF.
The Institute of Radiation Protection and Nuclear Safety (IRSN, France) is planning the setup of an automatic nuclear aerosol monitoring network over the French territory (the Descartes network), which complements the Teleray network. Each station will automatically sample aerosols in the air and provide activity concentration measurements for several radionuclides. The network should help monitor the French and neighbouring countries' fleets of nuclear power plants, and help evaluate the impact of a radiological incident involving one of these plants.
After the completion of the first phase (2006 and 2007), the second stage of the study started in March 2008. The resolution has been increased from 0.36° × 0.36° to 0.25° × 0.25°, which doubles the number of potential sites, and hence the complexity of the optimisation problem. Meteorological fields have been generated with the MM5 model. New considerations have been taken into account: the inclusion of foreign nuclear power plants, the validation of the optimal network against cost functions that had not been considered so far, and the use of population density as a weighting factor. Moreover, because the Descartes network might be deployed sequentially, we have also considered sub-optimal network design algorithms.
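As an illustration of what such a sequential, sub-optimal design procedure can look like, here is a minimal Python sketch: stations are added one at a time so as to greedily decrease a population-weighted cost evaluated over a database of simulated accidents. All sizes, fields and the cost function itself are hypothetical placeholders, not the actual Descartes data or design criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n_sites candidate grid cells, n_scen simulated accident
# scenarios, each giving a (synthetic) activity field over the candidate cells.
n_sites, n_scen = 200, 50
activity = rng.lognormal(mean=0.0, sigma=1.0, size=(n_scen, n_sites))
population = rng.uniform(0.1, 1.0, size=n_sites)   # relative population weights

def cost(network):
    """Population-weighted activity 'missed' by the network, averaged over
    scenarios: the weighted peak of each scenario minus the best weighted
    value measured at a network station (a crude stand-in for the cost
    functions actually used in the project)."""
    weighted = activity * population                # weight each cell by its population
    peak = weighted.max(axis=1)                     # weighted peak per scenario
    seen = weighted[:, network].max(axis=1)         # best weighted value measured
    return float(np.mean(peak - seen))

def greedy_design(n_stations):
    """Sequentially add the station that most decreases the cost."""
    network, remaining = [], set(range(n_sites))
    for _ in range(n_stations):
        best = min(remaining, key=lambda s: cost(network + [s]))
        network.append(best)
        remaining.remove(best)
    return network

print(greedy_design(10))
```

A greedy procedure of this kind is sub-optimal by construction, but it has the practical advantage that every intermediate network is itself a usable design, which matches a sequential deployment.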
The computational time, which was already an important issue during the first stage, has become a decisive one because of the resolution increase. In order to accelerate the optimisations, we have developed new reduction techniques for network design optimisation. They are based on a reduction of the database of accidents using ideas derived from principal component analysis. These methods proved very efficient on test cases, and they were successfully applied to the new questions raised in phase 2 of the Descartes project.
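The following is a minimal sketch of the kind of database reduction alluded to above, assuming the simulated accidents are stored as rows of a matrix sampled at the candidate sites (all names and dimensions are illustrative): the centred database is decomposed by SVD and only the leading principal components are retained as a small set of synthetic scenarios on which the design cost can then be evaluated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic database: each row is one simulated accident (dispersion field
# sampled at the candidate monitoring sites).
n_accidents, n_sites = 2000, 400
X = rng.lognormal(size=(n_accidents, n_sites))

# Principal component analysis via SVD of the centred database.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Keep enough components to explain, say, 95% of the variance.
explained = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(explained, 0.95)) + 1

# Reduced database: k "synthetic accidents" (principal directions scaled by
# their singular values) that stand in for the full set during optimisation.
X_reduced = (s[:k, None] * Vt[:k]) / np.sqrt(n_accidents - 1)
print(f"reduced {n_accidents} scenarios to {k} components")
```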
Reduction of an air quality monitoring network over France
Participants: Lin Wu, Marc Bocquet.
Ozone is an important air pollutant, and observational networks are built to estimate its concentration at ground level. Because of the heterogeneous nature of the ozone field, the way ozone is observed matters for the estimation of the concentrations. The evaluation of the network is thus of both theoretical and practical interest. In this study, we assess the efficiency of the BDQA (Base de Données sur la Qualité de l'Air) network by investigating a network reduction problem: we examine how well a subset of this network can represent the full network. The performance of a subnetwork is taken to be the root mean square error of the spatial estimation of ozone concentrations over the whole network based on the observations from that subnetwork. Spatial interpolations are carried out for the ozone estimation, taking into account the spatial correlations. Several interpolation methods, namely ordinary kriging, simple kriging about the means and kriging with the means as external drifts, are compared in order to obtain a reliable estimation. It is found that the statistical information about the means significantly improves the kriging results. We employ a translated exponential model for the spatial correlations, and we show that it is necessary to consider the correlation model as hourly-varying but daily stationary. The network reduction problem is solved using a simulated annealing algorithm. We obtain considerable improvements for subnetworks of various sizes. In particular, we have shown that keeping only half of the stations makes it possible to reconstruct the hourly values at the discarded stations with an average error smaller than the observational error (see Figure 6).
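A minimal sketch of such a reduction procedure is given below, with synthetic data standing in for the BDQA observations and a plain exponential covariance standing in for the translated, hourly-varying correlation model of the study: the cost of a subnetwork is the kriging reconstruction RMSE over all stations, and simulated annealing swaps stations in and out of the subnetwork.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for the ozone network: station coordinates (km) and one
# hourly field of observations.
n_stations = 60
coords = rng.uniform(0.0, 1000.0, size=(n_stations, 2))
obs = 80.0 + 20.0 * np.sin(coords[:, 0] / 200.0) + 5.0 * rng.standard_normal(n_stations)

def exp_cov(d, sill=400.0, rng_km=300.0):
    """Simple exponential covariance model (illustrative parameters)."""
    return sill * np.exp(-d / rng_km)

dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
C = exp_cov(dist)

def kriging_rmse(subset):
    """RMSE of simple-kriging estimates at all stations from the subset."""
    subset = np.asarray(subset)
    mean = obs[subset].mean()
    K = C[np.ix_(subset, subset)] + 1e-6 * np.eye(len(subset))
    weights = np.linalg.solve(K, C[np.ix_(subset, np.arange(n_stations))])
    est = mean + weights.T @ (obs[subset] - mean)
    return float(np.sqrt(np.mean((est - obs) ** 2)))

def anneal(n_keep, n_iter=2000, T0=1.0):
    """Simulated annealing over station subsets of fixed size."""
    keep = list(rng.choice(n_stations, size=n_keep, replace=False))
    out = [s for s in range(n_stations) if s not in keep]
    cost = kriging_rmse(keep)
    for it in range(n_iter):
        T = T0 * (1.0 - it / n_iter) + 1e-3          # linear cooling schedule
        i, j = rng.integers(n_keep), rng.integers(len(out))
        new_keep = keep.copy()
        new_keep[i] = out[j]                          # propose a station swap
        new_cost = kriging_rmse(new_keep)
        if new_cost < cost or rng.random() < np.exp((cost - new_cost) / T):
            out[j], keep = keep[i], new_keep          # accept the swap
            cost = new_cost
    return keep, cost

subset, rmse = anneal(n_keep=n_stations // 2)
print(f"kept {len(subset)} stations, reconstruction RMSE = {rmse:.2f}")
```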
Targeting of observations in case of a nuclear accidental release
Participants: Rachid Abida, Marc Bocquet.
In the event of an accidental atmospheric release from a nuclear power plant, high-resolution and accurate information on the spread of the radioactive plume around the accident site is a key asset, acutely needed by decision makers in order to evaluate early countermeasures and consequences. Deploying mobile measuring devices is therefore an adequate monitoring strategy, which makes it possible to follow the real-time evolution of the radioactive plume. The measurements collected by the mobile network can be assimilated jointly with the data from the fixed monitoring network, so as to improve the knowledge of the state of the radioactive cloud. The targeting design consists in seeking, at a given time, the optimal spatial locations of the mobile stations according to some design criterion based on all previously available information. To illustrate how much a targeting strategy can improve the available information on the state of the radioactive plume, we considered a hypothetical accidental release occurring at the Bugey power plant and a sequential data assimilation scheme based on inverse modeling to reconstruct the accident. This assimilation scheme was coupled with a targeting strategy. The existing surveillance network is used and realistic observational errors are assumed. The targeting scheme leads to a better estimation of the source term as well as of the activity concentrations over the domain. The mobile stations tend to be deployed along the plume contours, where the activity concentration gradients are large. The information carried by the targeted observations is shown to be very significant compared to the information content of the fixed observations. A simple test on the impact of model error from meteorology shows that the targeting strategy remains very useful in a more uncertain context.
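As an illustration of a targeting criterion, here is a minimal sketch in a simplified linear-Gaussian setting (all dimensions and covariances are synthetic, and the sequential inverse-modeling part of the actual study is not reproduced): mobile measurement locations are chosen greedily so as to maximise the reduction of the total analysis error variance, one rank-one Kalman update at a time.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic linear-Gaussian setting: state = activity concentrations on a
# small grid, B = background (forecast) error covariance, candidate mobile
# measurements observe single grid cells with error variance r.
n_grid, r = 100, 0.25
A = rng.standard_normal((n_grid, n_grid))
B = A @ A.T / n_grid + 0.1 * np.eye(n_grid)      # some SPD background covariance

def posterior_cov(P, cell, r):
    """Rank-one Kalman update of the error covariance after observing one
    grid cell with error variance r."""
    h = P[:, cell]                                # P H^T for a single-cell observation
    return P - np.outer(h, h) / (P[cell, cell] + r)

def target(n_mobile):
    """Greedily place mobile stations where they most reduce total variance."""
    P, chosen = B.copy(), []
    for _ in range(n_mobile):
        # Trace reduction obtained by observing cell c: ||P[:, c]||^2 / (P[c, c] + r)
        gains = np.array([np.dot(P[:, c], P[:, c]) / (P[c, c] + r) for c in range(n_grid)])
        gains[chosen] = -np.inf                   # do not reuse a location
        best = int(np.argmax(gains))
        chosen.append(best)
        P = posterior_cov(P, best, r)
    return chosen, float(np.trace(P))

cells, trace_after = target(5)
print("targeted cells:", cells, "remaining total variance:", round(trace_after, 2))
```

In such a setting, cells with large and widely correlated background variance are selected first, which is consistent with mobile stations ending up where the plume uncertainty (and hence the concentration gradient) is largest.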