Team ASPI

Section: Scientific Foundations

Particle approximations of Feynman–Kac distributions

The following abstract point of view, developed and extensively studied by Pierre Del Moral [37], [35], has proved extremely fruitful in providing a very general framework for the design and analysis of numerical approximation schemes, based on systems of branching and/or interacting particles, for nonlinear dynamical systems with values in the space of probability distributions, associated with Feynman–Kac flows. Feynman–Kac distributions are characterized by a Markov chain and by nonnegative potential functions that play the role of selection functions. They arise naturally whenever importance sampling is used: this applies for instance to the simulation of rare events, to filtering, i.e. to state estimation in hidden Markov models (HMM), etc.

To solve numerically the recurrent equation satisfied by the Feynman–Kac distributions, and in view of the basic assumption that it is easy to simulate r.v.'s according to the Markov transition kernel, i.e. to mimic the evolution of the Markov chain, and that it is easy to evaluate the potential functions, the original idea behind particle methods is to look for an approximation in the form of a (possibly weighted) empirical probability distribution associated with a system of particles. The approximation is completely characterized by the set of particle positions and weights, and the algorithm is completely described by the mechanism that builds this set recursively. In practice, in the simplest version of the algorithm, known as the bootstrap algorithm, particles move independently of each other according to the Markov transition kernel (mutation step), and are then weighted by the potential functions and redistributed by sampling with replacement, with probabilities proportional to their weights (selection step).
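
As a concrete illustration, here is a minimal sketch of the bootstrap algorithm in Python/NumPy, run on a toy linear-Gaussian hidden Markov model; the model, parameter values, and function names are assumptions chosen for illustration, not taken from the text.

    import numpy as np

    def bootstrap_filter(y, n_particles, rng=None):
        """Bootstrap particle filter on an assumed toy hidden Markov model.

        Hidden state:  x_k = 0.9 * x_{k-1} + v_k,  v_k ~ N(0, 1)  (Markov kernel)
        Observation:   y_k = x_k + w_k,            w_k ~ N(0, 1)
        The potential (selection) function is the likelihood g(x) = N(y_k; x, 1).
        """
        rng = np.random.default_rng() if rng is None else rng
        x = rng.standard_normal(n_particles)   # initial particle positions
        means = np.empty(len(y))
        for k, y_k in enumerate(y):
            # Mutation step: propagate each particle with the Markov transition kernel.
            x = 0.9 * x + rng.standard_normal(n_particles)
            # Weighting: evaluate the potential functions (log scale for stability).
            logw = -0.5 * (y_k - x) ** 2
            w = np.exp(logw - logw.max())
            w /= w.sum()
            # Weighted empirical estimate of E[x_k | y_1, ..., y_k].
            means[k] = np.sum(w * x)
            # Selection step: redistribute by sampling with replacement.
            x = rng.choice(x, size=n_particles, replace=True, p=w)
        return means

At each time step, the weighted empirical distribution of the particles before the selection step is the particle approximation of the Feynman–Kac (filtering) distribution.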

The algorithm yields a numerical approximation of the Feynman–Kac distribution as the weighted empirical probability distribution associated with a system of particles, and many asymptotic results have been proved as the number N of particles (sample size) goes to infinity, using techniques coming from applied probability (interacting particle systems, empirical processes [73]), see e.g. the survey article [37] or the recent textbook [35], and references therein:

- convergence in L^p,
- convergence as empirical processes indexed by classes of functions,
- uniform convergence in time, see also [8], [9],
- central limit theorem, see also [58], [63],
- propagation of chaos,
- large deviations principle and moderate deviations principle [38], etc.
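
These asymptotic rates can also be probed numerically. The rough sketch below (again on an assumed toy model, with all parameter choices illustrative) measures the spread of the bootstrap estimate over independent runs and checks that it decays roughly like N^{-1/2}, as the central limit theorem suggests.

    import numpy as np

    def final_filter_mean(y, n, rng):
        """One bootstrap run on the toy model; returns the last filter mean."""
        x = rng.standard_normal(n)
        for y_k in y:
            x = 0.9 * x + rng.standard_normal(n)
            w = np.exp(-0.5 * (y_k - x) ** 2)
            w /= w.sum()
            mean = np.sum(w * x)
            x = rng.choice(x, size=n, replace=True, p=w)
        return mean

    rng = np.random.default_rng(0)
    y = rng.standard_normal(20)   # a fixed synthetic observation record
    for n in (100, 400, 1600):
        runs = [final_filter_mean(y, n, rng) for _ in range(200)]
        # The spread across independent runs should roughly halve each time
        # n is multiplied by 4, i.e. decay like n ** (-1/2).
        print(n, np.std(runs))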

Beyond the simplest bootstrap version of the algorithm, many algorithmic variations have been proposed [40], [31], and are commonly used in practice. For instance, (i) in the selection step, sampling with replacement can be replaced with other redistribution schemes so as to reduce the variance (an issue that has also been addressed in genetic algorithms), and (ii) to reduce the variance and to save computational effort, it is often a good idea not to redistribute the particles at each time step, but only when the weights are too far from equidistribution. Even with interacting Monte Carlo methods, it can happen that some particles generated in one time step have a negligible weight: if this happens for too many particles in the sample, then computing power has been wasted, and it has been suggested to use importance sampling again in the mutation step, i.e. to let particles explore the state space under the action of an alternative, deliberately "wrong" mutation kernel, and to weight the particles according to their likelihood for the true model, so as to compensate for the wrong modeling. More specifically, using an arbitrary importance decomposition results in the following general algorithm, known as the sampling importance resampling (SIR) algorithm, in which particles move according to an importance (proposal) kernel, receive weights that combine the potential function with the likelihood ratio between the original Markov kernel and the proposal kernel, and are then redistributed according to these weights.
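
The sketch below combines these variations in one step of the algorithm: mutation under an importance (proposal) kernel, weights corrected by the likelihood ratio with respect to the original kernel, redistribution triggered only when the effective sample size drops, and systematic resampling as one common example of a lower-variance redistribution scheme. The toy model, the particular proposal, and the 50% effective-sample-size threshold are illustrative assumptions.

    import numpy as np

    def systematic_resample(w, rng):
        """A common lower-variance alternative to sampling with replacement."""
        n = len(w)
        positions = (rng.random() + np.arange(n)) / n
        idx = np.searchsorted(np.cumsum(w), positions)
        return np.minimum(idx, n - 1)   # guard against floating-point round-off

    def sir_step(x, logw, y_k, rng, ess_threshold=0.5):
        """One SIR step on the assumed toy model.

        Original kernel:  x_k ~ N(0.9 * x_{k-1}, 1)
        Proposal kernel:  same variance, mean nudged halfway toward y_k
        Potential:        g(x) = N(y_k; x, 1)
        """
        n = len(x)
        m_true = 0.9 * x
        m_prop = m_true + 0.5 * (y_k - m_true)
        x_new = m_prop + rng.standard_normal(n)
        # Importance weight: potential times the likelihood ratio between the
        # original kernel and the proposal, compensating for the "wrong" mutation.
        logw = (logw
                - 0.5 * (y_k - x_new) ** 2
                - 0.5 * (x_new - m_true) ** 2
                + 0.5 * (x_new - m_prop) ** 2)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Redistribute only when the weights are too far from equidistribution,
        # as measured by the effective sample size 1 / sum(w_i ** 2).
        if 1.0 / np.sum(w ** 2) < ess_threshold * n:
            x_new = x_new[systematic_resample(w, rng)]
            logw = np.zeros(n)   # uniform weights after redistribution
        else:
            logw = np.log(w)
        return x_new, logw

Taking the proposal equal to the original kernel and an effective-sample-size threshold of (essentially) one recovers the bootstrap behaviour of redistributing at every step.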

Many of the early convergence results proved in the literature assume that particles are redistributed (i) using sampling with replacement and (ii) at each time step, and that they move according to the original Markov transition kernel. Systematically studying the impact of the proposed algorithmic variants on these convergence results is still the subject of active research.

