## Section: New Results

### Auxiliary Variable Method for MCMC Algorithms in High Dimension

**Participants:** Emilie Chouzenoux and Jean-Christophe Pesquet (in collaboration with Y. Marnissi, SAFRAN TECH and A. Benazza-Benhayia, SUP'COM, COSIM, Tunis)

When the parameter space is high dimensional, the performance of stochastic sampling algorithms is very sensitive to existing dependencies between parameters. For instance, this problem arises when one aims to sample from a high dimensional Gaussian distribution whose covariance matrix does not have a simple structure. One then often resorts to sampling algorithms based on a perturbation-optimization technique, which requires minimizing a cost function with an iterative algorithm. This makes the sampling process time consuming, especially when used within a Gibbs sampler. Another challenge is the design of Metropolis-Hastings proposals that exploit information about the local geometry of the target density in order to speed up convergence and improve mixing properties in the parameter space, while remaining computationally affordable. Both contexts involve two heterogeneous sources of dependencies, stemming either from the prior or from the likelihood, in the sense that the related covariance matrices cannot be diagonalized in the same basis.

In paper [34], we are interested in inverse problems where either the data fidelity term or the prior distribution is Gaussian or derived from a hierarchical Gaussian model. We propose to add auxiliary variables to the model in order to dissociate the two sources of dependencies. In the new augmented space, only one source of correlation remains directly related to the target parameters, while the other sources of correlation are captured by the auxiliary variables. Experiments conducted on two image restoration problems demonstrate the good performance of the proposed strategy.
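To illustrate the dissociation idea on a toy case (this is a standard data-augmentation construction, not the exact algorithm of [34]; the variable names `Q1`, `Q2`, `alpha`, `C` are choices made for this sketch), consider sampling a zero-mean Gaussian with precision `Q1 + Q2`, where `Q1` and `Q2` play the roles of the prior and likelihood precisions and are not diagonalizable in the same basis. Introducing an auxiliary variable `u` with joint density proportional to `exp(-0.5 x'Q1x - 0.5 x'Q2x - 0.5 (u-x)'C(u-x))`, where `C = alpha*I - Q2` is positive definite for `alpha` above the largest eigenvalue of `Q2`, leaves the target as the marginal in `x` while the conditional of `x` given `u` has precision `Q1 + alpha*I`: only one correlation source remains attached to the target parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Two precision terms that are not diagonalizable in the same basis,
# mimicking heterogeneous correlations from the prior and the likelihood.
A = rng.standard_normal((d, d))
Q1 = A @ A.T + d * np.eye(d)          # "prior" precision (well conditioned)
H = rng.standard_normal((d, d))
Q2 = H.T @ H                          # "likelihood" precision
Sigma = np.linalg.inv(Q1 + Q2)        # target covariance: x ~ N(0, Sigma)

# Augmentation: alpha > lambda_max(Q2) makes C = alpha*I - Q2 positive
# definite, so the Gaussian in (u - x) integrates to a constant and the
# x-marginal of the joint is exactly the target.
alpha = 1.1 * np.linalg.eigvalsh(Q2).max()
C = alpha * np.eye(d) - Q2

P = Q1 + alpha * np.eye(d)            # precision of x | u: Q1 plus a scaled identity
L_P = np.linalg.cholesky(np.linalg.inv(P))
L_C = np.linalg.cholesky(np.linalg.inv(C))

def gibbs(n_iter, burn_in=1000):
    """Two-block Gibbs sampler on the augmented space (x, u)."""
    x = np.zeros(d)
    samples = []
    for t in range(n_iter):
        u = x + L_C @ rng.standard_normal(d)   # u | x ~ N(x, C^{-1})
        mean_x = np.linalg.solve(P, C @ u)     # x | u ~ N(P^{-1} C u, P^{-1})
        x = mean_x + L_P @ rng.standard_normal(d)
        if t >= burn_in:
            samples.append(x.copy())
    return np.array(samples)

S = gibbs(30000)
emp_cov = np.cov(S.T)
# The empirical covariance of the chain should approach Sigma, even though
# the x-update never factorizes the full precision Q1 + Q2.
print(np.max(np.abs(emp_cov - Sigma)))
```

In the higher-dimensional setting of the paper, the payoff of this decoupling is that the conditional precision of the target variable keeps a fixed, easily handled structure (here `Q1` plus a scaled identity), so each Gibbs update avoids re-factorizing a full, unstructured covariance.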