
Section: Overall Objectives

Sparsity in Imaging

Sparse ${\ell }^{1}$ regularization.

The huge interest in these ideas started mostly with the introduction of convex methods serving as proxies for sparse regularization. The best known is the ${\ell }^{1}$ norm, introduced independently in imaging by Donoho and co-workers under the name “Basis Pursuit”  [103] and in statistics by Tibshirani  [170] under the name “Lasso”. Interest resurged about ten years ago with the introduction of so-called “compressed sensing” acquisition techniques  [85], which use randomized forward operators and ${\ell }^{1}$-type reconstruction.
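A minimal sketch of this ${\ell }^{1}$ approach can be given in a few lines. The example below is illustrative and not taken from the cited papers: the dimensions, the random Gaussian operator, and the use of ISTA (iterative soft thresholding) as the solver are all our own assumptions, chosen to mimic a compressed-sensing setup.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1: shrink each entry toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, y, lam, n_iter=500):
    """Minimize 0.5 * ||A x - y||^2 + lam * ||x||_1 by iterative soft thresholding."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the quadratic term
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of the data-fidelity term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Compressed-sensing style demo with a randomized forward operator
rng = np.random.default_rng(0)
n, p, k = 60, 200, 5                       # measurements, dimension, sparsity
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                             # noiseless measurements
x_hat = ista(A, y, lam=0.01)
```

Each iteration alternates a gradient step on the data-fidelity term with the soft-thresholding step, which is what promotes sparsity of the recovered vector.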

Regularization over measure spaces.

However, the theoretical analysis of sparse reconstruction with real-life acquisition operators (such as those found in seismic imaging, neuro-imaging, astrophysical imaging, etc.) is still largely an open problem. A recent research direction, triggered by a paper of Candès and Fernandez-Granda  [87], is to study directly the infinite-dimensional problem of reconstructing sparse measures (i.e. sums of Dirac masses) using the total variation of measures (not to be mistaken for the total variation of 2-D functions). Several works  [86], [115], [112] have used this framework to provide theoretical performance guarantees, essentially by studying how the distance between neighboring spikes impacts noise stability.
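Concretely, the infinite-dimensional problem can be written as follows (our transcription of the standard formulation, with notation not taken from the cited papers):

$$\min_{m \in \mathcal{M}(\mathcal{X})} \; \frac{1}{2} \, \| \Phi m - y \|^{2} + \lambda \, |m|(\mathcal{X}),$$

where $\mathcal{M}(\mathcal{X})$ is the space of Radon measures on the domain $\mathcal{X}$, $\Phi$ is a linear measurement operator, and $|m|(\mathcal{X})$ is the total variation norm of the measure. For a discrete measure $m = \sum_i a_i \delta_{x_i}$, one has $|m|(\mathcal{X}) = \sum_i |a_i|$, so this norm is the natural analogue of the ${\ell }^{1}$ norm over measure spaces.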

Figure 3. Two examples of applications of total variation regularization of functions. Left: image segmentation into homogeneous color regions. Right: image zooming (increasing the number of pixels while keeping the edges sharp).

Low complexity regularization and partial smoothness.

In image processing, one of the most popular methods is total variation regularization  [163], [79]. It favors low-complexity images that are piecewise constant; see Figure 3 for examples of image processing problems solved this way. Besides applications in image processing, sparsity-related ideas have also had a deep impact in statistics  [170] and machine learning  [43]. As a typical example, for applications to recommendation systems, it makes sense to consider sparsity of the singular values of matrices, which can be relaxed using the so-called nuclear norm (a.k.a. trace norm)  [44]. The underlying methodology is to use low-complexity regularization models, which turns out to be equivalent to using partly-smooth regularization functionals  [136], [172] that enforce the solution to belong to a low-dimensional manifold.
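The recommendation-system example can be sketched as follows. This is a hypothetical illustration, not the method of any cited reference: the key ingredient is that the proximal operator of the nuclear norm soft-thresholds the singular values, so proximal gradient descent (with dimensions, step size, and threshold chosen here for illustration) drives the iterates toward a low-rank completion of the partially observed matrix.

```python
import numpy as np

def svt(Z, tau):
    """Proximal operator of tau * nuclear norm: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

# Matrix-completion sketch: observe roughly half the entries of a rank-3 matrix
rng = np.random.default_rng(1)
n, r = 40, 3
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # low-rank ground truth
mask = rng.random((n, n)) < 0.5                                # observed entries
X = np.zeros((n, n))
for _ in range(300):                       # proximal gradient descent, step size 1
    X = svt(X - mask * (X - M), tau=0.1)   # gradient step on observed entries, then SVT
```

Note how `svt` plays exactly the role that entrywise soft thresholding plays for the ${\ell }^{1}$ norm: it is the same low-complexity mechanism, applied to the spectrum of a matrix instead of the entries of a vector.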