Section: Research Program
Nonsmooth optimization
Keywords: optimization, numerical algorithms, convexity, Lagrangian relaxation, combinatorial optimization.
Here we deal with the minimization of a function $f$ (say over the whole space $\mathbb{R}^{n}$) whose derivatives are discontinuous. A typical situation is when $f$ comes from dualization, if the primal problem is not strictly convex – for example a large-scale linear program – or even nonconvex – for example a combinatorial optimization problem. Also important is the case of spectral functions, where $f(x)=F(\lambda(A(x)))$, $A(x)$ being a symmetric matrix depending on $x$ and $\lambda$ its spectrum.
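To make the nonsmoothness concrete, here is a minimal sketch (an illustration of ours, not code from the project) of a spectral function: taking $F=\max$, $f(x)=\lambda_{\max}(A(x))$ with an affine map $A(x)=A_0+x_1A_1+x_2A_2$ is convex but nondifferentiable wherever the largest eigenvalue is multiple; the chain rule applied to a unit leading eigenvector $v$ yields a subgradient with components $g_i=v^{\top}A_i v$. The matrices below are hypothetical.

```python
import numpy as np

# Hypothetical data: f(x) = lambda_max(A(x)) with affine A(x) = A0 + x1*A1 + x2*A2.
# lambda_max is convex in A but nonsmooth where the top eigenvalue is multiple.
A0 = np.diag([1.0, -1.0])
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])
A2 = np.array([[1.0, 0.0], [0.0, 0.0]])

def f_and_subgrad(x):
    A = A0 + x[0] * A1 + x[1] * A2
    w, V = np.linalg.eigh(A)     # eigenvalues in ascending order
    v = V[:, -1]                 # unit eigenvector for lambda_max
    fval = w[-1]
    # Chain rule: a subgradient of x -> lambda_max(A(x)) has components v^T A_i v
    g = np.array([v @ A1 @ v, v @ A2 @ v])
    return fval, g

f0, g0 = f_and_subgrad(np.zeros(2))  # at x = 0: f = 1, subgradient (0, 1)
```

The point of such an oracle is that it returns one function value and one subgradient per call, which is exactly the information a bundle method consumes.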
For these types of problems, we are mainly interested in developing efficient solution algorithms. Our basic tool is bundling (Chap. XV of [11]) and we act along two directions:

To explore application areas where nonsmooth optimization algorithms can be applied, possibly after some tailoring. A rich field of such applications is combinatorial optimization, with all forms of relaxation [12].

To explore the possibility of designing more sophisticated algorithms. This implies an appropriate generalization of second derivatives when the first derivative does not exist, and we use advanced tools of nonsmooth analysis, for example [14].
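As a toy illustration of the cutting-plane machinery underlying bundle methods (a sketch of ours under simplifying assumptions, not the project's code), each oracle call at $x_k$ adds the cut $f(x_k)+g_k(x-x_k)$, and the next iterate minimizes the piecewise-linear model formed by all cuts. Here the test function $f(x)=|x|$ and the grid search replacing the model minimization are hypothetical simplifications; actual bundle methods stabilize this step with a proximal term (Chap. XV of [11]).

```python
import numpy as np

# Minimal cutting-plane (Kelley) iteration, the building block of bundle methods.
# The oracle returns f(x) and one subgradient; f(x) = |x| is a toy test function.
def oracle(x):
    return abs(x), (1.0 if x >= 0 else -1.0)

grid = np.linspace(-1.0, 1.0, 2001)   # feasible box [-1, 1], model minimized by grid search
cuts = []                             # triples (x_k, f_k, g_k) defining f_k + g_k*(x - x_k)
x = 0.8
for _ in range(20):
    fx, gx = oracle(x)
    cuts.append((x, fx, gx))
    # Piecewise-linear model: pointwise max over all cuts; next iterate minimizes it.
    model = np.max([fk + gk * (grid - xk) for xk, fk, gk in cuts], axis=0)
    x = grid[np.argmin(model)]

# x ends up at the minimizer 0 of |x|.
```

Within a few iterations the model already reproduces $|x|$ exactly and the iterate settles at 0; on harder problems the unstabilized cutting-plane scheme oscillates, which is precisely what the bundle stabilization cures.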