During the last 20 years, imaging sciences, including inverse problems, segmentation, and classification, have undergone two major revolutions: (i) sparsity and proximal algorithms and (ii) deep learning and stochastic optimization. This course illustrates these major advances in the context of imaging problems that can be formulated as the minimization of an objective function, and highlights the evolution of these objective functions jointly with advances in optimization.
Since 2003, convex optimization has been the main thrust behind significant advances in signal processing, image processing, and machine learning. The increasingly complex variational formulations encountered in these areas, which may involve a sum of several, possibly non-smooth, convex terms, together with the large size of the problems at hand, make standard optimization methods such as subgradient descent computationally intractable. Since their introduction in the signal processing arena, splitting techniques have emerged as a central tool to circumvent these roadblocks: they break the problem down into individual components that can be activated separately in the solution algorithm. In the past decade, numerous convex optimization algorithms based on splitting have been proposed or rediscovered in an attempt to deal efficiently with such problems. We will provide the basic building blocks of the major proximal algorithmic strategies and their recent extensions to nonconvex and stochastic optimization.

Behind non-smooth functions lies the concept of sparsity, which is central to contributions in inverse problems and compressed sensing. This concept will be described, along with the objective functions that rely on it, ranging from the Mumford-Shah model to sparse SVMs.

Ten years after the start of the proximal revolution, deep learning began to provide a new framework for solving imaging problems, going from agnostic techniques to models combining deep learning with standard regularized formulations. The main objective functions encountered, as well as the associated algorithmic strategies, will be discussed.
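To make the splitting idea concrete, here is a minimal sketch (not part of the course material) of forward-backward splitting, in its classical ISTA instance, applied to an l1-regularized least-squares problem; the names forward_backward, soft_threshold, and the data A, y, lam are illustrative placeholders chosen for this example.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximity operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, y, lam, n_iter=200):
    """Minimize 0.5*||A x - y||^2 + lam*||x||_1 by forward-backward splitting (ISTA)."""
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient of the smooth term
    gamma = 1.0 / L                 # step size in (0, 2/L) guarantees convergence
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)    # forward (explicit gradient) step on the smooth term
        x = soft_threshold(x - gamma * grad, gamma * lam)  # backward (proximal) step on the non-smooth term
    return x

# Toy usage: recover a sparse signal from random Gaussian measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[3, 27, 64]] = [1.5, -2.0, 0.8]
y = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = forward_backward(A, y, lam=0.1)
```

The two terms of the objective are activated separately, exactly in the spirit of splitting: the smooth data-fidelity term through its gradient, the non-smooth sparsity term through its proximity operator.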
1/ Introduction
2/ Optimization: basics
3/ Subdifferential and proximity operator (a worked definition is sketched after this outline)
4/ First-order schemes (gradient descent, proximal point algorithm, forward-backward splitting, Peaceman-Rachford splitting, Douglas-Rachford splitting): weak and linear convergence
5/ Conjugate, duality, proximal primal-dual algorithms
6/ Unfolded algorithms
7/ Acceleration, non-convex optimization, stochastic optimization
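As a pointer for item 3 above, the proximity operator, on which all of the splitting schemes of item 4 are built, admits the following standard definition, together with its classical closed form for the l1 norm (the soft-thresholding used in the sketch above):

```latex
% Proximity operator of a proper, lower semicontinuous, convex function f
\operatorname{prox}_{\gamma f}(x) \;=\; \arg\min_{u}\; f(u) + \tfrac{1}{2\gamma}\|u - x\|^2

% Closed form for f = \lambda \|\cdot\|_1 (componentwise soft-thresholding)
\bigl(\operatorname{prox}_{\gamma\lambda\|\cdot\|_1}(x)\bigr)_i
  \;=\; \operatorname{sign}(x_i)\,\max\bigl\{|x_i| - \gamma\lambda,\, 0\bigr\}
```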