It has been known for a long time that Hamilton-Jacobi-Bellman (HJB) equations preserve convexity: if the terminal condition is convex, the solution stays convex at all times. Equivalently, log-concavity is preserved along the heat equation: if one starts with a log-concave density, the solution stays log-concave at all times. Both facts are a direct consequence of the Prékopa-Leindler inequality. In this talk, I will illustrate how a careful second-order analysis of coupling by reflection along the characteristics of the HJB equation reveals the existence of weaker notions of convexity that propagate backward along HJB. More precisely, by introducing the notion of integrated convexity profile, we are able to construct families of functions that fail to be convex but are still invariant under the action of the HJB equation. In the second part of the talk, I will illustrate some applications of these invariance results to the exponential convergence of learning algorithms for entropic optimal transport.
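As background for the opening claim, here is a sketch of the standard equivalence between convexity preservation for HJB and log-concavity preservation for the heat equation via the Hopf-Cole transform; the abstract does not fix a normalization, so the scaling below is one common choice and not necessarily the one used in the talk. If $u$ solves the backward HJB equation
$$\partial_t u + \tfrac{1}{2}\Delta u - \tfrac{1}{2}|\nabla u|^2 = 0, \qquad u(T,\cdot) = g,$$
then $\rho := e^{-u}$ satisfies
$$\partial_t \rho + \tfrac{1}{2}\Delta \rho = -e^{-u}\Big(\partial_t u + \tfrac{1}{2}\Delta u - \tfrac{1}{2}|\nabla u|^2\Big) = 0,$$
so $\rho(t,\cdot)$ is obtained from the terminal datum by the heat semigroup, i.e. by convolution with a Gaussian kernel:
$$\rho(t,\cdot) = e^{-g} * \mathcal{N}\big(0,\,(T-t)\,\mathrm{Id}\big).$$
Convexity of $u(t,\cdot)$ is exactly log-concavity of $\rho(t,\cdot)$, and the Prékopa-Leindler inequality implies that the convolution of two log-concave functions is log-concave; since the Gaussian kernel is log-concave, a convex terminal condition $g$ yields a convex $u(t,\cdot)$ for all $t \le T$.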
49Q22; 49L12