
Documents 49Q22: 7 results

It has been known for a long time that Hamilton-Jacobi-Bellman (HJB) equations preserve convexity: if the terminal condition is convex, the solution stays convex at all times. Equivalently, log-concavity is preserved along the heat equation: if one starts with a log-concave density, the solution stays log-concave at all times. Both facts are a direct consequence of the Prékopa-Leindler inequality. In this talk, I will illustrate how a careful second-order analysis of coupling by reflection on the characteristics of the HJB equation reveals the existence of weaker notions of convexity that propagate backward along the HJB equation. More precisely, by introducing the notion of integrated convexity profile, we are able to construct families of functions that fail to be convex but are still invariant under the action of the HJB equation. In the second part of the talk I will illustrate some applications of these invariance results to the exponential convergence of learning algorithms for entropic optimal transport.
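The log-concavity statement above can be checked numerically (this is an illustration of the classical fact, not part of the talk's results): the heat semigroup is convolution with a Gaussian kernel, and log-concavity of the convolved density is equivalent to the second differences of its logarithm being nonpositive. The grid, the diffusion time `t`, and the initial density are illustrative choices.

```python
import numpy as np

# Log-concave initial density: exp(-|x|) on a grid.
dx = 0.01
x = np.arange(-10, 10 + dx, dx)
f = np.exp(-np.abs(x))

# Heat flow at time t = convolution with the Gaussian kernel G_t.
t = 0.5
k = np.arange(-5, 5 + dx, dx)
heat_kernel = np.exp(-k**2 / (4 * t)) / np.sqrt(4 * np.pi * t)
u = np.convolve(f, heat_kernel, mode="same") * dx  # solution at time t

# Log-concavity <=> (log u)'' <= 0; check discrete second differences
# away from the boundary, where the truncated convolution is inaccurate.
interior = slice(len(x) // 4, 3 * len(x) // 4)
max_second_diff = np.diff(np.log(u), 2)[interior].max()  # should be <= 0
```

The kink of `exp(-|x|)` at the origin is smoothed instantly, and the log-density stays concave, consistent with the Prékopa-Leindler inequality.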

49Q22 ; 49L12

We consider the assignment (or bipartite matching) problem between $n$ source points and $n$ target points on the real line, where the assignment cost is a concave power of the distance, i.e. $|x-y|^p$ for $0<p<1$. It is known that, differently from the convex case ($p>1$), where the solution is rigid, i.e. it does not depend on $p$, in the concave case it may vary with $p$ and exhibit interesting long-range connections, making it more appropriate to model realistic situations, e.g. in economics and biology. In the random version of the problem, the points are samples of i.i.d. random variables, and one is interested in typical properties as the sample size $n$ grows. Barthe and Bordenave in 2013 proved asymptotic upper and lower bounds in the range $0<p<1/2$, which they conjectured to be sharp. Bobkov and Ledoux, in 2020, using optimal transport and Fourier-analytic tools, determined explicit upper bounds for the average assignment cost in the full range $0<p<1$, naturally leading to the conjecture that a "phase transition" occurs at $p=1/2$. We settle both conjectures affirmatively. The novel mathematical tool that we develop, which may be of independent interest, is a formulation of the Kantorovich problem based on Young integration theory, where the difference between two measures is replaced by the weak derivative of a function with finite $q$-variation.
Joint work with M. Goldman (arXiv:2305.09234).
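The rigidity contrast between $p>1$ and $p<1$ can be seen already with two points by brute force (a toy illustration, not the paper's method; the point configuration is made up for the example): for a convex cost the monotone (sorted) matching is optimal, while for a concave cost a crossing, long-range matching can be strictly cheaper.

```python
import itertools

def optimal_assignment(xs, ys, p):
    """Brute-force the minimum-cost assignment for cost |x - y|**p.

    Returns (cost, perm), where perm[i] is the index of the target
    matched to the i-th smallest source point.
    """
    xs, ys = sorted(xs), sorted(ys)
    return min(
        (sum(abs(x - ys[j]) ** p for x, j in zip(xs, perm)), perm)
        for perm in itertools.permutations(range(len(ys)))
    )

xs, ys = [0.0, 1.0], [0.9, 2.0]
cost_convex, match_convex = optimal_assignment(xs, ys, 2.0)    # p > 1
cost_concave, match_concave = optimal_assignment(xs, ys, 0.5)  # p < 1
# p = 2: the monotone matching (0, 1) wins, cost 0.9^2 + 1^2 = 1.81.
# p = 1/2: the crossing matching (1, 0) wins, cost 2^0.5 + 0.1^0.5.
```

For $p=2$ the crossing matching would cost $4.01$, but for $p=1/2$ it costs about $1.73$ versus about $1.95$ for the sorted one, so the optimal matching genuinely depends on $p$ in the concave regime.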

49Q22 ; 60D05 ; 60L99

We discuss the natural Lagrangian and Eulerian formulations of multi-agent deterministic optimal control problems, analyzing their relations with a novel Kantorovich formulation. We exhibit some equivalence results among the various representations and compare the respective value functions, by combining techniques and ideas from optimal transport, control theory, Young measures and evolution equations in Banach spaces. We further exploit the connections between Lagrangian and Eulerian descriptions to derive consistency results as the number of particles/agents tends to infinity. (In collaboration with Giulia Cavagnari, Stefano Lisini and Carlo Orrieri)

49N80 ; 49Q22


Path constrained unbalanced optimal transport - Charon, Nicolas (speaker) | CIRM H

Multi angle

We will present a variation of the unbalanced optimal transport model and Wasserstein Fisher-Rao metric on positive measures, in which one imposes additional affine integral equality constraints. This is motivated by multiple examples from mathematics and applied mathematics that naturally involve comparing and interpolating between two measures in particular subspaces, or in which one enforces some constraints on the interpolating path itself. Building from the dynamic formulation of the Wasserstein Fisher-Rao metric, we introduce a class of constrained problems where the interpolating measure at each time must satisfy a given stationary or time-dependent constraint in measure space. We then derive general conditions under which the existence of minimizing paths can be guaranteed, and examine some of the properties of the resulting models and the metrics that are induced on measures. We will further hint at the potential of this approach in various specific situations, such as the comparison of measures with prescribed moments, unbalanced optimal transport under global mass evolution or obstacle constraints, and emphasize some connections with the construction of Riemannian metrics on the space of all convex shapes in a Euclidean space. We shall conclude with a few remaining unsolved/open questions.

49Q22

Optimal transport, a mathematical theory which developed out of a problem raised by Gaspard Monge in the 18th century and of the reformulation that Leonid Kantorovich gave of it in the 20th century in connection with linear programming, is now a very lively branch of mathematics at the intersection of analysis, PDEs, probability, optimization and many applications, ranging from fluid mechanics to economics, from differential geometry to data sciences. In this short course we will give a very basic introduction to this field. The first lecture (2h) will be mainly devoted to the problem itself: given two distributions of mass, find the optimal displacement transforming the first one into the second (studying existence of such an optimal solution and its main properties). The second one (2h) will be devoted to the distance on mass distributions (probability measures) induced by the optimal cost, looking at topological questions (what is the induced topology?) as well as metric ones (which curves of measures are Lipschitz continuous for such a distance? what can we say about their speed, and about geodesic curves?) in connection with very natural PDEs such as the continuity equation deriving from mass conservation.
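A classical one-dimensional fact that typically appears in such an introduction (stated here as background, not taken from the abstract): on the real line, with cost $|x-y|$, the optimal coupling between two empirical measures of equal size matches sorted points in order, so the Wasserstein-1 distance reduces to sorting.

```python
def wasserstein1_empirical(xs, ys):
    # On the real line, the optimal coupling for the cost |x - y| pairs
    # the i-th smallest source with the i-th smallest target (quantile
    # coupling), so W1 between two n-point empirical measures is the
    # average gap between the sorted samples.
    assert len(xs) == len(ys)
    return sum(abs(x - y) for x, y in zip(sorted(xs), sorted(ys))) / len(xs)

# Shifting every point by 1 gives distance exactly 1.
d = wasserstein1_empirical([0.0, 1.0, 2.0], [1.0, 2.0, 3.0])
```

This quantile-coupling formula is what makes the one-dimensional theory so explicit, in contrast with the genuinely higher-dimensional Monge-Kantorovich problem discussed in the course.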

49J45 ; 49Q22 ; 35-XX

Optimal transport, a mathematical theory which developed out of a problem raised by Gaspard Monge in the 18th century and of the reformulation that Leonid Kantorovich gave of it in the 20th century in connection with linear programming, is now a very lively branch of mathematics at the intersection of analysis, PDEs, probability, optimization and many applications, ranging from fluid mechanics to economics, from differential geometry to data sciences. In this short course we will give a very basic introduction to this field. The first lecture (2h) will be mainly devoted to the problem itself: given two distributions of mass, find the optimal displacement transforming the first one into the second (studying existence of such an optimal solution and its main properties). The second one (2h) will be devoted to the distance on mass distributions (probability measures) induced by the optimal cost, looking at topological questions (what is the induced topology?) as well as metric ones (which curves of measures are Lipschitz continuous for such a distance? what can we say about their speed, and about geodesic curves?) in connection with very natural PDEs such as the continuity equation deriving from mass conservation.

49J45 ; 49Q22 ; 35-XX

The Bayesian approach to inference is based on a coherent probabilistic framework that naturally leads to principled uncertainty quantification and prediction. Via posterior distributions, Bayesian nonparametric models make inference on parameters belonging to infinite-dimensional spaces, such as the space of probability distributions. The development of Bayesian nonparametrics has been triggered by the Dirichlet process, a nonparametric prior that allows one to learn the law of the observations through closed-form expressions. Still, its learning mechanism is often too simplistic and many generalizations have been proposed to increase its flexibility, a popular one being the class of normalized completely random measures. Here we investigate a simple yet fundamental matter: will a different prior actually guarantee a different learning outcome? To this end, we develop a new distance between completely random measures based on optimal transport, which provides an original framework for quantifying the similarity between posterior distributions (merging of opinions). Our findings provide neat and interpretable insights on the impact of popular Bayesian nonparametric priors, avoiding the usual restrictive assumptions on the data-generating process. This is joint work with Hugo Lavenant.
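As background on the Dirichlet process mentioned above, here is a sketch of Sethuraman's stick-breaking construction of its random weights (standard textbook material, not the talk's contribution; the concentration parameter `alpha`, the truncation level and the seed are illustrative choices):

```python
import random

def stick_breaking_weights(alpha, n_atoms, seed=0):
    # Sethuraman's construction: w_k = v_k * prod_{j<k} (1 - v_j),
    # with v_k ~ Beta(1, alpha) i.i.d. Truncated at n_atoms terms;
    # the weights sum to 1 only in the infinite limit.
    rng = random.Random(seed)
    weights, stick = [], 1.0
    for _ in range(n_atoms):
        v = rng.betavariate(1.0, alpha)   # fraction broken off the stick
        weights.append(stick * v)
        stick *= 1.0 - v                  # remaining stick length
    return weights

w = stick_breaking_weights(alpha=2.0, n_atoms=200)
```

Pairing these weights with i.i.d. atoms from a base measure yields a draw from the Dirichlet process; larger `alpha` spreads the mass over more atoms, which is the kind of prior behavior the optimal-transport distance in the talk is designed to quantify.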

60G55 ; 60G57 ; 49Q22 ; 62C10
