
Documents by Korba, Anna: 3 results

Sampling is a fundamental task in Machine Learning. For instance, in Bayesian Machine Learning, one has to sample from the posterior distribution over the parameters of a learning model, whose density is known up to a normalizing constant. In other settings such as generative modelling, one has to sample from a distribution from which some samples are available (e.g. images). The task of sampling can be seen as an optimization problem over the space of probability measures. The mathematical theory providing the tools and concepts for optimization over the space of probability measures is the theory of optimal transport. The topic of this course will be the connection between optimization and sampling; more precisely, how to solve sampling problems using optimization ideas. The goal of the first part of the course will be to present two important concepts from optimal transport: Wasserstein gradient flows and geodesic convexity. We will introduce them by analogy with their Euclidean counterparts, which are well known in optimization. The goal of the second part will be to show how these concepts, along with standard optimization techniques, make it possible to design, improve and analyze various sampling algorithms. In particular, we will focus on several interacting particle schemes that achieve state-of-the-art performance in machine learning.
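The abstract itself contains no code; as a purely illustrative sketch of the two ideas it names, the following self-contained Python example samples a toy 2-D Gaussian target, known only up to its normalizing constant, with the unadjusted Langevin algorithm (which can be read as a time-discretization of the Wasserstein gradient flow of the KL divergence) and with Stein variational gradient descent, a standard interacting particle scheme. The target, step sizes, and function names here are our own assumptions for illustration, not material from the course.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy target: log p(x) = -0.5 * x^T S^{-1} x + const (constant unknown).
    cov = np.array([[1.0, 0.8], [0.8, 1.0]])
    S_inv = np.linalg.inv(cov)

    def grad_log_p(x):
        # Works on a batch of shape (n, 2); the normalizing constant drops out.
        return -x @ S_inv

    def ula(n_particles=500, n_steps=2000, step=1e-2):
        # Unadjusted Langevin Algorithm: gradient step on log p plus Gaussian
        # noise, i.e. a discretization of the Wasserstein gradient flow of KL.
        x = rng.standard_normal((n_particles, 2))
        for _ in range(n_steps):
            noise = rng.standard_normal(x.shape)
            x = x + step * grad_log_p(x) + np.sqrt(2.0 * step) * noise
        return x

    def svgd(n_particles=200, n_steps=500, step=0.1, h=1.0):
        # Stein Variational Gradient Descent: particles are driven towards
        # high density and repel each other through an RBF kernel.
        x = rng.standard_normal((n_particles, 2))
        for _ in range(n_steps):
            diffs = x[:, None, :] - x[None, :, :]            # (n, n, 2): x_i - x_j
            k = np.exp(-np.sum(diffs**2, -1) / (2 * h * h))  # kernel matrix
            repulsion = (diffs * (k / (h * h))[..., None]).sum(axis=1)
            phi = (k @ grad_log_p(x) + repulsion) / n_particles
            x = x + step * phi
        return x

    if __name__ == "__main__":
        for name, s in [("ULA", ula()), ("SVGD", svgd())]:
            print(name, "sample covariance:\n", np.cov(s.T))

Both functions return particle clouds whose empirical covariance should approach the target covariance [[1, 0.8], [0.8, 1]]; the gradient never needs the normalizing constant, which is precisely the Bayesian setting described in the abstract.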