Deep generative models parametrize very flexible families of distributions that can fit complicated datasets of images or text. Once trained, these models provide independent samples from complex high-dimensional distributions at negligible cost. On the other hand, sampling exactly from a target distribution, such as a Boltzmann distribution or a Bayesian posterior, is typically challenging: because of dimensionality, multi-modality, ill-conditioning, or a combination of these. In this talk, I will review recent works that try to enhance traditional inference and sampling algorithms with learning. In particular, I will present flowMC, an adaptive MCMC sampler using normalizing flows, along with first applications and remaining challenges.
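To make the core idea concrete, the following is a minimal sketch (not the flowMC API) of the building block such samplers rely on: an independence Metropolis-Hastings step whose proposal is a learned density that can both sample and evaluate its own log-probability. Here a broad Gaussian stands in for the trained normalizing flow, and the bimodal target is an illustrative stand-in for a Boltzmann density or posterior; all names are hypothetical.

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_target(x):
    # Toy bimodal target: a stand-in for a Boltzmann density or posterior.
    return np.logaddexp(
        multivariate_normal.logpdf(x, mean=[-3.0, 0.0]),
        multivariate_normal.logpdf(x, mean=[3.0, 0.0]),
    )

def flow_mh_step(x, log_p, proposal, rng):
    """One independence Metropolis-Hastings step with a learned proposal.

    `proposal` plays the role of the normalizing flow: it can draw samples
    and evaluate its own log-density, which is all the accept/reject
    step below requires.
    """
    x_new = proposal.rvs(random_state=rng)
    # Acceptance ratio for an independence proposal:
    #   min(1, p(x') q(x) / (p(x) q(x')))
    log_alpha = (log_p(x_new) + proposal.logpdf(x)
                 - log_p(x) - proposal.logpdf(x_new))
    if np.log(rng.uniform()) < log_alpha:
        return x_new, True
    return x, False

rng = np.random.default_rng(0)
# Stand-in "flow": a broad Gaussian covering both modes. In a flow-assisted
# MCMC scheme, this density would be a normalizing flow trained adaptively
# on the chain's past samples, so that global jumps between modes are
# proposed with high acceptance probability.
proposal = multivariate_normal(mean=[0.0, 0.0], cov=np.eye(2) * 9.0)

x = np.zeros(2)
accepted = 0
for _ in range(5000):
    x, ok = flow_mh_step(x, log_target, proposal, rng)
    accepted += ok
print(f"acceptance rate: {accepted / 5000:.2f}")
```

Because the proposal is independent of the current state, a well-trained flow lets the chain hop between distant modes in a single step, which is exactly where local samplers struggle.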
MSC: 68T99 ; 82B80 ; 62F15