
Documents by Kon Kam King, Guillaume: 6 results

Interest in artificial intelligence (AI) has grown considerably in recent years, and AI has been applied successfully to societal problems. Big Data, data collection and analysis, and statistics are being brought to bear on improving tomorrow's society: Big Data in public health, criminal justice, airport security, climate change, the protection of endangered species, and so on.

These current and future challenges are the focus of Kerrie Mengersen, an Australian statistician in residence for six months at Cirm-Luminy (holder of the Chaire Jean-Morlet), working alongside Pierre Pudlo, a mathematician at Aix-Marseille Université.

The Chaire Jean-Morlet and the Cirm are taking advantage of the scientific wealth of this research residency to offer a lecture for high-school and university students, covering the various problems in which artificial intelligence and big data play a considerable role.

68Txx ; 62-07

Ill-posed linear inverse problems that combine knowledge of the forward measurement model with prior models arise frequently in various applications, from computational photography to medical imaging. Recent research has focused on solving these problems with score-based generative models (SGMs) that produce perceptually plausible images, especially in inpainting problems. In this study, we exploit the particular structure of the prior defined in the SGM to formulate recovery in a Bayesian framework as a Feynman–Kac model adapted from the forward diffusion model used to construct score-based diffusion. To solve this Feynman–Kac problem, we propose the use of Sequential Monte Carlo methods. The proposed algorithm, MCGdiff, is shown to be theoretically grounded and we provide numerical simulations showing that it outperforms competing baselines when dealing with ill-posed inverse problems.
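
To make the construction concrete, here is a small illustrative sketch (not the authors' MCGdiff code): Sequential Monte Carlo run along the reverse diffusion of a toy Gaussian prior, with particles reweighted by a pseudo-likelihood of a linear observation and resampled when the effective sample size collapses. The prior, noise schedule, and potential are all invented for the example; with a standard normal prior the score of every diffused marginal is available in closed form, which is what keeps the sketch self-contained and runnable.

```python
# Illustrative toy: SMC over a reverse diffusion, guided by a linear Gaussian observation.
import numpy as np

rng = np.random.default_rng(0)

# Toy ill-posed linear inverse problem: observe only the first coordinate of a 2-D signal.
d, m = 2, 1
A = np.array([[1.0, 0.0]])             # rank-deficient forward operator
sigma_y = 0.1                          # observation noise std
x_true = np.array([0.7, -1.2])
y = A @ x_true + sigma_y * rng.standard_normal(m)

# Variance-preserving diffusion schedule.
T = 200
betas = np.linspace(1e-4, 0.05, T)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)

def score(x_t):
    # Score of the diffused prior: with x0 ~ N(0, I) the marginal at every t is still
    # N(0, I), so grad log p_t(x) = -x. This closed form stands in for a trained SGM.
    return -x_t

def log_pot(X, t):
    # Pseudo-likelihood potential: denoise x_t crudely to an x0 estimate and compare
    # A x0_hat with y, inflating the variance by the remaining diffusion noise.
    x0_hat = X / np.sqrt(alpha_bar[t])
    resid = (A @ x0_hat.T).T - y
    var_t = sigma_y**2 + (1.0 - alpha_bar[t]) / alpha_bar[t]
    return -0.5 * np.sum(resid**2, axis=1) / var_t

N = 2000                                # number of particles
X = rng.standard_normal((N, d))         # start from the terminal Gaussian
logw = np.zeros(N)
prev = log_pot(X, T - 1)

for t in reversed(range(T)):
    # One reverse (ancestral) diffusion step using the closed-form score.
    mean = (X + betas[t] * score(X)) / np.sqrt(alphas[t])
    X = mean + np.sqrt(betas[t]) * rng.standard_normal((N, d))

    cur = log_pot(X, max(t - 1, 0))
    logw += cur - prev                  # incremental weight; the product telescopes
    prev = cur

    # Resample when the effective sample size drops below N/2.
    w = np.exp(logw - logw.max()); w /= w.sum()
    if 1.0 / np.sum(w**2) < N / 2:
        idx = rng.choice(N, size=N, p=w)
        X, prev, logw = X[idx], prev[idx], np.zeros(N)

w = np.exp(logw - logw.max()); w /= w.sum()
print("posterior mean estimate:", w @ X)  # first coordinate is pulled toward y
```

The incremental-weight / resampling structure is exactly the Feynman–Kac viewpoint mentioned in the abstract; only the score and the potential are toy stand-ins for the learned quantities used in the actual work.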

62F15 ; 65C05 ; 65C60

The Bayesian approach to inference is based on a coherent probabilistic framework that naturally leads to principled uncertainty quantification and prediction. Via posterior distributions, Bayesian nonparametric models make inference on parameters belonging to infinite-dimensional spaces, such as the space of probability distributions. The development of Bayesian nonparametrics has been triggered by the Dirichlet process, a nonparametric prior that allows one to learn the law of the observations through closed-form expressions. Still, its learning mechanism is often too simplistic and many generalizations have been proposed to increase its flexibility, a popular one being the class of normalized completely random measures. Here we investigate a simple yet fundamental matter: will a different prior actually guarantee a different learning outcome? To this end, we develop a new distance between completely random measures based on optimal transport, which provides an original framework for quantifying the similarity between posterior distributions (merging of opinions). Our findings provide neat and interpretable insights on the impact of popular Bayesian nonparametric priors, avoiding the usual restrictive assumptions on the data-generating process. This is joint work with Hugo Lavenant.
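
As a concrete reminder of the closed-form learning mechanism of the Dirichlet process mentioned above (not of the optimal-transport construction itself), the following sketch draws truncated stick-breaking realisations from a DP posterior. The data, concentration parameter, and truncation level are arbitrary choices made for illustration.

```python
# Illustrative sketch: closed-form Dirichlet process posterior via stick-breaking truncation.
import numpy as np

rng = np.random.default_rng(1)

alpha = 2.0                              # concentration parameter
data = np.array([3.1, 2.8, 3.3, 2.9])    # observations
n = len(data)

def sample_posterior_dp(n_atoms=500):
    # If G ~ DP(alpha, G0) and x_1..x_n | G ~ G, then G | data ~ DP(alpha + n, H) with
    # H = (alpha*G0 + sum_i delta_{x_i}) / (alpha + n). Atom locations below come from H:
    # with probability alpha/(alpha+n) a fresh draw from G0 = N(0, 1), otherwise a data point.
    v = rng.beta(1.0, alpha + n, size=n_atoms)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))   # stick-breaking weights
    from_prior = rng.random(n_atoms) < alpha / (alpha + n)
    atoms = np.where(from_prior,
                     rng.standard_normal(n_atoms),
                     rng.choice(data, size=n_atoms))
    return atoms, w / w.sum()            # renormalise the truncated weights

# A simple functional of the posterior: the random mean of G.
post_means = []
for _ in range(200):
    atoms, w = sample_posterior_dp()
    post_means.append(np.dot(w, atoms))

# Shrinkage between the prior mean 0 and the data mean (~3.0):
# the expectation is (alpha*0 + n*mean(data)) / (alpha + n) ~ 2.0.
print("E[mean of G | data] ~", np.mean(post_means))
```

For general (normalized) completely random measures no such simple conjugate update is available, which is one motivation for the distance between priors developed in the talk.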

60G55 ; 60G57 ; 49Q22 ; 62C10

This course will provide a general introduction to SMC algorithms, from basic particle filters and their uses in state-space (hidden Markov) modelling in various areas, to more advanced algorithms such as SMC samplers, which may be used to sample from one, or several target distributions. The course will cover “a bit of everything”: theory (using Feynman-Kac models as a general framework), methodology (how to construct better algorithms in practice), implementation (examples in Python based on the library particles will be showcased), and applications.
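
As a taste of the "basic particle filter" part of the course, here is a minimal bootstrap particle filter written in plain NumPy (deliberately not using the particles library showcased in the course, so the sketch stays dependency-free). The linear Gaussian state-space model and its parameters are invented for the example.

```python
# Illustrative bootstrap particle filter for x_t = 0.9*x_{t-1} + N(0,1), y_t = x_t + N(0,0.25).
import numpy as np

rng = np.random.default_rng(42)

rho, sig_x, sig_y = 0.9, 1.0, 0.5
T, N = 100, 1000

# Simulate a state trajectory and noisy observations.
x = np.zeros(T)
x[0] = rng.normal(0, sig_x)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.normal(0, sig_x)
y = x + rng.normal(0, sig_y, size=T)

# Bootstrap particle filter: propagate through the dynamics, weight by the
# observation density, resample multinomially at every step.
part = rng.normal(0, sig_x, size=N)      # particles drawn from the prior at t = 0
filt_means = np.zeros(T)
for t in range(T):
    if t > 0:
        part = rho * part + rng.normal(0, sig_x, size=N)   # "bootstrap" proposal
    logw = -0.5 * ((y[t] - part) / sig_y) ** 2             # observation log-likelihood
    w = np.exp(logw - logw.max()); w /= w.sum()
    filt_means[t] = np.dot(w, part)                        # filtering mean estimate
    part = rng.choice(part, size=N, p=w)                   # multinomial resampling

print("RMSE of filtering mean vs true state:", np.sqrt(np.mean((filt_means - x) ** 2)))
```

The same propagate/weight/resample loop is what the Feynman-Kac framework of the course abstracts and generalises, e.g. to SMC samplers targeting a sequence of distributions rather than a filtering problem.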

