
Documents 60J20: 5 results

Markov chain Monte Carlo methods have become ubiquitous across science and engineering to model dynamics and explore large combinatorial sets. Over the last 20 years there have been tremendous advances in the design and analysis of efficient sampling algorithms for this purpose. One of the striking discoveries has been the realization that many natural Markov chains undergo phase transitions, whereby they abruptly change from being efficient to inefficient as some parameter of the system is modified. Generating functions can offer an alternative approach to sampling and they play a role in showing when certain Markov chains are efficient or not. We will explore the interplay between Markov chains, generating functions, and phase transitions for a variety of combinatorial problems, including graded posets, Boltzmann sampling, and 3-colorings on $\mathbb{Z}^2$.
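As a purely illustrative companion to this abstract, the sketch below implements single-site Glauber dynamics for proper 3-colorings of a finite grid, one of the local Markov chains whose mixing behaviour can exhibit the kind of phase transition discussed in the talk. The grid size, step count, and function names are hypothetical choices and not taken from the talk.

```python
# Minimal sketch (not from the talk): single-site Glauber dynamics for
# proper 3-colorings of an n x n grid; all parameters are illustrative.
import random

def glauber_3coloring(n=10, steps=100_000, seed=0):
    rng = random.Random(seed)
    # Start from a valid coloring: coloring by (i + j) mod 3 is always proper.
    coloring = {(i, j): (i + j) % 3 for i in range(n) for j in range(n)}

    def neighbors(i, j):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= i + di < n and 0 <= j + dj < n:
                yield (i + di, j + dj)

    for _ in range(steps):
        v = (rng.randrange(n), rng.randrange(n))
        forbidden = {coloring[w] for w in neighbors(*v)}
        allowed = [c for c in range(3) if c not in forbidden]
        coloring[v] = rng.choice(allowed)   # heat-bath update at the chosen site
    return coloring

sample = glauber_3coloring()
```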

60C05 ; 68R05 ; 60J20

Low-dimensional compartment models for biological systems can be fitted to time series data using Monte Carlo particle filter methods. As dimension increases, for example when analyzing a collection of spatially coupled populations, particle filter methods rapidly degenerate. We show that many independent Monte Carlo calculations, each of which does not attempt to solve the filtering problem, can be combined to give a global filtering solution with favorable theoretical scaling properties under a weak coupling condition. The independent Monte Carlo calculations are called islands, and the operation carried out on each island is called adapted simulation, so the complete algorithm is called an adapted simulation island filter. We demonstrate this methodology and some related algorithms on a model for measles transmission within and between cities.
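For orientation only, here is a minimal bootstrap particle filter for a generic state-space model; the adapted simulation island filter of the abstract combines many such independent Monte Carlo runs under a weak coupling condition, which this sketch does not attempt to reproduce. The callables `init_sample`, `transition_sample`, and `log_obs_density` are hypothetical placeholders for the model at hand.

```python
# Illustrative sketch only: a basic bootstrap particle filter returning a
# log-likelihood estimate for observations y under a generic state-space model.
import numpy as np

def bootstrap_filter(y, n_particles, init_sample, transition_sample,
                     log_obs_density, rng=None):
    rng = rng or np.random.default_rng(0)
    x = init_sample(n_particles, rng)           # initial particle cloud
    log_lik = 0.0
    for t, y_t in enumerate(y):
        x = transition_sample(x, t, rng)        # propagate particles through the dynamics
        logw = log_obs_density(y_t, x, t)       # weight by the observation density
        m = logw.max()
        w = np.exp(logw - m)
        log_lik += m + np.log(w.mean())         # accumulate the log-likelihood estimate
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        x = x[idx]                              # multinomial resampling
    return log_lik
```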

60G35 ; 60J20 ; 62M02 ; 62M05 ; 62M20 ; 62P10 ; 65C35


Une histoire de mots inattendus et de génomes - Schbath, Sophie (Author of the conference) | CIRM H

Multi angle

In the first part, I will present various problems related to word occurrence statistics in genomes and examine in more detail the question of how to detect whether a word occurs with a significantly abnormal frequency in a sequence. In the second part, I will present several extensions that account for the fact that a functional DNA motif is not always a "word" but may have a more complex structure, requiring the development of new statistical methods.
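As a toy illustration of the detection question (not taken from the talk), the snippet below compares the observed count of a word with its expectation under an i.i.d. letter model and forms a crude Poisson-approximation z-score; realistic analyses use Markov models of the sequence and account carefully for word overlaps.

```python
# Toy example: is a word over- or under-represented in a sequence?
# i.i.d. letter model with a Poisson-style variance approximation.
from math import prod, sqrt
from collections import Counter

def word_count_zscore(sequence, word):
    n, k = len(sequence), len(word)
    freq = Counter(sequence)                          # empirical letter frequencies
    p = {a: c / n for a, c in freq.items()}
    observed = sum(sequence[i:i + k] == word for i in range(n - k + 1))
    expected = (n - k + 1) * prod(p.get(a, 0.0) for a in word)
    return (observed - expected) / sqrt(expected)     # crude z-score

print(word_count_zscore("ACGTACGTGATCGATCGAGCTGCTAGCT", "GATC"))
```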

92C40 ; 62P10 ; 60J20 ; 92C42


Multi-armed bandits and beyond - Agrawal, Shipra (Author of the conference) | CIRM H

Multi angle

In this tutorial I will discuss recent advances in the theory of multi-armed bandits and reinforcement learning, in particular the upper confidence bound (UCB) and Thompson Sampling (TS) techniques for algorithm design and analysis.
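The two techniques named in the abstract can be stated compactly; the sketch below runs UCB1 and Thompson Sampling on a Bernoulli bandit with made-up arm probabilities, purely as a reference implementation of the index rule and the posterior-sampling rule.

```python
# UCB1 and Thompson Sampling on a Bernoulli bandit with illustrative arms.
import numpy as np

rng = np.random.default_rng(0)
true_p = np.array([0.3, 0.5, 0.7])    # hypothetical arm success probabilities
K, T = len(true_p), 5000

def ucb1():
    counts = np.ones(K)                                # one initial pull per arm
    sums = (rng.random(K) < true_p).astype(float)      # rewards from those pulls
    for t in range(K, T):
        index = sums / counts + np.sqrt(2 * np.log(t) / counts)  # UCB1 index
        a = int(np.argmax(index))
        r = float(rng.random() < true_p[a])
        counts[a] += 1
        sums[a] += r
    return sums.sum()

def thompson():
    succ, fail = np.ones(K), np.ones(K)                # Beta(1, 1) priors
    total = 0.0
    for _ in range(T):
        a = int(np.argmax(rng.beta(succ, fail)))       # one posterior sample per arm
        r = float(rng.random() < true_p[a])
        succ[a] += r
        fail[a] += 1.0 - r
        total += r
    return total

print(ucb1(), thompson())
```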

60J20 ; 68Q32 ; 68T05

Learning neural networks using only a small amount of data is an important ongoing research topic with tremendous potential for applications. We introduce a regularizer for the variational modeling of inverse problems in imaging based on normalizing flows, called patchNR. It involves a normalizing flow learned on patches of very few images. The subsequent reconstruction method is completely unsupervised and the same regularizer can be used for different forward operators acting on the same class of images.
By investigating the distribution of patches versus that of the whole image class, we prove that our variational model is indeed a MAP approach. Numerical examples for low-dose CT, limited-angle CT, and super-resolution of material images demonstrate that our method provides high-quality results among unsupervised methods, while requiring only very little data. Further, the approach also works if only the low-resolution image is available.
In the second part of the talk I will generalize normalizing flows to stochastic normalizing flows to improve their expressivity. Normalizing flows, diffusion normalizing flows, and variational autoencoders are powerful generative models. A unified framework to handle these approaches appears to be Markov chains. We consider stochastic normalizing flows as a pair of Markov chains fulfilling some properties and show how many state-of-the-art models for data generation fit into this framework. Indeed, including stochastic layers improves the expressivity of the network and allows for generating multimodal distributions from unimodal ones. The Markov chain point of view enables us to couple deterministic layers, such as invertible neural networks, and stochastic layers, such as Metropolis-Hastings layers, Langevin layers, variational autoencoders, and diffusion normalizing flows, in a mathematically sound way. Our framework establishes a useful mathematical tool to combine the various approaches.
Joint work with F. Altekrüger, A. Denker, P. Hagemann, J. Hertrich, P. Maass.
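To make the building block concrete, here is a generic affine coupling layer of the kind normalizing flows (and their stochastic variants) are composed of. This is a textbook sketch, not the patchNR or stochastic-normalizing-flow code of the authors; all dimensions and network sizes are arbitrary.

```python
# Generic RealNVP-style affine coupling layer: invertible map with a
# tractable log-determinant, the basic layer composed in normalizing flows.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x):
        # Split the input; transform one half conditioned on the other.
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)                       # keep the scale bounded for stability
        y2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=1)                  # log |det Jacobian| of the map
        return torch.cat([x1, y2], dim=1), log_det

    def inverse(self, y):
        y1, y2 = y[:, :self.d], y[:, self.d:]
        s, t = self.net(y1).chunk(2, dim=1)
        s = torch.tanh(s)
        x2 = (y2 - t) * torch.exp(-s)
        return torch.cat([y1, x2], dim=1)

layer = AffineCoupling(dim=4)
z, log_det = layer(torch.randn(8, 4))
x = layer.inverse(z)                            # recovers the input up to float error
```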

62F15 ; 60J20 ; 60J22 ; 65C05 ; 65C40 ; 68T07
