
## Post-edited  Introduction to fragmentation processes - Part 1 Haas, Bénédicte (Conference speaker) | CIRM (Publisher)

Fragmentation processes are random models describing the evolution of objects (particles, masses) subject to successive fragmentations over time. The study of such models goes back to Kolmogorov, in 1941, and they have since been the subject of extensive research. This is explained both by the variety of motivations (the range of applications is wide: biology and population genetics, planet formation, polymerization, aerosols, the mining industry, computer science, etc.) and by the development of rich mathematical models connected to other well-developed areas of probability, such as branching random walks, Lévy processes and random trees. The aim of this mini-course is to present self-similar fragmentation processes, as introduced by Bertoin in the early 2000s. These are Markov processes whose dynamics are characterized by a branching property (distinct objects evolve independently) and a self-similarity property (an object fragments at a rate proportional to a fixed power of its mass). We will discuss the construction of these processes (which include models with spontaneous fragmentations, more delicate to construct) and give an overview of their main properties.
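The branching and self-similarity properties described above can be sketched in a toy simulation (not from the course): each block of mass m waits an exponential time with rate m^α and then splits into two pieces m·U and m·(1−U), with U uniform. The binary splitting rule is an illustrative assumption standing in for a general dislocation measure.

```python
import random

def simulate_fragmentation(alpha, t_max, m0=1.0, seed=0):
    """Toy self-similar fragmentation: each block of mass m splits at
    rate m**alpha (self-similarity) into two pieces m*U and m*(1-U),
    U uniform, and blocks evolve independently (branching property).
    Returns the list of masses present at time t_max."""
    rng = random.Random(seed)
    # each entry: (mass, time at which this block will split)
    blocks = [(m0, rng.expovariate(m0 ** alpha))]
    masses = []
    while blocks:
        m, t = blocks.pop()
        if t >= t_max:
            masses.append(m)  # block survives to the time horizon
            continue
        u = rng.random()
        for child in (m * u, m * (1.0 - u)):
            if child > 0.0:
                # a child of mass c waits Exp(c**alpha) before splitting
                blocks.append((child, t + rng.expovariate(child ** alpha)))
    return masses
```

For α ≥ 0 this toy dynamics is conservative, so the masses at any time sum to the initial mass up to rounding.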



## Post-edited  Macroscopic fluctuation theory. Lecture 1: Particle systems, scaling limits and large deviations Gabrielli, Davide (Conference speaker) | CIRM (Publisher)

In this first lecture I will introduce a class of stochastic microscopic models very useful as toy models in non-equilibrium statistical mechanics. These are multi-component stochastic particle systems like the exclusion process, the zero range process and the KMP model. I will discuss their scaling limits and the corresponding large deviations principles. Problems of interest are the computation of the current flowing across a system and the understanding of the structure of the stationary non-equilibrium states. I will discuss these problems in specific examples and from two different perspectives: the stochastic microscopic and combinatorial point of view, and the macroscopic variational approach, where the microscopic details of the models are encoded just by the transport coefficients.
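As a minimal illustration of the first model mentioned above, the symmetric simple exclusion process on a ring can be simulated with random-sequential updates; the rule below (pick a site, attempt a nearest-neighbour jump, suppress it if the target is occupied) is a standard toy discretization, not the lecture's own construction.

```python
import random

def ssep_step(config, rng):
    """One random-sequential update of the symmetric simple exclusion
    process on a ring: pick a site uniformly; if it carries a particle,
    try to move it to a uniformly chosen neighbour, and suppress the
    jump when the target site is occupied (exclusion rule)."""
    n = len(config)
    i = rng.randrange(n)
    if config[i] == 1:
        j = (i + rng.choice((-1, 1))) % n
        if config[j] == 0:
            config[i], config[j] = 0, 1
    return config

def simulate_ssep(n_sites=20, n_particles=7, n_steps=10_000, seed=0):
    """Run the exclusion dynamics and return the final 0/1 occupation."""
    rng = random.Random(seed)
    config = [1] * n_particles + [0] * (n_sites - n_particles)
    rng.shuffle(config)
    for _ in range(n_steps):
        ssep_step(config, rng)
    return config
```

The dynamics conserves the particle number, which is the conserved quantity whose hydrodynamic scaling limit the lecture discusses.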


## Post-edited  The power of heterogeneous large-scale data for high-dimensional causal inference Bühlmann, Peter (Conference speaker) | CIRM (Publisher)

We present a novel methodology for causal inference based on an invariance principle. It exploits the advantage of heterogeneity in larger datasets, arising from different experimental conditions (i.e. an aspect of "Big Data"). Despite fundamental identifiability issues, the method comes with statistical confidence statements, leading to more reliable results than alternative procedures based on graphical modeling. We also discuss applications in biology, in particular large-scale gene knock-down experiments in yeast, where computational and statistical methods have an interesting potential for the prediction and prioritization of new experimental interventions.
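A minimal sketch of the invariance idea (not the speaker's implementation): a predictor set is plausibly causal if the residuals of a pooled regression on that set look the same across environments. The crude z-test on residual means below is an illustrative stand-in for the proper invariance tests.

```python
import itertools
import numpy as np

def invariant_sets(X, y, env, z_crit=1.96):
    """Sketch of invariance-based causal discovery: a predictor set S is
    retained if, after a pooled least-squares fit of y on X[:, S], the
    residual means do not differ significantly between each environment
    and the rest (simple two-sample z-test, illustrative only)."""
    d = X.shape[1]
    accepted = []
    for r in range(d + 1):
        for S in itertools.combinations(range(d), r):
            if S:
                Xs = X[:, S]
                beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
                resid = y - Xs @ beta
            else:
                resid = y - y.mean()
            ok = True
            for e in np.unique(env):
                a, b = resid[env == e], resid[env != e]
                z = (a.mean() - b.mean()) / np.sqrt(
                    a.var() / len(a) + b.var() / len(b))
                if abs(z) > z_crit:
                    ok = False  # residuals not invariant across environments
            if ok:
                accepted.append(set(S))
    return accepted
```

Non-invariant sets (such as the empty set when an intervention shifts the response through a true cause) are rejected, which is the source of the identifiability the abstract refers to.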


## Post-edited  Statistical inverse problems and geometric "wavelet" construction Kerkyacharian, Gérard (Conference speaker) | CIRM (Publisher)

In the first part of the talk, we will look at some statistical inverse problems for which the natural framework is no longer a Euclidean one.
In the second part we will review the initial 1980s construction of (non-orthogonal) wavelets by Frazier, Jawerth and Weiss, which predates Yves Meyer's orthogonal wavelet theory.
In the third part we will propose a construction of a geometric wavelet theory. In the Euclidean case, the Fourier transform plays a fundamental role. In the geometric situation this role is played by some "Laplacian operator" with suitable properties.
In the last part we will show that the previous theory can help to revisit the topic of regularity of Gaussian processes, and to give a criterion based only on the regularity of the covariance operator.


## Post-edited  Asymptotic behavior of the Laplacian quasi-maximum likelihood estimator of affine causal processes Bardet, Jean-Marc (Conference speaker) | CIRM (Publisher)

We prove the consistency and asymptotic normality of the Laplacian Quasi-Maximum Likelihood Estimator (QMLE) for a general class of causal time series including ARMA, AR($\infty$), GARCH, ARCH($\infty$), ARMA-GARCH, APARCH and ARMA-APARCH processes, among others. We notably exhibit the advantages (moment order and robustness) of this estimator compared to the classical Gaussian QMLE. Numerical simulations confirm the accuracy of this estimator.
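In the simplest AR(1) case, maximizing a Laplace quasi-likelihood reduces to minimizing the sum of absolute one-step prediction errors (least absolute deviations). The sketch below illustrates that reduction with a plain grid search; the paper's setting is far more general, and the grid search is only an illustrative choice of optimizer.

```python
import numpy as np

def laplace_qmle_ar1(x, grid=None):
    """Laplacian QMLE sketch for AR(1), x_t = phi * x_{t-1} + eps_t:
    the Laplace quasi-likelihood is maximized by minimizing the sum of
    absolute one-step prediction errors, here via a coarse grid search
    over phi (illustrative optimizer, not the paper's method)."""
    if grid is None:
        grid = np.linspace(-0.99, 0.99, 397)
    losses = [np.abs(x[1:] - phi * x[:-1]).sum() for phi in grid]
    return grid[int(np.argmin(losses))]

# simulate an AR(1) driven by Laplace (double-exponential) noise
rng = np.random.default_rng(0)
n, phi_true = 5000, 0.6
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.laplace()
phi_hat = laplace_qmle_ar1(x)
```

With Laplace-tailed noise the absolute-error criterion matches the noise density, which is the robustness advantage over the Gaussian QMLE mentioned in the abstract.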


## Post-edited  Markov Chain Monte Carlo Methods - Part 1 Robert, Christian P. (Conference speaker) | CIRM (Publisher)

In this short course, we recall the basics of Markov chain Monte Carlo (Gibbs & Metropolis samplers) along with the most recent developments like Hamiltonian Monte Carlo, Rao-Blackwellisation, divide & conquer strategies, pseudo-marginal and other noisy versions. We also cover the specific approximate method of ABC that is currently used in many fields to handle complex models in manageable conditions, from the original motivation in population genetics to the several reinterpretations of the approach found in the recent literature. Time allowing, we will also comment on programming developments like BUGS, STAN and Anglican that stemmed from those specific algorithms.
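The Metropolis sampler recalled in the course can be sketched in a few lines; the target (a standard normal) and the uniform random-walk proposal are illustrative choices.

```python
import math
import random

def metropolis_gaussian(n_samples=20_000, step=1.0, seed=0):
    """Random-walk Metropolis targeting the standard normal: propose
    x' = x + step * Uniform(-1, 1) and accept with probability
    min(1, pi(x') / pi(x)); the proposal is symmetric, so this is the
    plain Metropolis (rather than Metropolis-Hastings) ratio."""
    rng = random.Random(seed)
    log_pi = lambda v: -0.5 * v * v  # log target density up to a constant
    x, out = 0.0, []
    for _ in range(n_samples):
        prop = x + step * rng.uniform(-1.0, 1.0)
        # accept/reject with the Metropolis acceptance probability
        if rng.random() < math.exp(min(0.0, log_pi(prop) - log_pi(x))):
            x = prop
        out.append(x)  # rejected proposals repeat the current state
    return out
```

The empirical mean and variance of a long run approach 0 and 1, the moments of the target; repeating the current state on rejection is exactly what makes the chain's stationary distribution correct.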