The human brain contains billions of neurones and glial cells that are tightly interconnected. Describing their electrical and chemical activity is mind-boggling, hence the idea of studying the thermodynamic limit of the equations that describe these activities, i.e. of looking at what happens when the number of cells grows arbitrarily large. It turns out that under reasonable hypotheses the number of equations to deal with drops sharply, from millions to a handful, albeit more complex ones. There are many different approaches to this, usually called mean-field analyses. I present two mathematical methods to illustrate these approaches. Both enjoy the feature that they propagate chaos, a notion I connect to physiological measurements of the correlations between neuronal activities. In the first method, the limit equations can be read off the network equations, and methods 'à la Sznitman' can be used to prove convergence and propagation of chaos, as in the case of a network of biologically plausible neurone models. The second method requires more sophisticated tools, such as large deviations, to identify the limit and do the rest of the job, as in the case of networks of Hopfield neurones such as those present in the trendy deep neural networks.
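To fix ideas, a standard (textbook-style) sketch of the reduction described above, not taken from the talk itself: a network of $N$ diffusively coupled neurones and its mean-field (McKean–Vlasov) limit, with generic drift $f$, interaction kernel $b$ and noise intensity $\sigma$ as illustrative placeholders.

```latex
% Network of N interacting neurones: N coupled SDEs
\mathrm{d}X^{i,N}_t = f\!\left(X^{i,N}_t\right)\mathrm{d}t
  + \frac{1}{N}\sum_{j=1}^{N} b\!\left(X^{i,N}_t, X^{j,N}_t\right)\mathrm{d}t
  + \sigma\,\mathrm{d}W^{i}_t, \qquad i = 1,\dots,N.

% Thermodynamic limit N -> infinity: a single self-consistent equation,
% where the empirical mean is replaced by an average over the law mu_t
\mathrm{d}\bar{X}_t = f\!\left(\bar{X}_t\right)\mathrm{d}t
  + \int b\!\left(\bar{X}_t, y\right)\mu_t(\mathrm{d}y)\,\mathrm{d}t
  + \sigma\,\mathrm{d}W_t, \qquad \mu_t = \mathrm{Law}\!\left(\bar{X}_t\right).
```

Propagation of chaos then says that, for fixed $k$, the neurones $(X^{1,N}, \dots, X^{k,N})$ become asymptotically independent as $N \to \infty$, each with law $\mu_t$; this is what makes the vanishing of pairwise correlations in large networks a measurable signature of the mean-field regime.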
60F99 ; 60B10 ; 92B20 ; 82C32 ; 82C80 ; 35Q80