
A review of different geometries for the training of neural networks

Authors: Malago, Luigi (Speaker)
CIRM (Publisher)


Abstract: Neural networks form a varied class of computational models, used in machine learning for both supervised and unsupervised learning. Several network topologies have been proposed in the literature since the pioneering work of the late 1950s, including models based on undirected probabilistic graphical models, such as (Restricted) Boltzmann Machines, and on multi-layer feed-forward computational graphs. A neural network is usually trained by minimizing a cost function, such as the negative log-likelihood. In this talk we review alternative geometries used to describe the space of the functions encoded by a neural network, parametrized by its connection weights, and their implications for the optimization of the cost function during training, from the perspective of Riemannian optimization. In the first part of the presentation, we introduce a probabilistic interpretation of neural networks, which goes back to the work of Amari and coauthors in the 90s and is based on the Fisher-Rao metric studied in Information Geometry. In this framework, the weights of a Boltzmann Machine, and similarly of a feed-forward neural network, are interpreted as the parameters of a (joint) statistical model for the observed, and possibly latent, variables. In the second part of the talk, we review other approaches to defining alternative geometries for the parameter space of a neural network, motivated by invariance principles and not explicitly based on probabilistic models. The use of alternative non-Euclidean geometries has a direct impact on training algorithms: modeling the space of functions associated with a neural network as a Riemannian manifold makes the gradient depend on the choice of metric tensor.
We conclude the presentation by reviewing some recently proposed training algorithms for neural networks based on Riemannian optimization.
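The dependence of the gradient on the metric tensor mentioned in the abstract is what underlies the natural gradient update θ ← θ − η F(θ)⁻¹ ∇L(θ), where F is the Fisher information matrix of the statistical model. The following is a minimal illustrative sketch, not code from the talk: it applies a natural gradient step to a simple conditional Bernoulli (logistic) model, where the data, step size, and damping term are all assumptions made for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic data from a logistic model (illustrative assumption, not from the talk)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = (sigmoid(X @ true_w) > rng.uniform(size=200)).astype(float)

w = np.zeros(3)
eta = 0.5  # step size (assumed for the example)
for _ in range(100):
    p = sigmoid(X @ w)
    # Euclidean gradient of the negative log-likelihood
    grad = X.T @ (p - y) / len(y)
    # Empirical Fisher information matrix of the conditional Bernoulli model
    F = (X * (p * (1 - p))[:, None]).T @ X / len(y)
    # Natural gradient step: precondition the gradient by F^{-1}
    # (small damping term added for numerical stability)
    w -= eta * np.linalg.solve(F + 1e-6 * np.eye(3), grad)
```

For this model the Fisher matrix coincides with the expected Hessian of the negative log-likelihood, so the natural gradient step is invariant under smooth reparametrizations of the weights, which is the point the talk's Riemannian perspective makes precise.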

MSC codes:
53B21 - Methods of Riemannian geometry
65K10 - Optimization and variational techniques
68T05 - Learning and adaptive systems
92B20 - Neural networks, artificial life and related topics

Additional resources:
http://forum.cs-dc.org/topic/559/luigi-malag%C3%B2-a-review-of-different-geometries-for-the-training-of-neural-networks

    Video information

    Director: Hennenfent, Guillaume
    Language: English
    Publication date: 08/09/17
    Recording date: 30/08/17
    Subcollection: Research talks
    arXiv category: Machine Learning
    Domain: Computer Science ; Geometry ; Numerical Analysis & Scientific Computing
    Format: MP4 (.mp4) - HD
    Duration: 01:14:07
    Audience: Researchers
    Download: https://videos.cirm-math.fr/2017-08-30_Malago.mp4

Conference information

Conference name: Geometrical and topological structures of information / Structures géométriques et topologiques de l'information
Conference organizers: Ay, Nihat ; Baudot, Pierre ; Barbaresco, Frédéric ; Bennequin, Daniel ; Madiman, Mokshay ; Nielsen, Frank
Dates: 28/08/17 - 01/09/17
Conference year: 2017
Conference URL: http://conferences.cirm-math.fr/1680.html

Citation data

DOI : 10.24350/CIRM.V.19219703
Cite this video: Malago, Luigi (2017). A review of different geometries for the training of neural networks. CIRM. Audiovisual resource. doi:10.24350/CIRM.V.19219703
URI : http://dx.doi.org/10.24350/CIRM.V.19219703
