
Large-scale machine learning and convex optimization 2/2

Authors: Bach, Francis (Speaker)
CIRM (Publisher)


Abstract: Many machine learning and signal processing problems are traditionally cast as convex optimization problems. A common difficulty in solving these problems is the size of the data: there are many observations ("large n") and each of them is high-dimensional ("large p"). In this setting, online algorithms such as stochastic gradient descent, which pass over the data only once, are usually preferred over batch algorithms, which require multiple passes over the data. Given n observations/iterations, the optimal convergence rate of these algorithms is $O(1/\sqrt{n})$ for general convex functions and improves to $O(1/n)$ for strongly convex functions. In this tutorial, I will first present the classical results in stochastic approximation and relate them to classical optimization and statistics results. I will then show how the smoothness of loss functions may be used to design novel algorithms with improved behavior, both in theory and practice: in the ideal infinite-data setting, an efficient novel Newton-based stochastic approximation algorithm leads to a convergence rate of $O(1/n)$ without strong convexity assumptions, while in the practical finite-data setting, an appropriate combination of batch and online algorithms leads to unexpected behaviors, such as a linear convergence rate for strongly convex problems, with an iteration cost similar to stochastic gradient descent.
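To make the online setting above concrete, here is a minimal sketch (not code from the talk) of single-pass stochastic gradient descent with Polyak-Ruppert averaging on a least-squares objective; for least squares, a constant step size of order $1/R^2$ combined with iterate averaging attains the $O(1/n)$ rate without strong convexity. The function name, step-size constant, and toy data below are illustrative assumptions.

    import numpy as np

    def averaged_sgd(X, y):
        # Single-pass SGD with Polyak-Ruppert averaging on least squares:
        # each of the n observations is visited exactly once, so the number
        # of iterations equals the number of observations, as in the abstract.
        n, p = X.shape
        # Constant step of order 1/R^2, with R^2 estimated by the average
        # squared norm of the inputs (illustrative constant).
        step = 1.0 / (4.0 * np.mean(np.sum(X * X, axis=1)))
        w = np.zeros(p)        # current iterate
        w_avg = np.zeros(p)    # running average of the iterates
        for t in range(n):
            grad = (X[t] @ w - y[t]) * X[t]  # gradient of 0.5*(x_t.w - y_t)^2
            w -= step * grad
            w_avg += (w - w_avg) / (t + 1)   # online mean of the iterates
        return w_avg

The finite-data behavior mentioned at the end of the abstract, a linear convergence rate at an SGD-like cost per iteration, comes from combining batch and online ideas: store one gradient per observation, refresh one of them per iteration, and step along their average, as in the stochastic average gradient (SAG) method of Le Roux, Schmidt, and Bach (2012). Again a sketch under illustrative choices:

    def sag_least_squares(X, y, n_iters=50_000, seed=0):
        # SAG on least squares: the per-iteration cost matches SGD (one
        # fresh gradient per step), but stepping along the average of all
        # n stored gradients yields a linear rate under strong convexity.
        rng = np.random.default_rng(seed)
        n, p = X.shape
        stored = np.zeros((n, p))  # gradient memory, one row per observation
        total = np.zeros(p)        # running sum of the stored gradients
        w = np.zeros(p)
        # Step of order 1/L, with L the largest per-observation Lipschitz
        # constant (the squared input norm for least squares).
        step = 1.0 / (16.0 * np.max(np.sum(X * X, axis=1)))
        for _ in range(n_iters):
            i = rng.integers(n)                # pick one observation
            g = (X[i] @ w - y[i]) * X[i]       # its fresh gradient
            total += g - stored[i]             # swap old for new in the sum
            stored[i] = g
            w -= step * total / n              # step along the average
        return w

A toy run on a noisy linear model (hypothetical data; n >> p, so the empirical risk is strongly convex with high probability):

    rng = np.random.default_rng(1)
    n, p = 10_000, 5
    X = rng.standard_normal((n, p))
    w_true = rng.standard_normal(p)
    y = X @ w_true + 0.1 * rng.standard_normal(n)
    print(np.linalg.norm(averaged_sgd(X, y) - w_true))
    print(np.linalg.norm(sag_least_squares(X, y) - w_true))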

MSC codes:
62L20 - Stochastic approximation
68T05 - Learning and adaptive systems
90C06 - Large-scale problems
90C25 - Convex programming

Video information

Director: Hennenfent, Guillaume
Language: English
Publication date: 19/02/16
Recording date: 04/02/16
Collection: Exposés de recherche
Sub-collection: Research talks
arXiv categories: Computer Science; Machine Learning; Optimization and Control
Domain(s): Probability & Statistics; Computer Science; Control Theory & Optimization
Format: MP4 (.mp4) - HD
Duration: 01:28:40
Audience: Researchers
Download: https://videos.cirm-math.fr/2016-02-04_Bach_part2.mp4

Meeting information

Meeting name: Thematic month on statistics - Week 1: Statistical learning
Meeting organizers: Ghattas, Badih; Ralaivola, Liva
Dates: 01/02/16 - 05/02/16
Meeting year: 2016
Meeting URL: http://conferences.cirm-math.fr/1615.html

Citation data

DOI: 10.24350/CIRM.V.18920503
Cite this video: Bach, Francis (2016). Large-scale machine learning and convex optimization 2/2. CIRM. Audiovisual resource. doi:10.24350/CIRM.V.18920503
URI: http://dx.doi.org/10.24350/CIRM.V.18920503
