
Learning with differentiable perturbed optimizers

Post-edited
Authors : Berthet, Quentin (Author of the conference)
CIRM (Publisher)

Chapters : supervised learning in ML; perturbation methods; learning with perturbed optimizers; Fenchel-Young losses; properties and regularity of the method; classification on CIFAR-10; supervised learning to rank; supervised shortest paths learning

Abstract : Machine learning pipelines often rely on optimization procedures to make discrete decisions (e.g. sorting, picking closest neighbors, finding shortest paths or optimal matchings). Although these discrete decisions are easily computed in a forward manner, they cannot be used to modify model parameters using first-order optimization techniques because they break the back-propagation of computational graphs. In order to expand the scope of learning problems that can be solved in an end-to-end fashion, we propose a systematic method to transform a block that outputs an optimal discrete decision into a differentiable operation. Our approach relies on stochastic perturbations of these parameters, and can be used readily within existing solvers without the need for ad hoc regularization or smoothing. These perturbed optimizers yield solutions that are differentiable and never locally constant. The amount of smoothness can be tuned via the chosen noise amplitude, whose impact we analyze. The derivatives of these perturbed solvers can be evaluated efficiently. We also show how this framework can be connected to a family of losses developed in structured prediction, and describe how these can be used in unsupervised and supervised learning, with theoretical guarantees.
We demonstrate the performance of our approach on several machine learning tasks in experiments on synthetic and real data.
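
To make the construction concrete, the sketch below illustrates the recipe described in the abstract on the simplest discrete solver, an argmax over one-hot vectors, using Gaussian perturbations and a Monte Carlo average. The function name, noise scale and sample count are illustrative choices, not the talk's reference code; see the arXiv paper in the bibliography for the full framework.

    import numpy as np

    def perturbed_argmax(theta, sigma=1.0, n_samples=1000, rng=None):
        """Monte Carlo estimate of the perturbed argmax over one-hot vectors,
        y_sigma(theta) = E[argmax_y <y, theta + sigma*Z>] with Z ~ N(0, I)."""
        rng = np.random.default_rng(0) if rng is None else rng
        d = theta.shape[0]
        Z = rng.standard_normal((n_samples, d))
        idx = np.argmax(theta + sigma * Z, axis=1)  # the discrete solver, called on perturbed inputs
        Y = np.eye(d)[idx]                          # one-hot solution of each perturbed problem
        y_smooth = Y.mean(axis=0)                   # smoothed output: differentiable, never locally constant
        # Jacobian estimator for Gaussian noise: d y_sigma / d theta ~ E[y* Z^T] / sigma
        jac = Y.T @ Z / (n_samples * sigma)
        return y_smooth, jac

    # Supervised use with a Fenchel-Young-type loss: the gradient with respect to theta
    # reduces to the residual y_sigma(theta) - y_true, which can then be back-propagated
    # into whatever model produced theta.
    theta = np.array([1.0, 2.0, 0.5])
    y_true = np.array([0.0, 1.0, 0.0])
    y_smooth, _ = perturbed_argmax(theta, sigma=0.5, n_samples=10_000)
    grad_theta = y_smooth - y_true

Increasing sigma gives a smoother map at the cost of more bias, while decreasing it brings the output closer to the unperturbed argmax; this is the trade-off in the noise amplitude analyzed in the talk.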

Keywords : perturbation methods; structured learning

MSC Codes :
62F99 - None of the above but in this section
68W20 - randomized algorithms
90C06 - Large-scale problems

Additional resources :
https://www.cirm-math.fr/RepOrga/2133/Slides/perturbations_berthet.pdf

    Information on the Video

    Film maker : Hennenfent, Guillaume
    Language : English
    Available date : 06/04/2020
    Conference Date : 09/03/2020
    Subseries : Research talks
    arXiv category : Machine Learning ; Optimization and Control
    Mathematical Area(s) : Computer Science ; Control Theory & Optimization
    Format : MP4 (.mp4) - HD
    Video Time : 00:50:11
    Targeted Audience : Researchers
    Download : https://videos.cirm-math.fr/20120-03-09_Berthet.mp4

Information on the Event

Event Title : Optimization for Machine Learning / Optimisation pour l'apprentissage automatique
Event Organizers : Boyer, Claire ; d'Aspremont, Alexandre ; Gramfort, Alexandre ; Salmon, Joseph ; Villar, Soledad
Dates : 09/03/2020 - 13/03/2020
Event Year : 2020
Event URL : https://conferences.cirm-math.fr/2133.html

Citation Data

DOI : 10.24350/CIRM.V.19622903
Cite this video as: Berthet, Quentin (2020). Learning with differentiable perturbed optimizers. CIRM. Audiovisual resource. doi:10.24350/CIRM.V.19622903
URI : http://dx.doi.org/10.24350/CIRM.V.19622903

Bibliography

  • PAPANDREOU, George and YUILLE, Alan L. Perturb-and-MAP random fields: using discrete optimization to learn and sample from energy models. In: 2011 International Conference on Computer Vision. IEEE, 2011, p. 193-200. - https://doi.org/10.1109/ICCV.2011.6126242

  • KALAI, Adam and VEMPALA, Santosh. Efficient algorithms for online decision problems. In: Learning Theory and Kernel Machines. Springer, Berlin, Heidelberg, 2003, p. 26-40. - http://dx.doi.org/10.1007/978-3-540-45167-9_4

  • BERTHET, Quentin, BLONDEL, Mathieu, TEBOUL, Olivier, et al. Learning with Differentiable Perturbed Optimizers. arXiv preprint arXiv:2002.08676, 2020. - https://arxiv.org/abs/2002.08676
