
Privacy in machine learning

Multi angle
Author: Cummings, Rachel (Conference speaker)
CIRM (Publisher)


Abstract: Privacy concerns are becoming a major obstacle to using data in the way that we want. It is often unclear how current regulations should translate into technology, and the changing legal landscape surrounding privacy can cause valuable data to go unused. How can data scientists make use of potentially sensitive data while providing rigorous privacy guarantees to the individuals who provided it? A growing literature on differential privacy has emerged in the last decade to address some of these concerns. Differential privacy is a parameterized notion of database privacy that gives a mathematically rigorous worst-case bound on the amount of information that can be learned about any one individual's data from the output of a computation. Differential privacy ensures that if a single entry in the database were to be changed, the algorithm would still have approximately the same distribution over outputs. In this talk, we will see the definition and properties of differential privacy, survey a theoretical toolbox of differentially private algorithms that come with strong accuracy guarantees, and discuss recent applications of differential privacy in major technology companies and government organizations.
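The guarantee described above is usually written as: an algorithm M is ε-differentially private if for all databases D, D′ differing in one entry and all output sets S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S]. A minimal sketch of the standard way to achieve this for a numeric query, the Laplace mechanism from the Dwork, McSherry, Nissim and Smith (2006) paper in the bibliography, is shown below. This is a generic illustration, not code from the talk; the dataset, query, and parameter values are made up.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release true_value with Laplace noise of scale sensitivity/epsilon.

    If changing one individual's record shifts the query by at most
    `sensitivity`, this release is epsilon-differentially private.
    """
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(0)
ages = [34, 41, 29, 55, 62, 38]              # toy dataset (hypothetical)
true_count = sum(age >= 40 for age in ages)  # counting query: sensitivity is 1
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=1.0, rng=rng)
```

Smaller ε means stronger privacy but more noise: the noise has standard deviation √2·(sensitivity/ε), which is the privacy-accuracy trade-off the abstract refers to.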

Keywords: differential privacy

MSC codes:
62-02 - Research exposition (monographs, survey articles)
68-02 - Research exposition (monographs, survey articles)
68W40 - Analysis of algorithms
90-02 - Research exposition (monographs, survey articles)

    Video information

    Director: Petit, Jean
    Language: English
    Publication date: 21/06/2022
    Recording date: 24/05/2022
    Subcollection: Research School
    arXiv category: Machine Learning; Statistics
    Domain: Computer Science
    Format: MP4 (.mp4) - HD
    Duration: 00:57:52
    Audience: Researchers; Graduate students; Doctoral and post-doctoral students
    Download: https://videos.cirm-math.fr/2022-05-24_Cummings.mp4

Meeting information

Meeting name: Theoretical Computer Science Spring School: Machine Learning / École de Printemps d'Informatique Théorique : Apprentissage Automatique
Meeting organizers: Cappé, Olivier; Garivier, Aurélien; Gribonval, Rémi; Kaufmann, Emilie; Vernade, Claire
Dates: 23/05/2022 - 27/05/2022
Meeting year: 2022
Conference URL: https://conferences.cirm-math.fr/2542.html

Citation data

DOI: 10.24350/CIRM.V.19921503
Cite this video: Cummings, Rachel (2022). Privacy in machine learning. CIRM. Audiovisual resource. doi:10.24350/CIRM.V.19921503
URI: http://dx.doi.org/10.24350/CIRM.V.19921503

Bibliography

  • Dwork, C., McSherry, F., Nissim, K., & Smith, A. (2006). Calibrating noise to sensitivity in private data analysis. In S. Halevi & T. Rabin (Eds.), Theory of Cryptography (TCC 2006), Lecture Notes in Computer Science, vol. 3876. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11681878_14

  • Dwork, C., & Roth, A. (2014). The algorithmic foundations of differential privacy. Foundations and Trends in Theoretical Computer Science, 9(3–4), 211–407. https://doi.org/10.1561/0400000042

  • McSherry, F., & Talwar, K. (2007). Mechanism design via differential privacy. In 48th Annual IEEE Symposium on Foundations of Computer Science (FOCS '07) (pp. 94–103). IEEE. https://doi.org/10.1109/FOCS.2007.66



