
Privacy in machine learning

Multi angle
Authors : Cummings, Rachel (Author of the conference)
CIRM (Publisher)


Abstract : Privacy concerns are becoming a major obstacle to using data in the way that we want. It's often unclear how current regulations should translate into technology, and the changing legal landscape surrounding privacy can cause valuable data to go unused. How can data scientists make use of potentially sensitive data, while providing rigorous privacy guarantees to the individuals who provided data? A growing literature on differential privacy has emerged in the last decade to address some of these concerns. Differential privacy is a parameterized notion of database privacy that gives a mathematically rigorous worst-case bound on the maximum amount of information that can be learned about any one individual's data from the output of a computation. Differential privacy ensures that if a single entry in the database were to be changed, then the algorithm would still have approximately the same distribution over outputs. In this talk, we will see the definition and properties of differential privacy; survey a theoretical toolbox of differentially private algorithms that come with a strong accuracy guarantee; and discuss recent applications of differential privacy in major technology companies and government organizations.
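The worst-case guarantee described in the abstract is most often achieved by adding calibrated noise to a query's answer. As an illustration, here is a minimal sketch of the Laplace mechanism (one of the classic differentially private algorithms, introduced in the Dwork et al. paper listed in the bibliography); the function and variable names are illustrative, not taken from the talk:

```python
import numpy as np

def laplace_mechanism(data, query, sensitivity, epsilon, rng=None):
    """Release query(data) with Laplace noise of scale sensitivity/epsilon.

    This satisfies epsilon-differential privacy provided `sensitivity`
    bounds how much query(data) can change when any single entry is modified.
    """
    rng = rng or np.random.default_rng()
    true_answer = query(data)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_answer + noise

# Example: privately count entries above a threshold.
# Changing one individual's record changes the count by at most 1,
# so the sensitivity of this counting query is 1.
data = [23, 45, 31, 62, 18, 54]
private_count = laplace_mechanism(
    data, lambda d: sum(x > 30 for x in d), sensitivity=1, epsilon=0.5
)
```

Smaller values of `epsilon` give stronger privacy but noisier answers, which is the accuracy trade-off the talk's algorithmic toolbox addresses.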

Keywords : differential privacy

MSC Codes :
62-02 - Research exposition (monographs, survey articles)
68-02 - Research exposition (monographs, survey articles)
68W40 - Analysis of algorithms
90-02 - Research exposition (monographs, survey articles)

    Information on the Video

    Film maker : Petit, Jean
    Language : English
    Available date : 21/06/2022
    Conference Date : 24/05/2022
    Subseries : Research School
    arXiv category : Machine Learning ; Statistics
    Mathematical Area(s) : Computer Science
    Format : MP4 (.mp4) - HD
    Video Time : 00:57:52
    Targeted Audience : Researchers ; Graduate Students ; Doctoral Students ; Post-Doctoral Students
    Download : https://videos.cirm-math.fr/2022-05-24_Cummings.mp4

Information on the Event

Event Title : Theoretical Computer Science Spring School: Machine Learning / Ecole de Printemps d'Informatique Théorique : Apprentissage Automatique
Event Organizers : Cappé, Olivier ; Garivier, Aurélien ; Gribonval, Rémi ; Kaufmann, Emilie ; Vernade, Claire
Dates : 23/05/2022 - 27/05/2022
Event Year : 2022
Event URL : https://conferences.cirm-math.fr/2542.html

Citation Data

DOI : 10.24350/CIRM.V.19921503
Cite this video as: Cummings, Rachel (2022). Privacy in machine learning. CIRM. Audiovisual resource. doi:10.24350/CIRM.V.19921503
URI : http://dx.doi.org/10.24350/CIRM.V.19921503


Bibliography

  • Dwork, C., McSherry, F., Nissim, K., Smith, A. (2006). Calibrating Noise to Sensitivity in Private Data Analysis. In: Halevi, S., Rabin, T. (eds) Theory of Cryptography. TCC 2006. Lecture Notes in Computer Science, vol 3876. Springer, Berlin, Heidelberg - https://doi.org/10.1007/11681878_14

  • Cynthia Dwork and Aaron Roth. 2014. The Algorithmic Foundations of Differential Privacy. Found. Trends Theor. Comput. Sci. 9, 3–4 (August 2014), 211–407 - https://doi.org/10.1561/0400000042

  • F. McSherry and K. Talwar, "Mechanism Design via Differential Privacy," 48th Annual IEEE Symposium on Foundations of Computer Science (FOCS'07), 2007, pp. 94-103 - https://doi.org/10.1109/FOCS.2007.66


