Information Leakage of Updates to Natural Language Models
Speaker: Olya Ohrimenko, University of Melbourne
Date: Thursday, February 11, 2021
Time: 4:00 PM to 5:00 PM. Note: all times are in the Eastern Time Zone.
Public: Yes
Location:
Event Type: Seminar
Room Description:
Host: Srini Devadas
Contact: Kyle L Hogan, klhogan@csail.mit.edu
Relevant URL:
Speaker URL: None
Speaker Photo: None
Reminders to: seminars@csail.mit.edu, oohrimenko@unimelb.edu.au
Reminder Subject: TALK: Information Leakage of Updates to Natural Language Models
NOTE THE UNUSUAL TIME
Abstract:
To continuously improve quality and reflect changes in data, machine learning applications must regularly retrain and update their core models. In this talk, I show that a differential analysis of language model snapshots before and after an update can reveal a surprising amount of detailed information about changes in the training data. To demonstrate the leakage caused by updates to natural language models, we develop two metrics: differential score and differential rank. I will discuss the extent of the leakage these metrics expose across a range of models and datasets. I will conclude by discussing the privacy implications of our findings, proposing mitigation strategies, and evaluating their effect.
This talk is based on the paper that appeared at the ACM Conference on Computer and Communications Security (CCS) 2020.
Joint work with Santiago Zanella-Béguelin, Lukas Wutschitz, Shruti Tople, Victor Rühle, Andrew Paverd, Boris Köpf and Marc Brockschmidt from Microsoft.
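To make the differential-analysis idea concrete, here is a minimal Python sketch, not the paper's implementation: the callables p_old and p_new are illustrative stand-ins for next-token probability queries against the model snapshots before and after the update, and the metric definitions here are a simplified reading of the differential score and differential rank described in the abstract.

```python
from typing import Callable, List

# Illustrative stand-in: a "model" is any function mapping a token prefix
# and a candidate next token to a probability in [0, 1].
TokenProb = Callable[[List[str], str], float]


def differential_score(phrase: List[str], p_old: TokenProb, p_new: TokenProb) -> float:
    """Sum over the phrase's tokens of the change in next-token probability
    between the updated snapshot (p_new) and the original snapshot (p_old)."""
    score = 0.0
    for i, token in enumerate(phrase):
        prefix = phrase[:i]
        score += p_new(prefix, token) - p_old(prefix, token)
    return score


def differential_rank(candidates: List[List[str]], target: List[str],
                      p_old: TokenProb, p_new: TokenProb) -> int:
    """Rank of the target phrase among candidate phrases when ordered by
    decreasing differential score (0 = most exposed by the update)."""
    target_score = differential_score(target, p_old, p_new)
    return sum(
        differential_score(c, p_old, p_new) > target_score for c in candidates
    )
```

Under this reading, a phrase that appears only in the data added by the update tends to gain probability mass in the new snapshot, so it receives a high differential score and a low differential rank, which is what makes the update itself a source of leakage.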
Bio:
Olya Ohrimenko joined the University of Melbourne as a Senior Lecturer in 2020. Prior to that she was a Principal Researcher in the Confidential Computing group at Microsoft Research in Cambridge, UK, which she joined in 2014 as a Postdoctoral Researcher. Her research interests include data privacy, integrity, and security issues that emerge in cloud computing environments and machine learning applications. She has co-organized workshops on privacy-preserving machine learning at security and machine learning venues including ACM CCS and NeurIPS. She currently holds two research grants from Facebook. She was a Research Fellow at Darwin College, University of Cambridge, in 2014 and 2015. Olya received her Ph.D. degree from Brown University in 2013 and a B.CS. (Hons) degree from the University of Melbourne in 2007.
Kyle Hogan is inviting you to a scheduled Zoom meeting.
Topic: CSAIL Security Seminar
Time: This is a recurring meeting. Meet anytime.
Join Zoom Meeting
https://mit.zoom.us/j/97527284254
Password: <3security
One tap mobile
+16465588656,,97527284254# US (New York)
+16699006833,,97527284254# US (San Jose)
Meeting ID: 975 2728 4254
US: +1 646 558 8656 or +1 669 900 6833
International Numbers: https://mit.zoom.us/u/auBvg4NEV
Join by SIP
97527284254@zoomcrc.com
Join by Skype for Business
https://mit.zoom.us/skype/97527284254
Research Areas: AI & Machine Learning, Security & Cryptography
Impact Areas: Cybersecurity
Created by Kyle L Hogan on Tuesday, February 02, 2021 at 4:12 PM.