How to DP-fy ML: A practical guide to machine learning with differential privacy
Machine Learning (ML) models are ubiquitous in real-world applications and are a
constant focus of research. Modern ML models have become more complex, deeper, and …
Anonymization techniques for privacy preserving data publishing: A comprehensive survey
A Majeed, S Lee - IEEE access, 2020 - ieeexplore.ieee.org
Anonymization is a practical solution for preserving users' privacy in data publishing. Data
owners such as hospitals, banks, social network (SN) service providers, and insurance …
Differentially private learning needs better features (or much more data)
We demonstrate that differentially private machine learning has not yet reached its "AlexNet
moment" on many canonical vision tasks: linear models trained on handcrafted features …
Evaluating differentially private machine learning in practice
B Jayaraman, D Evans - 28th USENIX Security Symposium (USENIX …, 2019 - usenix.org
Differential privacy is a strong notion of privacy that can be used to prove formal
guarantees, in terms of a privacy budget, ε, about how much information is leaked by a …
Differential privacy techniques for cyber physical systems: A survey
MU Hassan, MH Rehmani… - … Communications Surveys & …, 2019 - ieeexplore.ieee.org
Modern cyber physical systems (CPSs) have been widely used in our daily lives because of
the development of information and communication technologies (ICT). With the provision of …
Descent-to-delete: Gradient-based methods for machine unlearning
We study the data deletion problem for convex models. By leveraging techniques from
convex optimization and reservoir sampling, we give the first data deletion algorithms that …
The discrete gaussian for differential privacy
CL Canonne, G Kamath… - Advances in Neural …, 2020 - proceedings.neurips.cc
A key tool for building differentially private systems is adding Gaussian noise to the output of
a function evaluated on a sensitive dataset. Unfortunately, using a continuous distribution …
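The snippet above refers to the Gaussian mechanism: noise drawn from a Gaussian distribution is added to a function's output before release, with scale calibrated to the function's sensitivity and the privacy budget. A minimal sketch of the standard continuous version, using the classic sigma calibration sqrt(2 ln(1.25/δ)) · Δ / ε (valid for ε < 1); the paper's point is that finite computers cannot sample a continuous Gaussian exactly, motivating its discrete variant, which is not reproduced here. Function names are illustrative, not from the paper:

```python
import math
import random


def gaussian_sigma(sensitivity, epsilon, delta):
    """Classic noise calibration for the (epsilon, delta)-DP Gaussian
    mechanism (valid for epsilon < 1):
        sigma = sqrt(2 * ln(1.25 / delta)) * sensitivity / epsilon
    """
    return math.sqrt(2.0 * math.log(1.25 / delta)) * sensitivity / epsilon


def gaussian_mechanism(true_value, sensitivity, epsilon, delta, rng=None):
    """Release true_value with zero-mean Gaussian noise added.

    `sensitivity` is the largest change in the underlying function's
    output caused by adding or removing one individual's record.
    """
    rng = rng if rng is not None else random.Random()
    sigma = gaussian_sigma(sensitivity, epsilon, delta)
    return true_value + rng.gauss(0.0, sigma)


# Example: privately release a count query with sensitivity 1.
noisy_count = gaussian_mechanism(1042, sensitivity=1.0,
                                 epsilon=0.5, delta=1e-5)
```

Note that sampling `rng.gauss` uses floating-point arithmetic, which is exactly the gap the discrete Gaussian closes.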
Semi-supervised knowledge transfer for deep learning from private training data
Some machine learning applications involve training data that is sensitive, such as the
medical histories of patients in a clinical trial. A model may inadvertently and implicitly store …
The privacy onion effect: Memorization is relative
Machine learning models trained on private datasets have been shown to leak their
private data. Recent work has found that the average data point is rarely leaked---it is often …
The algorithmic foundations of differential privacy
The problem of privacy-preserving data analysis has a long history spanning multiple
disciplines. As electronic data about individuals becomes increasingly detailed, and as …