How to DP-fy ML: A practical guide to machine learning with differential privacy

N Ponomareva, H Hazimeh, A Kurakin, Z Xu… - Journal of Artificial …, 2023 - jair.org
Machine Learning (ML) models are ubiquitous in real-world applications and are a
constant focus of research. Modern ML models have become more complex, deeper, and …

Anonymization techniques for privacy preserving data publishing: A comprehensive survey

A Majeed, S Lee - IEEE access, 2020 - ieeexplore.ieee.org
Anonymization is a practical solution for preserving users' privacy in data publishing. Data
owners such as hospitals, banks, social network (SN) service providers, and insurance …

Differentially private learning needs better features (or much more data)

F Tramer, D Boneh - arXiv preprint arXiv:2011.11660, 2020 - arxiv.org
We demonstrate that differentially private machine learning has not yet reached its "AlexNet
moment" on many canonical vision tasks: linear models trained on handcrafted features …

Evaluating differentially private machine learning in practice

B Jayaraman, D Evans - 28th USENIX Security Symposium (USENIX …, 2019 - usenix.org
Differential privacy is a strong notion of privacy that can be used to prove formal
guarantees, in terms of a privacy budget, ε, about how much information is leaked by a …

Differential privacy techniques for cyber physical systems: A survey

MU Hassan, MH Rehmani… - … Communications Surveys & …, 2019 - ieeexplore.ieee.org
Modern cyber physical systems (CPSs) have been widely used in our daily lives because of
the development of information and communication technologies (ICT). With the provision of …

Descent-to-delete: Gradient-based methods for machine unlearning

S Neel, A Roth… - Algorithmic Learning …, 2021 - proceedings.mlr.press
We study the data deletion problem for convex models. By leveraging techniques from
convex optimization and reservoir sampling, we give the first data deletion algorithms that …

The discrete Gaussian for differential privacy

CL Canonne, G Kamath… - Advances in Neural …, 2020 - proceedings.neurips.cc
A key tool for building differentially private systems is adding Gaussian noise to the output of
a function evaluated on a sensitive dataset. Unfortunately, using a continuous distribution …
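The mechanism this abstract refers to, adding calibrated Gaussian noise to a sensitive query result, can be sketched as below. This is the classical continuous Gaussian mechanism (the paper's contribution is a discrete variant that avoids floating-point pitfalls); the function name and parameters are illustrative, and the noise scale uses the standard calibration σ = Δ·√(2 ln(1.25/δ))/ε, valid for ε < 1.

```python
import math
import random

def gaussian_mechanism(value, sensitivity, epsilon, delta):
    """Release `value` with (epsilon, delta)-differential privacy.

    Adds Gaussian noise scaled to the query's L2 sensitivity using the
    classical calibration sigma = sensitivity * sqrt(2 ln(1.25/delta)) / epsilon
    (valid for epsilon < 1).
    """
    sigma = sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon
    return value + random.gauss(0.0, sigma)

# Example: privately release a count (sensitivity 1) under (0.5, 1e-5)-DP.
noisy_count = gaussian_mechanism(42.0, sensitivity=1.0, epsilon=0.5, delta=1e-5)
```

Because the noise is drawn from a continuous distribution, a real implementation must contend with finite-precision sampling, which is precisely the issue the discrete Gaussian addresses.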

Semi-supervised knowledge transfer for deep learning from private training data

N Papernot, M Abadi, U Erlingsson… - arXiv preprint arXiv …, 2016 - arxiv.org
Some machine learning applications involve training data that is sensitive, such as the
medical histories of patients in a clinical trial. A model may inadvertently and implicitly store …

The privacy onion effect: Memorization is relative

N Carlini, M Jagielski, C Zhang… - Advances in …, 2022 - proceedings.neurips.cc
Machine learning models trained on private datasets have been shown to leak their
private data. Recent work has found that the average data point is rarely leaked---it is often …

The algorithmic foundations of differential privacy

C Dwork, A Roth - Foundations and Trends® in Theoretical …, 2014 - nowpublishers.com
The problem of privacy-preserving data analysis has a long history spanning multiple
disciplines. As electronic data about individuals becomes increasingly detailed, and as …