When machine learning meets privacy: A survey and outlook

B Liu, M Ding, S Shaham, W Rahayu… - ACM Computing …, 2021 - dl.acm.org
Newly emerging machine learning (e.g., deep learning) methods have become a strong
driving force to revolutionize a wide range of industries, such as smart healthcare, financial …

From distributed machine learning to federated learning: A survey

J Liu, J Huang, Y Zhou, X Li, S Ji, H Xiong… - … and Information Systems, 2022 - Springer
In recent years, data and computing resources have typically been distributed across end-user
devices and across various regions or organizations. Because of laws or regulations, the distributed data …

Deep learning with label differential privacy

B Ghazi, N Golowich, R Kumar… - Advances in neural …, 2021 - proceedings.neurips.cc
The Randomized Response (RR) algorithm is a classical technique to improve
robustness in survey aggregation, and has been widely adopted in applications with …
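
For context, the K-ary randomized response mechanism this snippet refers to can be sketched in a few lines; the function name, class count, and privacy parameter below are illustrative, not taken from the paper.

    import numpy as np

    def randomized_response(label: int, num_classes: int, epsilon: float, rng=None) -> int:
        """K-ary randomized response: keep the true label with probability
        e^eps / (e^eps + K - 1), otherwise report one of the other K - 1 labels
        uniformly at random; this satisfies eps-label differential privacy."""
        rng = rng or np.random.default_rng()
        keep_prob = np.exp(epsilon) / (np.exp(epsilon) + num_classes - 1)
        if rng.random() < keep_prob:
            return label
        others = [c for c in range(num_classes) if c != label]  # remaining labels
        return int(rng.choice(others))

    # Example: privatize labels of a 10-class task at epsilon = 1.0
    noisy_labels = [randomized_response(y, 10, 1.0) for y in [3, 7, 3, 0]]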

Large scale private learning via low-rank reparametrization

D Yu, H Zhang, W Chen, J Yin… - … Conference on Machine …, 2021 - proceedings.mlr.press
We propose a reparametrization scheme to address the challenges of applying differentially
private SGD on large neural networks, which are 1) the huge memory cost of storing …
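
As a rough illustration of the low-rank idea, the sketch below reparametrizes a linear layer so that differentially private SGD only has to store, clip, and perturb per-example gradients of two small factors. Names and sizes are illustrative, and the paper's full reparametrized scheme involves additional machinery (e.g., gradient projection) that is not shown.

    import torch
    import torch.nn as nn

    class LowRankLinear(nn.Module):
        """Linear layer reparametrized as W = U @ V with small rank r, so the
        trainable parameters shrink from d_in * d_out to r * (d_in + d_out)."""
        def __init__(self, d_in: int, d_out: int, rank: int):
            super().__init__()
            self.U = nn.Parameter(torch.randn(d_out, rank) * 0.02)
            self.V = nn.Parameter(torch.randn(rank, d_in) * 0.02)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return x @ (self.U @ self.V).T

    layer = LowRankLinear(d_in=1024, d_out=1024, rank=8)
    out = layer(torch.randn(4, 1024))  # gradients flow only through U and V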

Muter: Machine unlearning on adversarially trained models

J Liu, M Xue, J Lou, X Zhang… - Proceedings of the …, 2023 - openaccess.thecvf.com
Machine unlearning is an emerging task of removing the influence of selected
training datapoints from a trained model upon data deletion requests, which echoes the …

Do not let privacy overbill utility: Gradient embedding perturbation for private learning

D Yu, H Zhang, W Chen, TY Liu - arXiv preprint arXiv:2102.12677, 2021 - arxiv.org
The differential privacy mechanism bounds how much a model can leak about its training
data. However, for meaningful privacy parameters, a differentially private …
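
For reference, the standard differentially private SGD step whose utility cost the snippet alludes to looks roughly as follows. This is a minimal NumPy sketch with illustrative names; the paper's gradient embedding perturbation instead projects gradients into a low-dimensional subspace before adding noise, which is not shown.

    import numpy as np

    def dp_sgd_step(per_example_grads, params, clip_norm, noise_multiplier, lr, rng=None):
        """One vanilla DP-SGD update: clip each example's gradient to clip_norm,
        average, add Gaussian noise calibrated to the clipping bound, then step."""
        rng = rng or np.random.default_rng()
        norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
        clipped = per_example_grads * np.minimum(1.0, clip_norm / (norms + 1e-12))
        noise = rng.normal(0.0, noise_multiplier * clip_norm / len(per_example_grads),
                           size=params.shape)
        return params - lr * (clipped.mean(axis=0) + noise)

    params = np.zeros(5)
    grads = np.random.default_rng(0).normal(size=(32, 5))  # one gradient row per example
    params = dp_sgd_step(grads, params, clip_norm=1.0, noise_multiplier=1.1, lr=0.1)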

Adaptive differential privacy in vertical federated learning for mobility forecasting

FZ Errounda, Y Liu - Future Generation Computer Systems, 2023 - Elsevier
Differential privacy is the de facto technique for protecting individuals in the training
dataset and the learning models in deep learning. However, the technique presents two …

Privacy-preserving collaborative learning with automatic transformation search

W Gao, S Guo, T Zhang, H Qiu… - Proceedings of the …, 2021 - openaccess.thecvf.com
Collaborative learning has gained great popularity due to its benefit of data privacy
protection: participants can jointly train a Deep Learning model without sharing their training …
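
The collaborative pattern described here, participants exchanging model updates rather than raw data, can be sketched as a minimal federated averaging round. The names and the toy least-squares update are illustrative and are not the defense studied in the paper.

    import numpy as np

    def local_update(weights, data, lr=0.1):
        """Each participant refines the global weights on its own data and
        shares only the resulting weights, never the raw training examples."""
        X, y = data
        grad = X.T @ (X @ weights - y) / len(y)   # toy least-squares gradient
        return weights - lr * grad

    def federated_round(global_weights, participants):
        """One round: every participant trains locally; the server averages
        the returned weights without ever seeing the raw training data."""
        local_weights = [local_update(global_weights.copy(), d) for d in participants]
        return np.mean(local_weights, axis=0)

    rng = np.random.default_rng(0)
    participants = [(rng.normal(size=(16, 3)), rng.normal(size=16)) for _ in range(4)]
    w = np.zeros(3)
    for _ in range(10):
        w = federated_round(w, participants)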

Posthoc privacy guarantees for collaborative inference with modified Propose-Test-Release

A Singh, P Vepakomma, V Sharma… - Advances in Neural …, 2023 - proceedings.neurips.cc
Cloud-based machine learning inference is an emerging paradigm where users query by
sending their data through a service provider who runs an ML model on that data and …

Differentially private label protection in split learning

X Yang, J Sun, Y Yao, J Xie, C Wang - arXiv preprint arXiv:2203.02073, 2022 - arxiv.org
Split learning is a distributed training framework that allows multiple parties to jointly train a
machine learning model over vertically partitioned data (partitioned by attributes). The idea …
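
A minimal sketch of the split-learning exchange described here, with an illustrative model and shapes: the non-label party computes activations up to a cut layer, and the label party completes the forward pass and returns gradients of those activations. The label-protection mechanism the paper adds on top of this exchange is not shown.

    import torch
    import torch.nn as nn

    bottom = nn.Sequential(nn.Linear(20, 64), nn.ReLU())   # non-label party
    top = nn.Linear(64, 2)                                  # party holding the labels
    x, y = torch.randn(32, 20), torch.randint(0, 2, (32,))

    smashed = bottom(x)                                # cut-layer activations
    received = smashed.detach().requires_grad_(True)   # what crosses the wire
    loss = nn.functional.cross_entropy(top(received), y)
    loss.backward()                                    # label party's backward pass
    smashed.backward(received.grad)                    # gradient returned to the non-label
                                                       # party (the quantity the paper
                                                       # perturbs to protect labels)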