Clip Body and Tail Separately: High Probability Guarantees for DPSGD with Heavy Tails

H Sha, Y Cao, Y Liu, Y Wu, R Liu, H Chen - arXiv preprint arXiv …, 2024 - arxiv.org
Differentially Private Stochastic Gradient Descent (DPSGD) is widely utilized to preserve
training data privacy in deep learning; it first clips the gradients to a predefined norm …
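
As a rough illustration of the standard DPSGD step this snippet refers to (clip each per-example gradient to a norm bound, then add Gaussian noise before averaging), here is a minimal NumPy sketch; the clipping bound, noise multiplier, and learning rate are assumed placeholder parameters, and the paper above studies refinements of this baseline rather than this exact code.

```python
import numpy as np

def dpsgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
               noise_multiplier=1.0, rng=None):
    """Illustrative DPSGD update (Abadi et al. style): clip each per-example
    gradient to `clip_norm`, sum, add Gaussian noise scaled by
    `noise_multiplier * clip_norm`, then average and take a gradient step."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale the gradient down if its norm exceeds the clipping bound.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    clipped = np.stack(clipped)                                  # (batch, dim)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=params.shape)
    noisy_mean = (clipped.sum(axis=0) + noise) / len(per_example_grads)
    return params - lr * noisy_mean

# Toy usage: 8 per-example gradients of a 5-dimensional model.
rng = np.random.default_rng(1)
params = np.zeros(5)
grads = rng.standard_t(df=2, size=(8, 5))   # heavy-tailed gradients, as in the paper's setting
params = dpsgd_step(params, grads, rng=rng)
```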

Rethinking DP-SGD in Discrete Domain: Exploring Logistic Distribution in the Realm of signSGD

J Jang, S Hwang, HJ Yang - Forty-first International Conference on … - openreview.net
Deep neural networks (DNNs) risk memorizing sensitive data from their training
datasets, inadvertently leading to substantial information leakage through privacy attacks …