Authors
Nicolas Papernot, Patrick McDaniel, Somesh Jha, Matt Fredrikson, Z Berkay Celik, Ananthram Swami
Publication date
2016/3/21
Conference paper
2016 IEEE European Symposium on Security and Privacy (EuroS&P)
Pages
372-387
Publisher
IEEE
Description
Deep learning takes advantage of large datasets and computationally efficient training algorithms to outperform other approaches at various machine learning tasks. However, imperfections in the training phase of deep neural networks make them vulnerable to adversarial samples: inputs crafted by adversaries with the intent of causing deep neural networks to misclassify. In this work, we formalize the space of adversaries against deep neural networks (DNNs) and introduce a novel class of algorithms to craft adversarial samples based on a precise understanding of the mapping between inputs and outputs of DNNs. In an application to computer vision, we show that our algorithms can reliably produce samples correctly classified by human subjects but misclassified in specific targets by a DNN with a 97% adversarial success rate while only modifying on average 4.02% of the input features per sample. We then …
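The abstract does not spell out the attack itself, but its core idea, using the DNN's input-to-output mapping (its forward derivative) to select a small set of input features to perturb toward a chosen target class, can be sketched roughly as below. This is an illustrative PyTorch sketch under stated assumptions, not the authors' reference implementation: the model, the saliency scoring, the unit step size, and the ~4% perturbation budget are assumptions chosen to mirror the numbers quoted in the abstract.

```python
# Hedged sketch of a Jacobian-based targeted misclassification attack.
# Assumes a classifier `model` mapping inputs in [0, 1] to class logits.
import torch

def jacobian(model, x, num_classes):
    """Jacobian of the class scores with respect to the input x."""
    x = x.clone().detach().requires_grad_(True)
    logits = model(x.unsqueeze(0)).squeeze(0)          # shape: (num_classes,)
    rows = []
    for c in range(num_classes):
        grad = torch.autograd.grad(logits[c], x, retain_graph=True)[0]
        rows.append(grad.flatten())
    return torch.stack(rows)                           # (num_classes, num_features)

def saliency_attack(model, x, target, num_classes, max_frac=0.04, step=1.0):
    """Greedily perturb the features most helpful to the target class."""
    x_adv = x.clone().detach().flatten()
    budget = int(max_frac * x_adv.numel())             # cap at ~4% of input features
    changed = set()
    for _ in range(budget):
        J = jacobian(model, x_adv.view_as(x), num_classes)
        target_grad = J[target]
        other_grad = J.sum(dim=0) - target_grad
        # Saliency: features that raise the target score while lowering the rest.
        saliency = torch.where((target_grad > 0) & (other_grad < 0),
                               target_grad * other_grad.abs(),
                               torch.zeros_like(target_grad))
        for i in changed:
            saliency[i] = 0.0                          # perturb each feature at most once
        i = int(saliency.argmax())
        if saliency[i] == 0:
            break                                      # no useful feature left
        x_adv[i] = torch.clamp(x_adv[i] + step, 0.0, 1.0)
        changed.add(i)
        if model(x_adv.view_as(x).unsqueeze(0)).argmax() == target:
            break                                      # target class reached
    return x_adv.view_as(x)
```

The per-feature scoring and greedy loop here stand in for the paper's saliency-map construction; the essential point the abstract makes is that only the input-output gradient information of the trained DNN is needed to find a sparse, targeted perturbation.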
Total citations
[Citations-per-year chart, 2016–2024; per-year counts not recoverable from the page export]
Scholar articles
N Papernot, P McDaniel, S Jha, M Fredrikson, ZB Celik… - 2016 IEEE European symposium on security and …, 2016
N Papernot, PD McDaniel, S Jha, M Fredrikson… - arXiv preprint arXiv:1511.07528, 2015