Soft errors in DNN accelerators: A comprehensive review

Y Ibrahim, H Wang, J Liu, J Wei, L Chen, P Rech… - Microelectronics …, 2020 - Elsevier
Deep learning tasks cover a broad range of domains and an even more extensive range of
applications, from entertainment to extremely safety-critical fields. Thus, Deep Neural …

Robust machine learning systems: Challenges, current trends, perspectives, and the road ahead

M Shafique, M Naseer, T Theocharides… - IEEE Design & …, 2020 - ieeexplore.ieee.org
Currently, machine learning (ML) techniques are at the heart of smart cyber-physical
systems (CPSs) and the Internet-of-Things (IoT). This article discusses various challenges and …

Timing attacks on machine learning: State of the art

M Kianpour, SF Wen - … Systems and Applications: Proceedings of the …, 2020 - Springer
Abstract Machine learning plays a significant role in today's business sectors and
governments, where it is increasingly used as a tool to aid decision making and …

Defending bit-flip attack through DNN weight reconstruction

J Li, AS Rakin, Y Xiong, L Chang, Z He… - 2020 57th ACM/IEEE …, 2020 - ieeexplore.ieee.org
Recent studies show that adversarial attacks on neural network weights, a.k.a. the Bit-Flip Attack
(BFA), can severely degrade a Deep Neural Network's (DNN) prediction accuracy. In this …
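To illustrate why a single bit flip in a stored weight can be so damaging, here is a hypothetical sketch (not the paper's attack or defense method) of flipping the most significant bit of a signed 8-bit quantized weight in two's-complement representation:

```python
def flip_bit_int8(w: int, bit: int) -> int:
    """Flip one bit of a signed 8-bit (two's-complement) value. Illustrative only."""
    u = w & 0xFF                        # reinterpret as an unsigned byte
    u ^= 1 << bit                       # flip the chosen bit
    return u - 256 if u >= 0x80 else u  # convert back to signed

# Flipping the sign bit of a small quantized weight changes it drastically:
print(flip_bit_int8(3, 7))   # 3 (0b00000011) -> -125 (0b10000011)
```

Because quantized weights occupy so few bits, flipping a high-order bit moves the value across most of its representable range, which is why a handful of targeted flips suffices to corrupt inference.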

Neuroattack: Undermining spiking neural networks security through externally triggered bit-flips

V Venceslai, A Marchisio, I Alouani… - … Joint Conference on …, 2020 - ieeexplore.ieee.org
Due to their proven efficiency, machine-learning systems are deployed in a wide range of
complex real-life problems. More specifically, Spiking Neural Networks (SNNs) emerged as …

[PDF] Concurrent weight encoding-based detection for bit-flip attack on neural network accelerators

Q Liu, W Wen, Y Wang - … on Computer-Aided Design (ICCAD), 2020 - par.nsf.gov
Abstract The recently revealed Bit-Flip Attack (BFA) against deep neural networks (DNNs) is
highly concerning, as it can completely mislead the inference of quantized DNNs by only …
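As a generic sketch of the idea behind concurrent integrity checking (a hypothetical checksum scheme, not the paper's actual weight-encoding technique), a golden digest of the deployed weight memory can be recomputed and compared before inference:

```python
import hashlib

def weight_digest(weight_bytes: bytes) -> str:
    """SHA-256 digest of the quantized weight memory. Illustrative only."""
    return hashlib.sha256(weight_bytes).hexdigest()

weights = bytes([3, 17, 250, 42])            # toy quantized weight memory
golden = weight_digest(weights)              # stored at deployment time

tampered = bytes([3 ^ 0x80, 17, 250, 42])    # a single bit flipped by an attack
assert weight_digest(weights) == golden      # clean weights pass the check
assert weight_digest(tampered) != golden     # the bit flip is detected
```

A full cryptographic hash is heavier than what an accelerator-side detector would use; hardware schemes typically trade detection strength for low-overhead encodings, but the pass/fail comparison against a golden value is the same.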

Fadec: A fast decision-based attack for adversarial machine learning

F Khalid, H Ali, MA Hanif, S Rehman… - … Joint Conference on …, 2020 - ieeexplore.ieee.org
Due to the excessive use of cloud-based machine learning (ML) services, smart cyber-physical
systems (CPS) are increasingly becoming vulnerable to black-box attacks on their …

Dependable deep learning: Towards cost-efficient resilience of deep neural network accelerators against soft errors and permanent faults

MA Hanif, M Shafique - … Symposium on On-Line Testing and …, 2020 - ieeexplore.ieee.org
Deep Learning has enabled machines to learn computational models (i.e., Deep Neural
Networks, DNNs) that can perform certain complex tasks with claims to be close to human …

Overview of security for smart cyber-physical systems

F Khalid, S Rehman, M Shafique - Security of Cyber-Physical Systems …, 2020 - Springer
The tremendous growth of interconnectivity and dependencies of physical and cyber
domains in cyber-physical systems (CPS) makes them vulnerable to several security threats …

Facer: A universal framework for detecting anomalous operation of deep neural networks

C Schorn, L Gauerhof - 2020 IEEE 23rd International …, 2020 - ieeexplore.ieee.org
The detection of anomalies during the operation of deep neural networks (DNNs) is of
essential importance in safety-critical applications, such as autonomous vehicles. In the …