An updated survey of efficient hardware architectures for accelerating deep convolutional neural networks
Deep Neural Networks (DNNs) are nowadays a common practice in most of the Artificial
Intelligence (AI) applications. Their ability to go beyond human precision has made these …
Digital twin-supported smart city: Status, challenges and future research directions
A city can be considered a carrier of multiple sources of data and information that are
updated in real time and experiences continuous operation and development. Therefore, a …
Hardware and software optimizations for accelerating deep neural networks: Survey of current trends, challenges, and the road ahead
Currently, Machine Learning (ML) is becoming ubiquitous in everyday life. Deep Learning
(DL) is already present in many applications ranging from computer vision for medicine to …
ALWANN: Automatic layer-wise approximation of deep neural network accelerators without retraining
The state-of-the-art approaches employ approximate computing to reduce the energy
consumption of DNN hardware. Approximate DNNs then require extensive retraining …
Deep learning for edge computing: Current trends, cross-layer optimizations, and open research challenges
In the Machine Learning era, Deep Neural Networks (DNNs) have taken the spotlight, due to
their unmatchable performance in several applications, such as image processing, computer …
A survey on quantum machine learning: Current trends, challenges, opportunities, and the road ahead
K Zaman, A Marchisio, MA Hanif… - arXiv preprint arXiv …, 2023 - arxiv.org
Quantum Computing (QC) claims to improve the efficiency of solving complex problems,
compared to classical computing. When QC is applied to Machine Learning (ML) …
NeuroAttack: Undermining spiking neural networks security through externally triggered bit-flips
V Venceslai, A Marchisio, I Alouani… - … Joint Conference on …, 2020 - ieeexplore.ieee.org
Due to their proven efficiency, machine-learning systems are deployed in a wide range of
complex real-life problems. More specifically, Spiking Neural Networks (SNNs) emerged as …
CANN: Curable approximations for high-performance deep neural network accelerators
Approximate Computing (AC) has emerged as a means for improving the performance, area
and power-/energy-efficiency of a digital design at the cost of output quality degradation …
CompAct: On-chip compression of activations for low power systolic array based CNN acceleration
This paper addresses the design of systolic array (SA) based convolutional neural network
(CNN) accelerators for mobile and embedded domains. On- and off-chip memory accesses …
FEECA: Design space exploration for low-latency and energy-efficient capsule network accelerators
In the past few years, Capsule Networks (CapsNets) have taken the spotlight compared to
traditional convolutional neural networks (CNNs) for image classification. Unlike CNNs …