A survey of deep learning on mobile devices: Applications, optimizations, challenges, and research opportunities

T Zhao, Y Xie, Y Wang, J Cheng, X Guo… - Proceedings of the …, 2022 - ieeexplore.ieee.org
Deep learning (DL) has demonstrated great performance in various applications on
powerful computers and servers. Recently, with the advancement of more powerful mobile …

Deep learning on edge TPUs

Y Sun, AM Kist - arXiv preprint arXiv:2108.13732, 2021 - arxiv.org
Computing at the edge is important in remote settings; however, conventional hardware is
not optimized for running deep neural networks. The Google Edge TPU is an emerging …
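The snippet is cut off, but the workflow it points to is standard: quantize and compile a TFLite model for the Edge TPU, then run it through the Edge TPU delegate. A minimal sketch (not taken from the paper), assuming a compiled model file named model_edgetpu.tflite and an installed libedgetpu runtime:

```python
# Minimal sketch (not from the paper): running an Edge TPU-compiled TFLite model
# via the Edge TPU delegate. Assumes "model_edgetpu.tflite" exists and the
# libedgetpu runtime is installed on the device.
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# Edge TPU models are typically uint8-quantized; feed a dummy frame here.
dummy = np.zeros(input_details["shape"], dtype=input_details["dtype"])
interpreter.set_tensor(input_details["index"], dummy)
interpreter.invoke()
scores = interpreter.get_tensor(output_details["index"])
print(scores.shape)
```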

Anatomy of deep learning image classification and object detection on commercial edge devices: A case study on face mask detection

D Kolosov, V Kelefouras, P Kourtessis, I Mporas - IEEE Access, 2022 - ieeexplore.ieee.org
Developing efficient on-the-edge Deep Learning (DL) applications is a challenging and non-
trivial task, as, first, different DL models need to be explored, each with different trade-offs between …

Towards Efficient Convolutional Neural Network for Embedded Hardware via Multi-Dimensional Pruning

H Kong, D Liu, X Luo, S Huai… - 2023 60th ACM/IEEE …, 2023 - ieeexplore.ieee.org
In this paper, we propose TECO, a multi-dimensional pruning framework to collaboratively
prune the three dimensions (depth, width, and resolution) of convolutional neural networks …
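The abstract is truncated before the method itself, so TECO's actual pruning criteria are not reproduced here. The sketch below only illustrates what "the three dimensions" means in practice: depth (number of blocks), width (channels per layer), and input resolution can all be shrunk, with the multipliers chosen as arbitrary placeholders rather than values from the paper:

```python
# Illustrative sketch only (not the TECO algorithm): shrinking a CNN along
# depth, width, and input resolution at the same time. The multipliers are
# hypothetical; TECO selects them jointly with its own criteria.
import torch
import torch.nn as nn

def build_cnn(depth_mult=1.0, width_mult=1.0, base_blocks=4, base_channels=32):
    blocks = max(1, round(base_blocks * depth_mult))    # depth dimension
    channels = max(8, int(base_channels * width_mult))  # width dimension
    layers, in_ch = [], 3
    for _ in range(blocks):
        layers += [nn.Conv2d(in_ch, channels, 3, padding=1), nn.ReLU()]
        in_ch = channels
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(channels, 10)]
    return nn.Sequential(*layers)

def count_params(model):
    return sum(p.numel() for p in model.parameters())

full = build_cnn(1.0, 1.0)
pruned = build_cnn(depth_mult=0.5, width_mult=0.5)  # fewer blocks, thinner layers
x_full = torch.randn(1, 3, 224, 224)                # original resolution
x_small = torch.randn(1, 3, 160, 160)               # reduced resolution dimension

print(count_params(full), count_params(pruned))
print(full(x_full).shape, pruned(x_small).shape)
```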

Hardware Accelerators for Autonomous Cars: A Review

R Islayem, F Alhosani, R Hashem, A Alzaabi… - arXiv preprint arXiv …, 2024 - arxiv.org
Autonomous Vehicles (AVs) redefine transportation with sophisticated technology,
integrating sensors, cameras, and intricate algorithms. Implementing machine learning in AV …

D-STACK: High Throughput DNN Inference by Effective Multiplexing and Spatio-Temporal Scheduling of GPUs

A Dhakal, SG Kulkarni… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Hardware accelerators such as GPUs are required for real-time, low-latency inference with
Deep Neural Networks (DNNs). Providing inference services in the cloud can be resource …

Stress-Testing USB Accelerators for Efficient Edge Inference

A Van Der Staay, R Fischer… - 2024 IEEE/ACM …, 2024 - ieeexplore.ieee.org
Several manufacturers sell specialized USB devices for accelerating machine learning (ML)
on the edge. While generally promoted as a versatile solution for more efficient edge …

Energy modeling of inference workloads with AI accelerators at the Edge: A benchmarking study

M Kasioulis, M Symeonides, G Ioannou… - 2024 IEEE …, 2024 - ieeexplore.ieee.org
Analyzing and modeling the performance and energy consumption of hybrid Edge
Computing systems with embedded devices and Artificial Intelligence (AI) accelerators is …
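The abstract stops before the model, but the core quantity such benchmarks report is straightforward: energy per inference is the average power drawn during the run multiplied by the inference latency. A minimal sketch with placeholder numbers (not measurements from the paper):

```python
# Illustrative only: energy per inference as average power x latency.
# The power samples and latency below are placeholders, not data from the
# paper; in practice they come from an external power meter and timed
# inference runs on the edge device.
power_samples_w = [4.8, 5.1, 5.0, 4.9, 5.2]  # watts, sampled during inference
latency_s = 0.012                             # seconds per inference

avg_power_w = sum(power_samples_w) / len(power_samples_w)
energy_per_inference_j = avg_power_w * latency_s
print(f"{energy_per_inference_j * 1000:.2f} mJ per inference")
```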

On Hardware-Aware Design and Optimization of Edge Intelligence

S Huai, H Kong, X Luo, D Liu… - IEEE Design & …, 2023 - ieeexplore.ieee.org

Deep Learning Techniques in Big Data-Enabled Internet-of-Things Devices

S Singh, S Sharma, S Bhadula - … Data Analytics in Fog-Enabled IoT …, 2023 - taylorfrancis.com
Due to the development of various tools and deep learning (DL) techniques that can help
analyze Internet of Things (IoT) big data, the integration of the IoT with DL has …