Hardware-aware approach to deep neural network optimization

H Li, L Meng - Neurocomputing, 2023 - Elsevier
Deep neural networks (DNNs) have been a pivotal technology in a myriad of fields, boasting
remarkable achievements. Nevertheless, their substantial workload and inherent …

PRF: deep neural network compression by systematic pruning of redundant filters

CH Sarvani, M Ghorai, SHS Basha - Neural Computing and Applications, 2024 - Springer
In deep neural networks, the filters of convolutional layers play an important role in
extracting the features from the input. Redundant filters often extract similar features, leading …
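
The idea that redundant filters extract near-duplicate features suggests a simple diagnostic. The sketch below is a generic illustration, not the PRF procedure from the paper: it flattens each filter of a convolutional layer and flags pairs with high cosine similarity as pruning candidates; the 0.9 threshold and the layer shape are arbitrary choices.

```python
# Minimal sketch (not the paper's PRF algorithm): quantify redundancy among
# convolutional filters via pairwise cosine similarity of flattened weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

conv = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=3)

# Flatten each of the 32 filters into a vector: shape (32, 16*3*3).
weights = conv.weight.detach().flatten(start_dim=1)

# Pairwise cosine similarity between filters: shape (32, 32).
normed = F.normalize(weights, dim=1)
similarity = normed @ normed.T

# Mark a filter as redundant if it is highly similar to an earlier one
# (the 0.9 threshold is illustrative, not taken from the paper).
redundant = [
    i for i in range(similarity.size(0))
    if any(similarity[i, j] > 0.9 for j in range(i))
]
print(f"Candidate redundant filters: {redundant}")
```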

Attention-based adaptive structured continuous sparse network pruning

J Liu, W Liu, Y Li, J Hu, S Cheng, W Yang - Neurocomputing, 2024 - Elsevier
Deep neural network models, especially CNNs, have a wide range of applications in many
fields, but their high computational power requirements limit their deployment in …

Empirical evaluation of filter pruning methods for acceleration of convolutional neural network

D Kumar, MA Mehta, VC Joshi, RS Oza… - Multimedia Tools and …, 2024 - Springer
Training and inference of deep convolutional neural networks are usually slow due to the
depth of the network and the number of parameters in the network. Although high …
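
Empirical comparisons of filter pruning methods commonly include the classic L1-norm criterion as a baseline. The sketch below shows that criterion in PyTorch as an assumed baseline for illustration, not necessarily one of the methods evaluated in the paper: filters with the smallest L1 norms are dropped and the layer is rebuilt with the surviving channels.

```python
# Minimal sketch of L1-norm filter pruning, a classic criterion often used
# as a baseline in empirical comparisons (illustrative, not the paper's setup).
import torch
import torch.nn as nn

def l1_filter_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Score each output filter by the L1 norm of its weights."""
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

def keep_top_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> torch.Tensor:
    """Return sorted indices of the filters to keep (highest L1 norm)."""
    scores = l1_filter_scores(conv)
    n_keep = max(1, int(keep_ratio * scores.numel()))
    return torch.topk(scores, n_keep).indices.sort().values

conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)
kept = keep_top_filters(conv, keep_ratio=0.5)

# Rebuilding the pruned layer amounts to slicing the weight and bias tensors
# along the output-channel dimension and copying the kept filters across.
pruned = nn.Conv2d(64, kept.numel(), kernel_size=3, padding=1)
pruned.weight.data = conv.weight.data[kept].clone()
pruned.bias.data = conv.bias.data[kept].clone()
```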

Green AI‐Driven Concept for the Development of Cost‐Effective and Energy‐Efficient Deep Learning Method: Application in the Detection of Eimeria Parasites as a …

SS Acmali, Y Ortakci, H Seker - Advanced Intelligent Systems, 2024 - Wiley Online Library
Although large‐scale pretrained convolutional neural network (CNN) models have shown
impressive transfer learning capabilities, they come with drawbacks such as high energy …

Optimization and deployment of DNNs for RISC-V-based edge AI

Z Su, Q Li, H Kaneko, H Li… - 2024 IEEE International …, 2024 - ieeexplore.ieee.org
Deploying Deep Neural Networks (DNNs) on edge devices to handle artificial intelligence
(AI) tasks is increasingly important, but this is often limited by the computational and energy …

On the ideal number of groups for isometric gradient propagation

BJ Kim, H Choi, H Jang, SW Kim - Neurocomputing, 2024 - Elsevier
Recently, various normalization layers have been proposed to stabilize the training of deep
neural networks. Among them, group normalization is a generalization of layer normalization …
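
The claim that group normalization generalizes layer normalization can be checked directly: with a single group it normalizes over all channels and spatial positions of a sample, matching layer normalization over (C, H, W) at default affine parameters, and with one group per channel it reduces to instance normalization. The snippet below verifies both limits in PyTorch (the tensor shape is arbitrary).

```python
# Quick check that GroupNorm with num_groups=1 coincides with LayerNorm over
# (C, H, W) at default (identity) affine parameters, and with num_groups=C
# it reduces to instance normalization.
import torch
import torch.nn as nn

x = torch.randn(8, 32, 16, 16)  # (N, C, H, W)

group_as_layer = nn.GroupNorm(num_groups=1, num_channels=32)
layer = nn.LayerNorm(normalized_shape=[32, 16, 16])
print(torch.allclose(group_as_layer(x), layer(x), atol=1e-5))  # True

group_as_instance = nn.GroupNorm(num_groups=32, num_channels=32)
instance = nn.InstanceNorm2d(32, affine=False)
print(torch.allclose(group_as_instance(x), instance(x), atol=1e-5))  # True
```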

IDESF: An Edge Computing-Oriented Filter Pruning Method Based on Indirect and Direct Evaluation Space Fusion

Z Chen, A Chen, X Tang, H Ke, Z Jiang - Available at SSRN 4423326 - papers.ssrn.com
At present, edge computing has attracted widespread attention because of its potential to
overcome the problems of high latency and high network occupancy in cloud computing, but …