A reconfigurable multi-precision quantization-aware nonlinear activation function hardware module for DNNs

Q Hong, Z Liu, Q Long, H Tong, T Zhang, X Zhu… - Microelectronics …, 2024 - Elsevier
In recent years, the increasing variety of nonlinear activation functions (NAFs) in deep neural
networks (DNNs) has led to higher computational demands. However, hardware …

A high performance reconfigurable hardware architecture for lightweight convolutional neural network

F An, L Wang, X Zhou - Electronics, 2023 - mdpi.com
Since the lightweight convolutional neural network EfficientNet was proposed by Google in
2019, this series of models has quickly become very popular due to its superior …

Low-area architecture design of multi-mode activation functions with controllable maximum absolute error for neural network applications

SY Lin, JC Chiang - Microprocessors and Microsystems, 2023 - Elsevier
In the development of neural networks (NNs), the activation function has become increasingly
important. The choice of activation function indirectly affects the convergence …
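The "controllable maximum absolute error" idea named in this entry's title can be illustrated with a generic error-bounded segmentation scheme: split the input range into piecewise-linear segments until every chord meets a user-chosen error budget. This is a minimal sketch of that general technique, not the paper's architecture; the function (`tanh`), range, and error budget are assumptions for illustration.

```python
import math

def chord_error(f, lo, hi, n=128):
    # Max abs error of the chord (linear interpolant through the
    # endpoints) against f, sampled at n+1 points on [lo, hi].
    a = (f(hi) - f(lo)) / (hi - lo)
    b = f(lo) - a * lo
    return max(
        abs(f(lo + (hi - lo) * i / n) - (a * (lo + (hi - lo) * i / n) + b))
        for i in range(n + 1)
    )

def segment(f, lo, hi, eps):
    # Recursively bisect until each segment's chord meets the budget eps,
    # yielding a PWL table whose maximum absolute error is controllable.
    if chord_error(f, lo, hi) <= eps:
        return [(lo, hi)]
    mid = (lo + hi) / 2
    return segment(f, lo, mid, eps) + segment(f, mid, hi, eps)

# Example: tanh on [-4, 4] with a 1e-3 error budget (hypothetical numbers).
segs = segment(math.tanh, -4.0, 4.0, 1e-3)
```

Tightening `eps` grows the table, so the segment count is the knob that trades lookup-table area against worst-case accuracy, which is the trade-off such low-area designs target.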

HENCE: Hardware End-to-End Neural Conditional Entropy Encoder for Lossless 3D Medical Image Compression

J Chen, Q Chen, H Zhang, W Chen, W Luo, F Yu - IEEE Access, 2024 - ieeexplore.ieee.org
Recently, learning-based lossless compression methods for volumetric medical images
have attracted much attention. They can achieve higher compression ratios than traditional …

Efficient Nonlinear Function Approximation in Analog Resistive Crossbars for Recurrent Neural Networks

J Yang, R Mao, M Jiang, Y Cheng, PSV Sun… - arXiv preprint arXiv …, 2024 - arxiv.org
Analog In-memory Computing (IMC) has demonstrated energy-efficient and low-latency
implementation of convolution and fully-connected layers in deep neural networks (DNNs) by …

ReAFM: A reconfigurable nonlinear activation function module for neural networks

X Wu, S Liang, M Wang, Z Wang - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Deep neural networks (DNNs) with various nonlinear activation functions (NAFs) have
achieved unprecedented successes, sparking interest in efficient DNN hardware …

A survey on FPGA-based accelerator for ML models

F Yan, A Koch, O Sinnen - arXiv preprint arXiv:2412.15666, 2024 - arxiv.org
This paper thoroughly surveys the acceleration of machine learning (ML) algorithms in hardware
accelerators, focusing on Field-Programmable Gate Arrays (FPGAs). It reviews 287 out of …

GEBA: Gradient-Error-Based Approximation of Activation Functions

C Ye, DS Jeong - IEEE Journal on Emerging and Selected …, 2023 - ieeexplore.ieee.org
Computing-in-memory (CIM) macros aimed at accelerating deep learning operations at low
power need activation function (AF) units on the same die to reduce their host-dependency …

Hardware-Friendly Activation Function Designs and Its Efficient VLSI Implementations for Transformer-Based Applications

YH Huang, PH Kuo, JD Huang - 2023 IEEE 5th International …, 2023 - ieeexplore.ieee.org
The activation function is one of the key elements in modern machine learning algorithms.
However, some broadly used activation functions are exceptionally complex, e.g., GELU in …

Precision and Power Efficient Piece-Wise-Linear Implementation of Transcendental Functions

R Kanish, OG Ratnaparkhi… - 2024 27th Euromicro …, 2024 - ieeexplore.ieee.org
The proposed piece-wise-linear (PWL) method utilizes the Method of Least Squares to
implement transcendental functions such as Sigmoid and Hyperbolic-Tangent with control …
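The least-squares PWL construction this entry describes can be sketched as follows: sample each segment, fit a line `y = a*x + b` by closed-form least squares, and evaluate the sigmoid from the resulting table. This is an illustrative sketch under assumed parameters (16 segments over [-8, 8]), not the paper's implementation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def fit_segment(f, lo, hi, n=64):
    # Closed-form least-squares line fit y = a*x + b over n samples.
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    ys = [f(x) for x in xs]
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def build_pwl(f, lo, hi, n_seg):
    # One (lo, hi, a, b) record per uniform segment.
    bounds = [lo + (hi - lo) * i / n_seg for i in range(n_seg + 1)]
    return [(bounds[i], bounds[i + 1], *fit_segment(f, bounds[i], bounds[i + 1]))
            for i in range(n_seg)]

def pwl_eval(segs, x):
    # Saturate outside the table range, then select the covering segment.
    x = min(max(x, segs[0][0]), segs[-1][1])
    for lo, hi, a, b in segs:
        if x <= hi:
            return a * x + b

segs = build_pwl(sigmoid, -8.0, 8.0, 16)  # assumed segment count and range
```

In hardware, the per-segment `(a, b)` pairs live in a small ROM indexed by the high bits of `x`, so evaluation reduces to one multiply and one add per output, which is what makes PWL schemes attractive for sigmoid and tanh.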