Rethinking the performance comparison between SNNs and ANNs

L Deng, Y Wu, X Hu, L Liang, Y Ding, G Li, G Zhao, P Li… - Neural networks, 2020 - Elsevier
Artificial neural networks (ANNs), a popular path towards artificial intelligence, have
experienced remarkable success via mature models, various benchmarks, open-source …

Rectified linear postsynaptic potential function for backpropagation in deep spiking neural networks

M Zhang, J Wang, J Wu, A Belatreche… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Spiking neural networks (SNNs) use spatiotemporal spike patterns to represent and transmit
information, which are not only biologically realistic but also suitable for ultra-low-power …

Signed Neuron with Memory: Towards Simple, Accurate and High-Efficient ANN-SNN Conversion

Y Wang, M Zhang, Y Chen, H Qu - IJCAI, 2022 - ijcai.org
Spiking Neural Networks (SNNs) are receiving increasing attention due to their
biological plausibility and the potential for ultra-low-power event-driven neuromorphic …

Distracted driver detection by combining in-vehicle and image data using deep learning

F Omerustaoglu, CO Sakar, G Kar - Applied Soft Computing, 2020 - Elsevier
Distracted driving is among the leading causes of traffic accidents today. Recently,
there has been increasing interest in building driver assistance systems that detect the actions of …

Supervised learning in multilayer spiking neural networks with spike temporal error backpropagation

X Luo, H Qu, Y Wang, Z Yi, J Zhang… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Brain-inspired spiking neural networks (SNNs) offer the advantages of lower power
consumption and powerful computing capability. However, the lack of effective learning …

FedMed: A federated learning framework for language modeling

X Wu, Z Liang, J Wang - Sensors, 2020 - mdpi.com
Federated learning (FL) is a privacy-preserving technique for training on vast amounts of
decentralized data and making inferences on mobile devices. As a typical language …

Innovative BERT-based reranking language models for speech recognition

SH Chiu, B Chen - 2021 IEEE Spoken Language Technology …, 2021 - ieeexplore.ieee.org
More recently, Bidirectional Encoder Representations from Transformers (BERT) was
proposed and has achieved impressive success on many natural language processing …

Bayesian neural network language modeling for speech recognition

B Xue, S Hu, J Xu, M Geng, X Liu… - IEEE/ACM Transactions …, 2022 - ieeexplore.ieee.org
State-of-the-art neural network language models (NNLMs) represented by long short-term
memory recurrent neural networks (LSTM-RNNs) and Transformers are becoming highly …

A novel hybrid deep learning model for sugar price forecasting based on time series decomposition

J Zhang, Y Meng, J Wei, J Chen… - … Problems in Engineering, 2021 - Wiley Online Library
Sugar price forecasting has attracted extensive attention from policymakers due to its
significant impact on people's daily lives and markets. In this paper, we present a novel …

Bayesian transformer language models for speech recognition

B Xue, J Yu, J Xu, S Liu, S Hu, Z Ye… - ICASSP 2021-2021 …, 2021 - ieeexplore.ieee.org
State-of-the-art neural language models (LMs) represented by Transformers are highly
complex. Their use of fixed, deterministic parameter estimates fails to account for model …