Near lossless transfer learning for spiking neural networks

Z Yan, J Zhou, WF Wong - Proceedings of the AAAI conference on …, 2021 - ojs.aaai.org
Abstract
Spiking neural networks (SNNs) significantly reduce energy consumption by replacing weight multiplications with additions. This makes SNNs suitable for energy-constrained platforms. However, due to their discrete activations, training SNNs remains a challenge. A popular approach is to first train an equivalent CNN using traditional backpropagation, and then transfer the weights to the intended SNN. Unfortunately, this often results in significant accuracy loss, especially in deeper networks. In this paper, we propose CQ training (Clamped and Quantized training), an SNN-compatible CNN training algorithm that uses clamping and quantization to achieve near-zero conversion accuracy loss. Essentially, CNN training under CQ training accounts for certain SNN characteristics. Using a 7-layer VGG-* and a 21-layer VGG-19 on the CIFAR-10 dataset, we achieved 94.16% and 93.44% accuracy in the respective equivalent SNNs, outperforming all comparable works known to us. We also demonstrate low-precision weight compatibility for the VGG-19 structure: without retraining, accuracies of 93.43% and 92.82% were achieved using quantized 9-bit and 8-bit weights, respectively. The framework was developed in PyTorch and is publicly available.
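The clamp-and-quantize idea from the abstract can be illustrated with a minimal sketch: activations are clamped to a bounded range and rounded to a fixed number of discrete levels, so the CNN only ever produces values an SNN can represent as spike counts over a fixed number of timesteps. The level count `T` and the `[0, 1]` range below are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch of a clamp-and-quantize activation. An SNN neuron firing
# over T timesteps can emit at most T spikes, so its "rate" takes values
# k/T for k in 0..T. Clamping and quantizing the CNN activation to this
# grid keeps the trained CNN consistent with its SNN counterpart.

def cq_activation(x, T=16):
    """Clamp x to [0, 1], then round to the nearest multiple of 1/T."""
    clamped = min(max(x, 0.0), 1.0)   # clamp step: bound the activation
    return round(clamped * T) / T     # quantize step: snap to T levels

print(cq_activation(0.33))   # -> 0.3125 (5/16 with T=16)
print(cq_activation(-0.5))   # -> 0.0   (clamped from below)
print(cq_activation(2.0))    # -> 1.0   (clamped from above)
```

In actual training one would apply such a function to layer outputs inside a PyTorch model, typically with a straight-through gradient so backpropagation can pass through the rounding step.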