Normalization techniques in training DNNs: Methodology, analysis and application

L Huang, J Qin, Y Zhou, F Zhu, L Liu… - IEEE transactions on …, 2023 - ieeexplore.ieee.org
Normalization techniques are essential for accelerating the training and improving the
generalization of deep neural networks (DNNs), and have successfully been used in various …

Deep learning in neural networks: An overview

J Schmidhuber - Neural networks, 2015 - Elsevier
In recent years, deep artificial neural networks (including recurrent ones) have won
numerous contests in pattern recognition and machine learning. This historical survey …

Research progress in deep learning

J Liu, Y Liu, X Luo - Application Research of Computers …, 2014 - researchgate.net
Given the importance of deep learning, this paper reviews its research progress. It first outlines the advantages of deep learning, which motivates its adoption; it then describes three typical deep learning models …

Fast underwater image enhancement for improved visual perception

MJ Islam, Y Xia, J Sattar - IEEE Robotics and Automation …, 2020 - ieeexplore.ieee.org
In this letter, we present a conditional generative adversarial network-based model for real-
time underwater image enhancement. To supervise the adversarial training, we formulate an …

Weight normalization: A simple reparameterization to accelerate training of deep neural networks

T Salimans, DP Kingma - Advances in neural information …, 2016 - proceedings.neurips.cc
We present weight normalization: a reparameterization of the weight vectors in a neural
network that decouples the length of those weight vectors from their direction. By …
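
The decoupling the abstract describes reparameterizes each weight vector as w = (g / ||v||) v, with a scalar length g and a direction given by v / ||v||. A minimal NumPy sketch of a linear layer under this parameterization (variable names are illustrative, not from the paper):

```python
import numpy as np

def weight_norm_forward(v, g, x):
    """Forward pass of a linear layer with weight normalization.

    The weight vector is reparameterized as w = (g / ||v||) * v,
    so its length (g) and direction (v / ||v||) are learned separately.
    """
    w = (g / np.linalg.norm(v)) * v   # decouple length from direction
    return x @ w                      # ordinary linear layer using w

# toy usage: batch of 3 examples, 5 input features
rng = np.random.default_rng(0)
v = rng.normal(size=5)   # direction parameter (unnormalized)
g = 1.0                  # length parameter (scalar)
x = rng.normal(size=(3, 5))
print(weight_norm_forward(v, g, x))
```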

Training very deep networks

RK Srivastava, K Greff… - Advances in neural …, 2015 - proceedings.neurips.cc
Theoretical and empirical evidence indicates that the depth of neural networks is crucial for
their success. However, training becomes more difficult as depth increases, and training of …

Highway networks

RK Srivastava, K Greff, J Schmidhuber - arXiv preprint arXiv:1505.00387, 2015 - arxiv.org
There is plenty of theoretical and empirical evidence that depth of neural networks is a
crucial ingredient for their success. However, network training becomes more difficult with …
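
The paper's answer to this training difficulty is the gated "highway" layer, y = H(x)·T(x) + x·(1 − T(x)), where T is a sigmoid transform gate that decides how much of the input is carried through unchanged. A minimal NumPy sketch of one such layer (weight and bias names are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway_layer(x, W_h, b_h, W_t, b_t):
    """One highway layer: blend a transformed input H(x) with the raw
    input x using a learned transform gate T(x) in [0, 1]."""
    h = np.tanh(x @ W_h + b_h)      # candidate transformation H(x)
    t = sigmoid(x @ W_t + b_t)      # transform gate T(x)
    return h * t + x * (1.0 - t)    # carry the rest of x through unchanged

# toy usage: batch of 2, 4 units (input and output dims match by design)
rng = np.random.default_rng(2)
x = rng.normal(size=(2, 4))
W_h, W_t = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
b_h, b_t = np.zeros(4), np.full(4, -1.0)  # negative gate bias favors carrying x early in training
print(highway_layer(x, W_h, b_h, W_t, b_t))
```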

Batch normalization: Accelerating deep network training by reducing internal covariate shift

S Ioffe, C Szegedy - arXiv preprint arXiv:1502.03167, 2015 - asvk.cs.msu.ru
Training Deep Neural Networks is complicated by the fact that the distribution of
each layer's inputs changes during training, as the parameters of the previous layers …
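
The remedy the paper proposes is to standardize each layer's inputs over the current mini-batch and then re-scale and re-shift them with learned parameters. A minimal NumPy sketch of the training-time transform (gamma, beta, and eps follow the usual convention; names are illustrative):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Training-time batch normalization over a mini-batch.

    x: (batch, features). Each feature is standardized using the
    mini-batch mean/variance, then scaled by gamma and shifted by beta.
    """
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # standardized activations
    return gamma * x_hat + beta            # learned scale and shift

# toy usage: batch of 4 examples, 3 features
rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=(4, 3))
out = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0), out.std(axis=0))   # ~0 mean, ~1 std per feature
```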

Delving deep into rectifiers: Surpassing human-level performance on imagenet classification

K He, X Zhang, S Ren, J Sun - Proceedings of the IEEE …, 2015 - openaccess.thecvf.com
Rectified activation units (rectifiers) are essential for state-of-the-art neural networks. In this
work, we study rectifier neural networks for image classification from two aspects. First, we …
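
One of the two aspects studied in the paper is a learnable rectifier, the Parametric ReLU (PReLU), whose negative-part slope is trained rather than fixed at zero. A minimal NumPy sketch (the slope value here is illustrative):

```python
import numpy as np

def prelu(x, a):
    """Parametric ReLU: identity for x > 0, learned slope `a` for x <= 0.
    In the paper, `a` is a trainable parameter (optionally one per channel)."""
    return np.where(x > 0, x, a * x)

print(prelu(np.array([-2.0, -0.5, 0.0, 1.5]), a=0.25))  # [-0.5 -0.125 0. 1.5]
```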

Design principles for Industrie 4.0 scenarios

M Hermann, T Pentek, B Otto - 2016 49th Hawaii international …, 2016 - ieeexplore.ieee.org
The increasing integration of the Internet of Everything into the industrial value chain has
built the foundation for the next industrial revolution called Industrie 4.0. Although Industrie …