Sparsity through evolutionary pruning prevents neuronal networks from overfitting
Modern machine learning techniques take advantage of the exponentially growing computational power of new-generation processor units. Thus, the number of parameters that are trained …
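The snippet breaks off before the method itself; as a loose, hypothetical sketch of what evolutionary pruning can look like (the fitness trade-off, names, and toy data below are assumptions, not the paper's code), one can evolve binary weight masks over a pre-trained model, rewarding low validation error and few active connections:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends on only 3 of 20 features, so sparse masks can win.
X = rng.normal(size=(200, 20))
w_true = np.zeros(20)
w_true[:3] = 1.0
y = X @ w_true + 0.1 * rng.normal(size=200)

w_dense = np.linalg.lstsq(X, y, rcond=None)[0]  # stands in for a pre-trained model

def fitness(mask):
    # Reward low error and penalise the number of surviving connections.
    err = np.mean((X @ (w_dense * mask) - y) ** 2)
    return -err - 0.01 * mask.sum()

pop = rng.integers(0, 2, size=(30, 20))           # population of binary masks
for gen in range(50):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]        # keep the fittest masks
    children = parents[rng.integers(0, 10, size=20)].copy()
    flips = rng.random(children.shape) < 0.05      # mutate: flip 5% of bits
    children[flips] ^= 1
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("active weights:", best.sum(), "of", best.size)
```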
Learning sparse networks using targeted dropout
Neural networks are easier to optimise when they have many more weights than are
required for modelling the mapping from inputs to outputs. This suggests a two-stage …
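A minimal sketch of the targeted-dropout idea, under the assumption that "likely to be pruned" means smallest magnitude (the function and hyperparameter names below are illustrative, not the paper's):

```python
import numpy as np

def targeted_dropout(w, targ_frac=0.5, drop_prob=0.5, rng=np.random.default_rng()):
    """Apply dropout only among the targ_frac smallest-magnitude weights,
    so the network learns not to rely on weights that pruning would remove."""
    flat = np.abs(w).ravel()
    k = int(targ_frac * flat.size)
    threshold = np.partition(flat, k)[k]           # magnitude cutoff
    targeted = np.abs(w) < threshold               # candidates for pruning
    dropped = targeted & (rng.random(w.shape) < drop_prob)
    return np.where(dropped, 0.0, w)

w = np.random.default_rng(1).normal(size=(4, 4))
print(targeted_dropout(w))
```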
Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
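The simplest scheme such surveys cover is one-shot global magnitude pruning, in which a single threshold is applied across all layers at once; a baseline sketch (the function and threshold choice are illustrative, not from the survey):

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the `sparsity` fraction of smallest-magnitude weights,
    using one global threshold across all layers."""
    all_mags = np.concatenate([np.abs(w).ravel() for w in weights])
    threshold = np.quantile(all_mags, sparsity)
    return [np.where(np.abs(w) < threshold, 0.0, w) for w in weights]

layers = [np.random.default_rng(0).normal(size=s) for s in [(8, 8), (8, 2)]]
pruned = magnitude_prune(layers, sparsity=0.9)
print([float((w == 0).mean()) for w in pruned])   # per-layer sparsity achieved
```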
Pruning neural networks: is it time to nip it in the bud?
Pruning is a popular technique for compressing a neural network: a large pre-trained
network is fine-tuned while connections are successively removed. However, the value of …
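A schematic version of that prune-while-fine-tuning loop (the `train_step` stub stands in for real gradient updates, and the schedule values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(16, 16))      # stands in for a pre-trained layer
mask = np.ones_like(w)

def train_step(w, mask):
    # Stub for one fine-tuning step; real code would apply a gradient update
    # to the surviving (masked) weights.
    return (w - 0.01 * rng.normal(size=w.shape)) * mask

for round_ in range(5):                     # alternate fine-tuning and pruning
    for _ in range(100):
        w = train_step(w, mask)
    mags = np.abs(w[mask == 1])
    cutoff = np.quantile(mags, 0.3)         # drop 30% of surviving connections
    mask[np.abs(w) < cutoff] = 0
    w *= mask
    print(f"round {round_}: {int(mask.sum())} weights remain")
```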
The lottery ticket hypothesis: Finding sparse, trainable neural networks
Neural network pruning techniques can reduce the parameter counts of trained networks by
over 90%, decreasing storage requirements and improving computational performance of …
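The recipe the paper tests: train the dense network, prune by magnitude, rewind the surviving weights to their original initialization, and retrain. A toy outline (the `train` stub is a stand-in, not a real optimiser):

```python
import numpy as np

rng = np.random.default_rng(0)
w_init = rng.normal(size=(32, 32))         # save the initialization
mask = np.ones_like(w_init)

def train(w, mask, steps=200):
    # Stub: real code would run SGD on a task; here we just perturb weights.
    for _ in range(steps):
        w = (w + 0.01 * rng.normal(size=w.shape)) * mask
    return w

w = train(w_init.copy(), mask)                      # 1. train the dense network
cutoff = np.quantile(np.abs(w), 0.9)                # 2. prune 90% by magnitude
mask = (np.abs(w) >= cutoff).astype(float)
w_ticket = train(w_init * mask, mask)               # 3. rewind to init, retrain
print("winning-ticket density:", float(mask.mean()))
```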
Lassonet: Neural networks with feature sparsity
I Lemhadri, F Ruan… - … conference on artificial …, 2021 - proceedings.mlr.press
Much work has been done recently to make neural networks more interpretable, and one
approach is to arrange for the network to use only a subset of the available features. In linear …
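LassoNet itself ties an L1-penalised linear skip connection to the first hidden layer through a hierarchy constraint; the sketch below shows only the simpler related mechanic of a group-lasso proximal step on the input-layer weights, which zeroes entire feature rows. It is a simplification, not the paper's algorithm:

```python
import numpy as np

def group_soft_threshold(W, lam):
    """Group-lasso proximal step: shrink each input feature's weight row
    jointly, so weakly used features are dropped as whole rows."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return W * scale

# After each gradient step on the first-layer weights W1, apply the prox:
W1 = np.random.default_rng(0).normal(size=(10, 5))
W1 = group_soft_threshold(W1, lam=2.0)
print("inactive features:", int((np.abs(W1).sum(axis=1) == 0).sum()))
```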
Pruning artificial neural networks using neural complexity measures
TD Jorgensen, BP Haynes… - International Journal of …, 2008 - World Scientific
This paper describes a new method for pruning artificial neural networks, using a measure of the network's neural complexity. This measure is used to determine the …
Powerpropagation: A sparsity inducing weight reparameterisation
The training of sparse neural networks is becoming an increasingly important tool for reducing the computational footprint of models at training and evaluation, as well as enabling …
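Powerpropagation reparameterises each weight as w = theta * |theta|^(alpha - 1), so the gradient with respect to theta is scaled by the weight's own magnitude and small weights are pushed toward exact zero. A toy least-squares demonstration of that reparameterisation (the data and step size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
w_true = np.zeros(10)
w_true[:2] = 2.0
y = X @ w_true

alpha = 2.0                         # power > 1 biases training toward sparsity
theta = rng.normal(size=10) * 0.5

for _ in range(2000):
    w = np.sign(theta) * np.abs(theta) ** alpha       # reparameterised weight
    grad_w = X.T @ (X @ w - y) / len(y)               # loss gradient w.r.t. w
    grad_theta = grad_w * alpha * np.abs(theta) ** (alpha - 1)  # chain rule
    theta -= 0.01 * grad_theta

w = np.sign(theta) * np.abs(theta) ** alpha
print(np.round(w, 3))               # small weights end up very close to zero
```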
Frivolous units: Wider networks are not really that wide
A remarkable characteristic of overparameterized deep neural networks (DNNs) is that their
accuracy does not degrade when the network width is increased. Recent evidence suggests …
Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science
Through the success of deep learning in various domains, artificial neural networks are
currently among the most used artificial intelligence methods. Taking inspiration from the …
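The method this describes (sparse evolutionary training) starts from a random sparse topology and, each epoch, removes the smallest-magnitude connections and regrows the same number at random positions. A minimal sketch of that prune-and-regrow step (sizes and rates are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
shape, density = (32, 32), 0.1

mask = (rng.random(shape) < density).astype(float)   # sparse random topology
w = rng.normal(size=shape) * mask

def evolve_connectivity(w, mask, zeta=0.3, rng=rng):
    """One SET-style step: drop the smallest active weights, then regrow
    the same number of connections at random inactive positions."""
    active = np.flatnonzero(mask)
    n_drop = int(zeta * active.size)
    # Remove the zeta fraction of active connections with smallest magnitude.
    drop = active[np.argsort(np.abs(w.ravel()[active]))[:n_drop]]
    mask.ravel()[drop] = 0
    # Regrow the same number of connections at random inactive positions.
    inactive = np.flatnonzero(mask.ravel() == 0)
    grow = rng.choice(inactive, size=n_drop, replace=False)
    mask.ravel()[grow] = 1
    w = w * mask
    w.ravel()[grow] = 0.01 * rng.normal(size=n_drop)  # fresh small weights
    return w, mask

w, mask = evolve_connectivity(w, mask)
print("density after evolution:", float(mask.mean()))
```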