Model complexity of deep learning: A survey
Model complexity is a fundamental problem in deep learning. In this paper, we
conduct a systematic overview of the latest studies on model complexity in deep learning …
Mish: A self regularized non-monotonic activation function
D Misra - arXiv preprint arXiv:1908.08681, 2019 - arxiv.org
We propose $\textit{Mish}$, a novel self-regularized non-monotonic activation function
which can be mathematically defined as: $f(x) = x\tanh(\mathrm{softplus}(x))$. As activation …
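The formula above is fully specified by the snippet; a minimal stdlib Python sketch of the function as defined (an illustration, not the authors' reference implementation):

```python
import math

def softplus(x):
    # softplus(x) = ln(1 + e^x)
    return math.log1p(math.exp(x))

def mish(x):
    # Mish: f(x) = x * tanh(softplus(x))
    return x * math.tanh(softplus(x))
```

Note that `mish(0) == 0`, the function dips slightly below zero for negative inputs (hence non-monotonic), and it approaches the identity for large positive inputs.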
A deep collocation method for the bending analysis of Kirchhoff plate
In this paper, a deep collocation method (DCM) for thin plate bending problems is proposed.
This method takes advantage of computational graphs and backpropagation algorithms …
How good is the Bayes posterior in deep neural networks really?
During the past five years the Bayesian deep learning community has developed
increasingly accurate and efficient approximate inference procedures that allow for …
A novel PCA–whale optimization-based deep neural network model for classification of tomato plant diseases using GPU
The human population is growing at a very rapid pace. With this progressive growth, it is
extremely important to ensure that healthy food is available for the survival of the inhabitants …
Finite versus infinite neural networks: an empirical study
We perform a careful, thorough, and large scale empirical study of the correspondence
between wide neural networks and kernel methods. By doing so, we resolve a variety of …
Neural tangents: Fast and easy infinite neural networks in python
Neural Tangents is a library designed to enable research into infinite-width neural networks.
It provides a high-level API for specifying complex and hierarchical neural network …
Dynamical isometry and a mean field theory of cnns: How to train 10,000-layer vanilla convolutional neural networks
In recent years, state-of-the-art methods in computer vision have utilized increasingly deep
convolutional neural network architectures (CNNs), with some of the most successful models …
Tensor programs ii: Neural tangent kernel for any architecture
G Yang - arXiv preprint arXiv:2006.14548, 2020 - arxiv.org
We prove that a randomly initialized neural network of *any architecture* has its Neural Tangent
Kernel (NTK) converge to a deterministic limit, as the network widths tend to infinity. We …
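The NTK referenced in this entry is the Gram matrix of parameter gradients of the network output. As a hedged, stdlib-only illustration (a hypothetical one-hidden-layer tanh network with assumed 1/sqrt(width) NTK-style scaling, not the paper's general construction), the empirical tangent kernel can be computed directly:

```python
import math
import random

def init_params(width, seed=0):
    # I.i.d. standard Gaussian weights, as in NTK-style initialization.
    rng = random.Random(seed)
    w1 = [rng.gauss(0, 1) for _ in range(width)]  # input -> hidden
    w2 = [rng.gauss(0, 1) for _ in range(width)]  # hidden -> output
    return w1, w2

def grad(params, x, width):
    # Gradient of f(x) = (1/sqrt(width)) * sum_j w2[j] * tanh(w1[j] * x)
    # with respect to all parameters (w1 then w2), as one flat list.
    w1, w2 = params
    s = 1.0 / math.sqrt(width)
    g_w1 = [s * w2[j] * (1.0 - math.tanh(w1[j] * x) ** 2) * x for j in range(width)]
    g_w2 = [s * math.tanh(w1[j] * x) for j in range(width)]
    return g_w1 + g_w2

def empirical_ntk(params, x1, x2, width):
    # Empirical NTK: inner product of parameter gradients at x1 and x2.
    g1 = grad(params, x1, width)
    g2 = grad(params, x2, width)
    return sum(a * b for a, b in zip(g1, g2))
```

As the width grows, this random-initialization kernel concentrates around a deterministic limit, which is the phenomenon the paper proves for arbitrary architectures.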
KANQAS: Kolmogorov-Arnold network for quantum architecture search
Quantum Architecture Search (QAS) is a promising direction for optimization and automated
design of quantum circuits towards quantum advantage. Recent techniques in QAS …