Randomness in neural networks: an overview
S Scardapane, D Wang - Wiley Interdisciplinary Reviews: Data …, 2017 - Wiley Online Library
Neural networks, as powerful tools for data mining and knowledge engineering, can learn
from data to build feature‐based classifiers and nonlinear predictive models. Training neural …
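One family of methods this survey covers is networks with randomized, untrained hidden weights, where training reduces to a linear solve for the output layer. A minimal sketch of that scheme (random-feature/RVFL-style; all data and settings here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) + noise (illustrative only).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

# Random hidden layer: weights are sampled once and never trained.
n_hidden, ridge = 100, 1e-2
W = rng.standard_normal((X.shape[1], n_hidden))
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W + b)                  # random nonlinear features

# Only the output weights are learned, via a closed-form ridge solve:
#   beta = (H^T H + ridge * I)^{-1} H^T y
beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
print("train MSE:", np.mean((H @ beta - y) ** 2))
```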
In search of the real inductive bias: On the role of implicit regularization in deep learning
We present experiments demonstrating that some other form of capacity control, different
from network size, plays a central role in learning multilayer feed-forward networks. We …
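A hedged sketch of the flavor of such an experiment, not the authors' protocol: as width grows far past what the data needs, test accuracy need not degrade, and a norm of the learned weights is one candidate capacity measure to watch instead of parameter count.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Networks far larger than needed often generalize no worse, suggesting
# capacity is controlled by something other than parameter count,
# e.g. the norm of the learned weights.
for width in [4, 32, 256, 1024]:
    net = MLPClassifier(hidden_layer_sizes=(width,), alpha=0.0,
                        max_iter=2000, random_state=0).fit(X_tr, y_tr)
    norm = sum(np.sum(W ** 2) for W in net.coefs_)
    print(f"width={width:5d}  test acc={net.score(X_te, y_te):.3f}  "
          f"squared weight norm={norm:.1f}")
```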
Deep neural networks with random Gaussian weights: A universal classification strategy?
Three important properties of classification machinery are i) the system preserves the core
information of the input data; ii) the training examples convey information about unseen …
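Property i) can be sensed numerically. The sketch below checks only the linear part of a random Gaussian layer, which approximately preserves pairwise distances; the paper's analysis extends such statements to layers with nonlinearities (all dimensions here are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 50, 512, 256                         # points, input dim, layer width

X = rng.standard_normal((n, d))
W = rng.standard_normal((d, m)) / np.sqrt(m)   # random Gaussian layer
Y = X @ W                                      # linear part of the layer

def pdist(A):
    diffs = A[:, None, :] - A[None, :, :]
    return np.sqrt((diffs ** 2).sum(-1))

D_in, D_out = pdist(X), pdist(Y)
mask = ~np.eye(n, dtype=bool)
ratios = D_out[mask] / D_in[mask]
# Ratios concentrate near 1: the random layer is close to an isometry.
print("distance ratios: mean %.3f, std %.3f" % (ratios.mean(), ratios.std()))
```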
Faster kernel ridge regression using sketching and preconditioning
Kernel ridge regression is a simple yet powerful technique for nonparametric regression
whose computation amounts to solving a linear system. This system is usually dense and …
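The linear system in question is (K + nλI)α = y. A minimal dense solve for reference; the paper's speedups via sketching and preconditioning are not reproduced here, and the data and kernel settings below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 s^2)).
s, lam = 1.0, 1e-2
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / (2 * s ** 2))

# Kernel ridge regression reduces to one dense n x n linear system:
#   (K + n * lam * I) alpha = y
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
print("train MSE:", np.mean((K @ alpha - y) ** 2))
# The direct solve costs O(n^3); sketching plus preconditioned iterative
# solvers aim to cut this for large n.
```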
A survey of modern questions and challenges in feature extraction
D Storcheus, A Rostamizadeh… - … : Modern Questions and …, 2015 - proceedings.mlr.press
The problem of extracting features from given data is of critical importance for successful
application of machine learning. Feature extraction, as usually understood, seeks an optimal …
Steps toward deep kernel methods from infinite neural networks
T Hazan, T Jaakkola - arXiv preprint arXiv:1508.05133, 2015 - arxiv.org
Contemporary deep neural networks exhibit impressive results on practical problems. These
networks generalize well although their inherent capacity may extend significantly beyond …
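One concrete instance of this network-to-kernel limit is the degree-1 arc-cosine kernel of Cho and Saul, which the empirical kernel of a wide random ReLU layer converges to. A quick numerical check of that correspondence (illustrative only; the paper develops the limit further, toward deep kernel methods):

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 5, 200_000                       # input dim, hidden width
x, z = rng.standard_normal(d), rng.standard_normal(d)

# Empirical kernel of a wide random ReLU layer: E[relu(w.x) * relu(w.z)].
W = rng.standard_normal((m, d))
h_x, h_z = np.maximum(W @ x, 0), np.maximum(W @ z, 0)
empirical = (h_x @ h_z) / m

# Closed-form infinite-width limit (degree-1 arc-cosine kernel):
#   ||x|| ||z|| (sin(t) + (pi - t) cos(t)) / (2 pi),  t = angle(x, z).
nx, nz = np.linalg.norm(x), np.linalg.norm(z)
t = np.arccos(np.clip(x @ z / (nx * nz), -1, 1))
limit = nx * nz * (np.sin(t) + (np.pi - t) * np.cos(t)) / (2 * np.pi)

print(f"empirical {empirical:.4f}  vs  infinite-width kernel {limit:.4f}")
```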
Diving into the shallows: a computational perspective on large-scale shallow learning
The remarkable recent success of deep neural networks has not been easy to analyze
theoretically. It has been particularly hard to disentangle the relative significance of architecture …
Sparse Hilbert Schmidt independence criterion and surrogate-kernel-based feature selection for hyperspectral image classification
BB Damodaran, N Courty… - IEEE Transactions on …, 2017 - ieeexplore.ieee.org
Designing an effective criterion to select a subset of features is a challenging problem for
hyperspectral image classification. In this paper, we develop a feature selection method to …
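The criterion in question, the Hilbert–Schmidt independence criterion (HSIC), has a simple empirical estimator, HSIC = tr(KHLH)/(n−1)² with H the centering matrix. A sketch that scores each feature by its HSIC with the labels and keeps the top ones; the paper's sparse, surrogate-kernel formulation is not shown, and the data and kernels below are illustrative.

```python
import numpy as np

def hsic(K, L):
    """Empirical HSIC: trace(K H L H) / (n - 1)^2, H the centering matrix."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def rbf(a, s=1.0):
    sq = ((a[:, None, :] - a[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * s ** 2))

rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.standard_normal((n, d))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # only features 0 and 3 matter

L = (y[:, None] == y[None, :]).astype(float)    # label kernel
scores = [hsic(rbf(X[:, [j]]), L) for j in range(d)]
top = np.argsort(scores)[::-1][:3]
print("top features by HSIC:", top)             # should rank 0 and 3 highly
```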
Bayesian optimization with tree-structured dependencies
R Jenatton, C Archambeau… - International …, 2017 - proceedings.mlr.press
Bayesian optimization has been successfully used to optimize complex black-box functions
whose evaluations are expensive. In many applications, like in deep learning and predictive …
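For reference, a generic Bayesian-optimization loop on a toy 1-D function, using a Gaussian-process surrogate and expected improvement; the paper's contribution, handling search spaces where some parameters exist only conditionally on others, is not reproduced here, and all settings are illustrative.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f(x):                                # "expensive" black box (a toy here)
    return np.sin(3 * x) + 0.1 * x ** 2

rng = np.random.default_rng(0)
grid = np.linspace(-3, 3, 400).reshape(-1, 1)
X = rng.uniform(-3, 3, size=(3, 1))      # a few initial evaluations
y = f(X[:, 0])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(10):
    gp.fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    best = y.min()
    # Expected improvement, for minimization.
    z = (best - mu) / np.maximum(sd, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)
    x_next = grid[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[0]))

print("best x %.3f, best f %.3f" % (X[np.argmin(y), 0], y.min()))
```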
Bayesian nonparametric kernel-learning
Kernel methods are ubiquitous tools in machine learning. They have proven to be effective
in many domains and tasks. Yet, kernel methods often require the user to select a …
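Kernel choice can itself be made learnable through random Fourier features: the distribution of the sampled frequencies determines the kernel, and the paper places a flexible nonparametric prior over that distribution. A sketch of the fixed-kernel case this builds on (Gaussian frequencies recover the RBF kernel; all settings illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 100, 3, 2000                  # samples, input dim, random features
X = rng.standard_normal((n, d))

# Random Fourier features: with frequencies w ~ N(0, I / s^2), the map
# z(x) = sqrt(2/m) * cos(W x + b) satisfies
# z(x).z(y) ~ exp(-||x - y||^2 / (2 s^2)).
s = 1.0
W = rng.standard_normal((m, d)) / s
b = rng.uniform(0, 2 * np.pi, m)
Z = np.sqrt(2.0 / m) * np.cos(X @ W.T + b)

K_approx = Z @ Z.T
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq / (2 * s ** 2))
print("max abs error:", np.abs(K_approx - K_exact).max())
# Changing the distribution of W changes the implied kernel; placing a
# flexible prior over that distribution lets the kernel itself be learned.
```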