Machine learning and deep learning—A review for ecologists
The popularity of machine learning (ML), deep learning (DL) and artificial intelligence (AI)
has risen sharply in recent years. Despite this spike in popularity, the inner workings of ML …
A farewell to the bias-variance tradeoff? an overview of the theory of overparameterized machine learning
The rapid recent progress in machine learning (ML) has raised a number of scientific
questions that challenge the longstanding dogma of the field. One of the most important …
Robust training under label noise by over-parameterization
Recently, over-parameterized deep networks, with many more network parameters
than training samples, have dominated the performance of modern machine learning …
An investigation of why overparameterization exacerbates spurious correlations
S Sagawa, A Raghunathan… - … on Machine Learning, 2020 - proceedings.mlr.press
We study why overparameterization—increasing model size well beyond the point of zero
training error—can hurt test error on minority groups despite improving average test error …
On the optimization landscape of neural collapse under mse loss: Global optimality with unconstrained features
When training deep neural networks for classification tasks, an intriguing empirical
phenomenon has been widely observed in the last-layer classifiers and features, where (i) …
Ensemble of averages: Improving model selection and boosting performance in domain generalization
In Domain Generalization (DG) settings, models trained independently on a given
set of training domains have notoriously chaotic performance on distribution shifted test …
A geometric analysis of neural collapse with unconstrained features
We provide the first global optimization landscape analysis of Neural Collapse--an intriguing
empirical phenomenon that arises in the last-layer classifiers and features of neural …
A gentle introduction to reinforcement learning and its application in different fields
Due to the recent progress in Deep Neural Networks, Reinforcement Learning (RL) has
become one of the most important and useful technologies. It is a learning method where a …
A stacking ensemble deep learning approach to cancer type classification based on TCGA data
Cancer tumor classification based on morphological characteristics alone has been shown
to have serious limitations. Breast, lung, colorectal, thyroid, and ovarian are the most …
ReduNet: A white-box deep network from the principle of maximizing rate reduction
This work attempts to provide a plausible theoretical framework that aims to interpret modern
deep (convolutional) networks from the principles of data compression and discriminative …