Second opinion needed: communicating uncertainty in medical machine learning

B Kompa, J Snoek, AL Beam - NPJ Digital Medicine, 2021 - nature.com
There is great excitement that medical artificial intelligence (AI) based on machine learning
(ML) can be used to improve decision making at the patient level in a variety of healthcare …

Uncertainty quantification in machine learning for engineering design and health prognostics: A tutorial

V Nemani, L Biggio, X Huan, Z Hu, O Fink… - … Systems and Signal …, 2023 - Elsevier
On top of machine learning (ML) models, uncertainty quantification (UQ) functions as an
essential layer of safety assurance that could lead to more principled decision making by …
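One family of UQ methods covered in tutorials like this one is ensembling, where disagreement among independently fitted models serves as an uncertainty proxy. A minimal sketch (the toy data, model class, and variable names below are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = x**3 + 0.1 * rng.standard_normal(50)   # toy regression data

# Fit an ensemble of simple models on bootstrap resamples; the spread of
# their predictions serves as a (crude) uncertainty estimate.
preds = []
for _ in range(20):
    idx = rng.integers(0, len(x), len(x))       # bootstrap resample
    coeffs = np.polyfit(x[idx], y[idx], deg=3)  # one ensemble member
    preds.append(np.polyval(coeffs, x))
preds = np.asarray(preds)

mean_pred = preds.mean(axis=0)   # ensemble prediction
uncertainty = preds.std(axis=0)  # member disagreement per input
```

Downstream decisions can then be deferred wherever `uncertainty` is large, which is the kind of principled decision making the abstract alludes to.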

ResMLP: Feedforward networks for image classification with data-efficient training

H Touvron, P Bojanowski, M Caron… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
We present ResMLP, an architecture built entirely upon multi-layer perceptrons for image
classification. It is a simple residual network that alternates (i) a linear layer in which image …

Dataset distillation with infinitely wide convolutional networks

T Nguyen, R Novak, L Xiao… - Advances in Neural …, 2021 - proceedings.neurips.cc
The effectiveness of machine learning algorithms arises from being able to extract useful
features from large amounts of data. As model and dataset sizes increase, dataset …

The computational limits of deep learning

NC Thompson, K Greenewald, K Lee… - arXiv preprint arXiv …, 2020 - assets.pubpub.org
Deep learning's recent history has been one of achievement: from triumphing over humans
in the game of Go to world-leading performance in image classification, voice recognition …

Similarity of neural network representations revisited

S Kornblith, M Norouzi, H Lee… - … conference on machine …, 2019 - proceedings.mlr.press
Recent work has sought to understand the behavior of neural networks by comparing
representations between layers and between different trained models. We examine methods …
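The similarity index this paper advocates, linear centered kernel alignment (CKA), compares two representation matrices over the same set of examples. A minimal sketch (function name is illustrative):

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between representations X, Y of shape (n_examples, n_features)."""
    X = X - X.mean(axis=0)   # center each feature
    Y = Y - Y.mean(axis=0)
    num = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    den = np.linalg.norm(X.T @ X, ord="fro") * np.linalg.norm(Y.T @ Y, ord="fro")
    return num / den
```

Linear CKA is invariant to orthogonal transformations and isotropic scaling of either representation, which is what makes it usable for comparing layers of different widths.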

On exact computation with an infinitely wide neural net

S Arora, SS Du, W Hu, Z Li… - Advances in neural …, 2019 - proceedings.neurips.cc
How well does a classic deep net architecture like AlexNet or VGG19 classify on a standard
dataset such as CIFAR-10 when its “width”—namely, number of channels in convolutional …

The generalization error of random features regression: Precise asymptotics and the double descent curve

S Mei, A Montanari - Communications on Pure and Applied …, 2022 - Wiley Online Library
Deep learning methods operate in regimes that defy the traditional statistical mindset.
Neural network architectures often contain more parameters than training samples, and are …
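The model analyzed in this paper, random features regression, amounts to ridge regression on a frozen random first layer. A minimal sketch (toy data, helper name, and hyperparameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)  # toy data

def fit_random_features(X, y, n_features, ridge=1e-6, seed=1):
    """Ridge regression on fixed random ReLU features."""
    w_rng = np.random.default_rng(seed)
    W = w_rng.standard_normal((X.shape[1], n_features)) / np.sqrt(X.shape[1])
    Phi = np.maximum(X @ W, 0.0)   # random features (frozen first layer)
    a = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_features), Phi.T @ y)
    return W, a

# With more random features than training samples, the fit (near-)interpolates.
W, a = fit_random_features(X, y, n_features=200)
train_mse = np.mean((np.maximum(X @ W, 0.0) @ a - y) ** 2)
```

Sweeping `n_features` through and past `n` while tracking held-out error is how the double descent curve in the title is typically traced out empirically.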

Wide neural networks of any depth evolve as linear models under gradient descent

J Lee, L Xiao, S Schoenholz, Y Bahri… - Advances in neural …, 2019 - proceedings.neurips.cc
A longstanding goal in deep learning research has been to precisely characterize training
and generalization. However, the often complex loss landscapes of neural networks have …

Efficient dataset distillation using random feature approximation

N Loo, R Hasani, A Amini… - Advances in Neural …, 2022 - proceedings.neurips.cc
Dataset distillation compresses large datasets into smaller synthetic coresets which retain
performance with the aim of reducing the storage and computational burden of processing …