Bayesian statistics and modelling

R van de Schoot, S Depaoli, R King, B Kramer… - Nature Reviews …, 2021 - nature.com
Bayesian statistics is an approach to data analysis based on Bayes' theorem, where
available knowledge about parameters in a statistical model is updated with the information …
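The snippet's core idea, updating prior knowledge about a parameter with observed data, is easiest to see in a conjugate case. A minimal sketch (not from the paper) using a Beta prior over a coin's bias updated by Binomial observations:

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate Bayesian update: Beta(alpha, beta) prior plus Binomial
    data yields a Beta(alpha + successes, beta + failures) posterior."""
    return alpha + successes, beta + failures

# Start from a uniform Beta(1, 1) prior over the coin's bias.
alpha, beta = beta_binomial_update(1.0, 1.0, successes=7, failures=3)
posterior_mean = alpha / (alpha + beta)  # (1 + 7) / (1 + 7 + 1 + 3) = 8/12
print(alpha, beta, posterior_mean)
```

The posterior mean (two thirds) sits between the prior mean (one half) and the empirical frequency (0.7), which is exactly the "updating available knowledge" the snippet describes.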

Priors in Bayesian deep learning: A review


V Fortuin - International Statistical Review, 2022 - Wiley Online Library
While the choice of prior is one of the most critical parts of the Bayesian inference workflow,
recent Bayesian deep learning models have often fallen back on vague priors, such as …

What can transformers learn in-context? A case study of simple function classes

S Garg, D Tsipras, PS Liang… - Advances in Neural …, 2022 - proceedings.neurips.cc
In-context learning is the ability of a model to condition on a prompt sequence consisting of
in-context examples (input-output pairs corresponding to some task) along with a new query …
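The setup the snippet describes, a prompt of input-output pairs for a hidden function plus a query input, can be sketched for the linear-function class. This is an illustrative NumPy construction (names and the least-squares baseline are my own, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_examples = 5, 20

# Hidden linear function f(x) = w @ x, a simple function class.
w = rng.normal(size=d)
xs = rng.normal(size=(n_examples, d))
ys = xs @ w

# The "prompt" interleaves in-context (x, y) pairs, ending with a query x.
x_query = rng.normal(size=d)
prompt = [(x, y) for x, y in zip(xs, ys)] + [(x_query, None)]

# A least-squares reader of the same examples recovers w and answers the query.
w_hat, *_ = np.linalg.lstsq(xs, ys, rcond=None)
prediction = x_query @ w_hat
print(abs(prediction - x_query @ w))  # near zero when n_examples >= d
```

A model that learns in-context would have to extract the same mapping from the prompt alone, without any weight updates.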

Diffusion with forward models: Solving stochastic inverse problems without direct supervision

A Tewari, T Yin, G Cazenavette… - Advances in …, 2023 - proceedings.neurips.cc
Denoising diffusion models are a powerful class of generative models used to capture
complex distributions of real-world signals. However, their applicability is limited to …

Implicit neural representations with periodic activation functions

V Sitzmann, J Martel, A Bergman… - Advances in neural …, 2020 - proceedings.neurips.cc
Implicitly defined, continuous, differentiable signal representations parameterized by neural
networks have emerged as a powerful paradigm, offering many possible benefits over …
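The periodic activations in the title refer to sinusoidal nonlinearities applied to affine maps of continuous coordinates. A minimal sketch of one such layer, assuming the uniform first-layer initialization scheme described in the SIREN paper (the variable names are illustrative):

```python
import numpy as np

def siren_layer(x, weight, bias, omega0=30.0):
    """One sine-activated layer: sin(omega0 * (x W + b))."""
    return np.sin(omega0 * (x @ weight + bias))

rng = np.random.default_rng(0)
in_dim, hidden = 2, 16
# First-layer weights drawn uniformly from [-1/in_dim, 1/in_dim].
W = rng.uniform(-1.0 / in_dim, 1.0 / in_dim, size=(in_dim, hidden))
b = np.zeros(hidden)

coords = rng.uniform(-1, 1, size=(8, in_dim))  # continuous (x, y) coordinates
features = siren_layer(coords, W, b)
print(features.shape)  # (8, 16)
```

Because sine is smooth and all its derivatives are again sines, such networks remain well-behaved under differentiation, one of the benefits the snippet alludes to.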

From data to functa: Your data point is a function and you can treat it like one

E Dupont, H Kim, SM Eslami, D Rezende… - arXiv preprint arXiv …, 2022 - arxiv.org
It is common practice in deep learning to represent a measurement of the world on a
discrete grid, e.g. a 2D grid of pixels. However, the underlying signal represented by these …

Set transformer: A framework for attention-based permutation-invariant neural networks

J Lee, Y Lee, J Kim, A Kosiorek… - … on machine learning, 2019 - proceedings.mlr.press
Many machine learning tasks such as multiple instance learning, 3D shape recognition, and
few-shot image classification are defined on sets of instances. Since solutions to such …
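Permutation invariance, the property in the title, means the output must not depend on the order in which set elements are presented. A simplified single-head attention pooling in that spirit (a seed query attending over the set, loosely analogous to the paper's pooling module; all names here are illustrative):

```python
import numpy as np

def softmax(a, axis=-1):
    a = a - a.max(axis=axis, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(seed, elements):
    """Pool a set with one attention head: a seed query attends over elements."""
    scores = softmax(seed @ elements.T / np.sqrt(elements.shape[1]))
    return scores @ elements  # weighted sum: order of elements cannot matter

rng = np.random.default_rng(0)
elements = rng.normal(size=(10, 4))  # a set of 10 elements, no inherent order
seed = rng.normal(size=(1, 4))       # stand-in for a learned seed vector

pooled = attention_pool(seed, elements)
shuffled = attention_pool(seed, elements[rng.permutation(10)])
print(np.allclose(pooled, shuffled))  # True: output ignores element order
```

Permuting the rows of `elements` permutes the attention weights identically, so the weighted sum, and hence the pooled output, is unchanged.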

CARD: Classification and regression diffusion models

X Han, H Zheng, M Zhou - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Learning the distribution of a continuous or categorical response variable y given its
covariates x is a fundamental problem in statistics and machine learning. Deep neural …

Transformer neural processes: Uncertainty-aware meta learning via sequence modeling

T Nguyen, A Grover - arXiv preprint arXiv:2207.04179, 2022 - arxiv.org
Neural Processes (NPs) are a popular class of approaches for meta-learning. Similar to
Gaussian Processes (GPs), NPs define distributions over functions and can estimate …

MetaSDF: Meta-learning signed distance functions

V Sitzmann, E Chan, R Tucker… - Advances in …, 2020 - proceedings.neurips.cc
Neural implicit shape representations are an emerging paradigm that offers many potential
benefits over conventional discrete representations, including memory efficiency at a high …
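A signed distance function, the object being meta-learned here, maps a point to its distance from a surface, signed negative inside and positive outside. A minimal analytic example (a circle, not a learned network) to make the representation concrete:

```python
import numpy as np

def circle_sdf(points, center, radius):
    """Signed distance to a circle: negative inside, zero on the boundary,
    positive outside."""
    return np.linalg.norm(points - center, axis=-1) - radius

pts = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
d = circle_sdf(pts, center=np.array([0.0, 0.0]), radius=1.0)
print(d)  # [-1.  0.  1.]
```

A neural implicit representation replaces this closed form with a network queried at arbitrary continuous coordinates, which is what gives it the memory efficiency at high resolution mentioned in the snippet.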