Toward the third generation artificial intelligence
There have been two competing paradigms in artificial intelligence (AI) development ever
since its birth in 1956, i.e., symbolism and connectionism (or sub-symbolism). While …
A complete survey on generative AI (AIGC): Is ChatGPT from GPT-4 to GPT-5 all you need?
As ChatGPT goes viral, generative AI (AIGC, aka AI-generated content) has made headlines
everywhere because of its ability to analyze and create text, images, and beyond. With such …
Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …
Monte Carlo gradient estimation in machine learning
This paper is a broad and accessible survey of the methods we have at our disposal for
Monte Carlo gradient estimation in machine learning and across the statistical sciences: the …
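The two workhorse estimators that survey covers can be illustrated on a toy problem. Below is a minimal numpy sketch (my own illustrative example, not code from the paper) comparing the pathwise (reparameterization) and score-function (REINFORCE) estimators of the gradient of E[x²] with respect to the mean of a unit-variance Gaussian, whose true value is 2μ:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n = 1.5, 200_000
eps = rng.standard_normal(n)
x = mu + eps                      # reparameterized samples x ~ N(mu, 1)

# Pathwise estimator: differentiate through x = mu + eps, so
# d/dmu E[x^2] is estimated by the mean of f'(x) = 2x.
pathwise = np.mean(2 * x)

# Score-function estimator: E[f(x) * d/dmu log N(x; mu, 1)],
# where the score of the Gaussian in mu is (x - mu).
score_fn = np.mean(x**2 * (x - mu))

# Both estimate d/dmu E[x^2] = 2*mu; the pathwise estimator
# typically has far lower variance, as the survey discusses.
print(pathwise, score_fn)
```

Both estimates converge to 2μ = 3.0 here, but the score-function estimate fluctuates much more at a fixed sample size, which is the variance trade-off the survey analyzes.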
Sliced score matching: A scalable approach to density and score estimation
Score matching is a popular method for estimating unnormalized statistical models.
However, it has been so far limited to simple, shallow models or low-dimensional data, due …
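The scalability idea in sliced score matching is to probe the score's Jacobian only along random directions v, giving the objective E[vᵀ(∂s/∂x)v + ½(vᵀs(x))²]. A minimal numpy sketch (an illustrative toy, not the paper's code) for the hand-coded Gaussian model score s(x) = −(x − m), whose Jacobian is simply −I:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 5, 50_000
x = rng.standard_normal((n, d))   # data ~ N(0, I)
v = rng.standard_normal((n, d))   # random slicing directions

def ssm_loss(m):
    """Sliced score matching objective for the model score
    s(x) = -(x - m) of a unit-variance Gaussian N(m, I).
    The Jacobian of this linear score is -I, so v^T J v = -||v||^2."""
    score = -(x - m)
    jac_term = -np.sum(v * v, axis=1)              # v^T (ds/dx) v
    sq_term = 0.5 * np.sum(v * score, axis=1)**2   # 0.5 * (v^T s)^2
    return np.mean(jac_term + sq_term)

# The objective is minimized near the true data mean (0 here):
print(ssm_loss(np.zeros(d)) < ssm_loss(np.full(d, 2.0)))
```

In a deep model the Jacobian term is not available in closed form; the paper's point is that the directional derivative vᵀ(∂s/∂x)v costs only one extra backward pass via a Jacobian-vector product, avoiding the full trace that makes vanilla score matching intractable.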
Functional variational Bayesian neural networks
Variational Bayesian neural networks (BNNs) perform variational inference over weights, but
it is difficult to specify meaningful priors and approximate posteriors in a high-dimensional …
Repulsive deep ensembles are Bayesian
F. D'Angelo, V. Fortuin, Advances in Neural Information …, 2021, proceedings.neurips.cc
Deep ensembles have recently gained popularity in the deep learning community for their
conceptual simplicity and efficiency. However, maintaining functional diversity between …
Adversarial distributional training for robust deep learning
Adversarial training (AT) is among the most effective techniques to improve model
robustness by augmenting training data with adversarial examples. However, most existing …
Maximum mean discrepancy gradient flow
We construct a Wasserstein gradient flow of the maximum mean discrepancy (MMD) and
study its convergence properties. The MMD is an integral probability metric defined for a …
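The MMD underlying that gradient flow is estimated directly from two sample sets via kernel evaluations: MMD² = E[k(x,x′)] + E[k(y,y′)] − 2E[k(x,y)]. A minimal numpy sketch of the (biased) estimator with a Gaussian kernel (an illustrative example, not the paper's implementation; `sigma` is a bandwidth chosen here for demonstration):

```python
import numpy as np

def mmd2(x, y, sigma=1.0):
    """Biased estimator of squared MMD with a Gaussian (RBF) kernel:
    MMD^2 = E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]."""
    def k(a, b):
        # Pairwise squared distances, then the RBF kernel matrix.
        d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
        return np.exp(-d2 / (2 * sigma**2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

rng = np.random.default_rng(0)
same = mmd2(rng.standard_normal((500, 2)), rng.standard_normal((500, 2)))
diff = mmd2(rng.standard_normal((500, 2)), rng.standard_normal((500, 2)) + 1.0)
# MMD^2 is near zero for samples from the same distribution and
# clearly positive when the distributions differ.
print(same, diff)
```

The gradient flow studied in the paper moves particles to decrease exactly this quantity between the particle set and the target sample, so differentiating `mmd2` with respect to `x` gives the flow's driving velocity field.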