Toward the third generation artificial intelligence

B Zhang, J Zhu, H Su - Science China Information Sciences, 2023 - Springer
There have been two competing paradigms in artificial intelligence (AI) development ever
since its birth in 1956, i.e., symbolism and connectionism (or sub-symbolism). While …

A complete survey on generative AI (AIGC): Is ChatGPT from GPT-4 to GPT-5 all you need?

C Zhang, C Zhang, S Zheng, Y Qiao, C Li… - arXiv preprint arXiv …, 2023 - arxiv.org
As ChatGPT goes viral, generative AI (AIGC, aka AI-generated content) has made headlines
everywhere because of its ability to analyze and create text, images, and beyond. With such …

Deep generative modelling: A comparative review of VAEs, GANs, normalizing flows, energy-based and autoregressive models

S Bond-Taylor, A Leach, Y Long… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Deep generative models are a class of techniques that train deep neural networks to model
the distribution of training samples. Research has fragmented into various interconnected …

Monte Carlo gradient estimation in machine learning

S Mohamed, M Rosca, M Figurnov, A Mnih - Journal of Machine Learning …, 2020 - jmlr.org
This paper is a broad and accessible survey of the methods we have at our disposal for
Monte Carlo gradient estimation in machine learning and across the statistical sciences: the …
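The problem this survey covers is estimating gradients of the form d/dθ E_{p(x;θ)}[f(x)]. As a hedged illustration (our own sketch, not code from the paper), the following compares the two best-known estimators it surveys, the score-function (REINFORCE) and pathwise (reparameterization) estimators, on a Gaussian toy problem where the true gradient is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem: estimate d/dmu E_{x ~ N(mu, 1)}[x^2], whose true value is 2*mu.
mu, n = 1.5, 200_000
true_grad = 2.0 * mu

# Score-function (REINFORCE) estimator:
# grad = E[ f(x) * d/dmu log N(x; mu, 1) ] = E[ x^2 * (x - mu) ].
x = rng.normal(mu, 1.0, size=n)
score_samples = x ** 2 * (x - mu)
score_est = np.mean(score_samples)

# Pathwise (reparameterization) estimator:
# write x = mu + eps with eps ~ N(0, 1), so grad = E[ f'(mu + eps) ] = E[ 2*(mu + eps) ].
eps = rng.normal(size=n)
path_samples = 2.0 * (mu + eps)
path_est = np.mean(path_samples)

print(score_est, path_est)  # both near 2*mu = 3.0
```

Both estimators are unbiased here, but the pathwise samples have much lower variance, which is one of the trade-offs the survey analyzes.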

Sliced score matching: A scalable approach to density and score estimation

Y Song, S Garg, J Shi, S Ermon - Uncertainty in Artificial …, 2020 - proceedings.mlr.press
Score matching is a popular method for estimating unnormalized statistical models.
However, it has been so far limited to simple, shallow models or low-dimensional data, due …
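The sliced objective that makes score matching scale replaces the trace of the score Jacobian with random projections: E_{x,v}[ vᵀ(∂s/∂x)v + ½ (vᵀs(x))² ]. As a hedged sketch under simplifying assumptions (our own toy model, not the paper's code: isotropic Gaussian score s(x) = −x/σ², so the Jacobian term has a closed form), minimizing this objective recovers the data variance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N(0, 2.0 * I) in 5 dimensions (illustrative choice).
d, n = 5, 20_000
true_var = 2.0
x = rng.normal(0.0, np.sqrt(true_var), size=(n, d))

# Random projection directions v ~ N(0, I), one per data point.
v = rng.normal(size=(n, d))

def ssm_objective(sigma2):
    """Sliced score matching loss E[ v^T (ds/dx) v + 0.5 * (v^T s(x))^2 ]
    for the model score s(x) = -x / sigma2, whose Jacobian is -(1/sigma2) * I,
    so the projected trace term is simply -||v||^2 / sigma2."""
    jac_term = -(v ** 2).sum(axis=1) / sigma2
    proj = (v * x).sum(axis=1) / sigma2  # v^T s(x), up to sign
    return np.mean(jac_term + 0.5 * proj ** 2)

# Grid search over sigma^2; the minimizer should sit near the data variance.
grid = np.linspace(0.5, 4.0, 351)
best = grid[np.argmin([ssm_objective(s2) for s2 in grid])]
print(best)  # close to true_var = 2.0
```

In practice the model score is a neural network and the Jacobian-vector product is computed with automatic differentiation, which is exactly the scalability point the paper makes.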

迈向第三代人工智能 [Toward the third generation artificial intelligence]

B Zhang (张钹), J Zhu (朱军), H Su (苏航) - Scientia Sinica Informationis (中国科学: 信息科学), 2020 - ansafe.xust.edu.cn
Abstract: Since its birth in 1956, artificial intelligence (AI) has, throughout more than 60 years of development, seen two competing paradigms: symbolism and connectionism (also known as sub-symbolism). Although the two started at the same time …

Functional variational Bayesian neural networks

S Sun, G Zhang, J Shi, R Grosse - arXiv preprint arXiv:1903.05779, 2019 - arxiv.org
Variational Bayesian neural networks (BNNs) perform variational inference over weights, but
it is difficult to specify meaningful priors and approximate posteriors in a high-dimensional …

Repulsive deep ensembles are Bayesian

F D'Angelo, V Fortuin - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Deep ensembles have recently gained popularity in the deep learning community for their
conceptual simplicity and efficiency. However, maintaining functional diversity between …

Adversarial distributional training for robust deep learning

Y Dong, Z Deng, T Pang, J Zhu… - Advances in Neural …, 2020 - proceedings.neurips.cc
Adversarial training (AT) is among the most effective techniques to improve model
robustness by augmenting training data with adversarial examples. However, most existing …
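The augmentation step the snippet describes is easiest to see with the single-step FGSM attack (a standard baseline; the paper itself studies a distributional generalization of AT): perturb each input one signed-gradient step uphill in the loss, then take the training step on the perturbed copy. A minimal numpy sketch on logistic regression, with all data and constants illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary data: two shifted Gaussian blobs (illustrative only).
n, d = 400, 2
x = np.vstack([rng.normal(-1.0, 1.0, (n // 2, d)),
               rng.normal(1.0, 1.0, (n // 2, d))])
y = np.repeat([0.0, 1.0], n // 2)

w = np.zeros(d)
eps, lr = 0.25, 0.1  # FGSM radius and learning rate

for _ in range(200):
    # Inner step (attack): for logistic loss, dL/dx = (sigmoid(w.x) - y) * w,
    # and FGSM moves each input eps in the sign of that gradient.
    grad_x = (sigmoid(x @ w) - y)[:, None] * w[None, :]
    x_adv = x + eps * np.sign(grad_x)
    # Outer step (training): ordinary gradient descent on the adversarial batch.
    grad_w = x_adv.T @ (sigmoid(x_adv @ w) - y) / n
    w -= lr * grad_w

clean_acc = np.mean((sigmoid(x @ w) > 0.5) == (y == 1.0))
print(clean_acc)  # the adversarially trained classifier still separates the clean blobs
```

The inner-maximization/outer-minimization structure shown here is what the paper extends from single worst-case examples to distributions of adversarial perturbations.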

Maximum mean discrepancy gradient flow

M Arbel, A Korba, A Salim… - Advances in Neural …, 2019 - proceedings.neurips.cc
We construct a Wasserstein gradient flow of the maximum mean discrepancy (MMD) and
study its convergence properties. The MMD is an integral probability metric defined for a …
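The definition truncated in the snippet is the kernel form of the MMD: MMD²(P, Q) = E[k(x, x′)] + E[k(y, y′)] − 2 E[k(x, y)]. As a hedged sketch (plain numpy, bandwidth and data our own choices), the standard unbiased sample estimate that such a gradient-flow analysis starts from:

```python
import numpy as np

rng = np.random.default_rng(0)

def mmd2_unbiased(x, y, bandwidth=1.0):
    """Unbiased estimate of squared MMD with a Gaussian RBF kernel:
    MMD^2(P, Q) = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)].
    Diagonal terms are dropped from the within-sample averages."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * bandwidth ** 2))

    kxx, kyy, kxy = k(x, x), k(y, y), k(x, y)
    n, m = len(x), len(y)
    term_xx = (kxx.sum() - np.trace(kxx)) / (n * (n - 1))
    term_yy = (kyy.sum() - np.trace(kyy)) / (m * (m - 1))
    return term_xx + term_yy - 2.0 * kxy.mean()

same = mmd2_unbiased(rng.normal(size=(500, 2)), rng.normal(size=(500, 2)))
diff = mmd2_unbiased(rng.normal(size=(500, 2)), rng.normal(2.0, 1.0, size=(500, 2)))
print(same, diff)  # `same` is near zero; `diff` is clearly positive
```

Flowing particles along the gradient of this quantity with respect to the sample positions is, roughly, the Wasserstein gradient flow the paper constructs and analyzes.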