A comparative review of dimension reduction methods in approximate Bayesian computation

MGB Blum, MA Nunes, D Prangle, SA Sisson - 2013 - projecteuclid.org
Supplement to “A Comparative Review of Dimension Reduction Methods in Approximate
Bayesian Computation”. The supplement contains for each of the three examples a …

Nonlinear gated experts for time series: Discovering regimes and avoiding overfitting

AS Weigend, M Mangeas… - International journal of …, 1995 - World Scientific
In the analysis and prediction of real-world systems, two of the key problems are
nonstationarity (often in the form of switching between regimes), and overfitting (particularly …

A survey of active learning for text classification using deep neural networks

C Schröder, A Niekler - arXiv preprint arXiv:2008.07267, 2020 - arxiv.org
Natural language processing (NLP) and neural networks (NNs) have both undergone
significant changes in recent years. For active learning (AL) purposes, NNs are, however …

Non-linear regression models for Approximate Bayesian Computation

MGB Blum, O François - Statistics and computing, 2010 - Springer
Approximate Bayesian inference on the basis of summary statistics is well-suited to complex
problems for which the likelihood is either mathematically or computationally intractable …
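The regression-adjustment idea behind this line of ABC work can be illustrated with a small sketch. The example below is not the paper's neural-network estimator; it uses a toy Gaussian model, a single summary statistic, and a simple linear adjustment (in the spirit of Beaumont et al. 2002) purely to show how accepted parameters are corrected toward the observed summary. The model, prior, and tolerance are illustrative assumptions.

```python
# Hedged sketch of ABC with a regression-adjustment step, not the paper's
# exact neural-network estimator. Toy model: Gaussian data with unknown mean;
# the summary statistic is the sample mean.
import numpy as np

rng = np.random.default_rng(0)

def simulate_summary(theta, n=50):
    """Simulate a dataset under theta and reduce it to a summary statistic."""
    return rng.normal(theta, 1.0, size=n).mean()

# "Observed" summary from a hypothetical dataset with true mean 2.0.
s_obs = simulate_summary(2.0)

# Rejection ABC: sample from the prior, keep draws whose simulated summaries
# fall closest to the observed summary.
theta_prior = rng.uniform(-5, 5, size=20000)
s_sim = np.array([simulate_summary(t) for t in theta_prior])
dist = np.abs(s_sim - s_obs)
keep = dist <= np.quantile(dist, 0.01)          # accept the closest 1%
theta_acc, s_acc = theta_prior[keep], s_sim[keep]

# Regression adjustment: regress accepted parameters on summaries, then shift
# each accepted draw to where it would sit if its summary equalled s_obs.
slope, intercept = np.polyfit(s_acc, theta_acc, 1)
theta_adj = theta_acc - slope * (s_acc - s_obs)

print("posterior mean (rejection only):", theta_acc.mean())
print("posterior mean (adjusted):      ", theta_adj.mean())
```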

[BOOK][B] Machine learning for spatial environmental data: theory, applications, and software

M Kanevski, V Timonin, A Pozdnukhov - 2009 - taylorfrancis.com
This book discusses machine learning algorithms, such as artificial neural networks of
different architectures, statistical learning theory, and Support Vector Machines used for the …

Confidence estimation methods for neural networks: A practical comparison

G Papadopoulos, PJ Edwards… - IEEE transactions on …, 2001 - ieeexplore.ieee.org
Feedforward neural networks, particularly multilayer perceptrons, are widely used in
regression and classification tasks. A reliable and practical measure of prediction …

Confidence intervals and prediction intervals for feed-forward neural networks

R Dybowski, SJ Roberts - 2001 - repository.uel.ac.uk
The chapter opens with an introduction to regression and its implementation within the
maximum-likelihood framework. This is followed by a general introduction to classical …

Using neural networks to model conditional multivariate densities

PM Williams - Neural computation, 1996 - ieeexplore.ieee.org
Neural network outputs are interpreted as parameters of statistical distributions. This allows
us to fit conditional distributions in which the parameters depend on the inputs to the …
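A minimal sketch of the general approach the snippet describes: the network's outputs are read as the parameters of a conditional distribution, and training minimises the negative log-likelihood. The univariate Gaussian case below (mean and log-variance as outputs) is an illustrative assumption, not the multivariate construction of Williams (1996); the data, architecture, and optimiser settings are likewise hypothetical.

```python
# Hedged sketch: network outputs interpreted as (mu, log sigma^2) of a
# conditional Gaussian, trained by minimising the negative log-likelihood.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic heteroscedastic data: noise level depends on the input.
x = torch.rand(1000, 1) * 4 - 2
y = torch.sin(x) + torch.randn_like(x) * (0.1 + 0.3 * x.abs())

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(2000):
    out = net(x)
    mu, log_var = out[:, :1], out[:, 1:]
    # Gaussian negative log-likelihood (additive constant dropped).
    nll = 0.5 * (log_var + (y - mu) ** 2 / log_var.exp()).mean()
    opt.zero_grad()
    nll.backward()
    opt.step()

# Predicted conditional mean and standard deviation at a new input.
with torch.no_grad():
    out = net(torch.tensor([[1.5]]))
    mu, sigma = out[0, 0].item(), (0.5 * out[0, 1]).exp().item()
    print("mu:", mu, "sigma:", sigma)
```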

Lyapunov function approach for approximation algorithm design and analysis: with applications in submodular maximization

D Du - arXiv preprint arXiv:2205.12442, 2022 - arxiv.org
We propose a two-phase systematic framework for approximation algorithm design and
analysis via Lyapunov function. The first phase consists of using Lyapunov function as an …

Constructing optimal prediction intervals by using neural networks and bootstrap method

A Khosravi, S Nahavandi, D Srinivasan… - IEEE transactions on …, 2014 - ieeexplore.ieee.org
This brief proposes an efficient technique for constructing optimized prediction
intervals (PIs) using the bootstrap method. The method employs an innovative PI …
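A hedged sketch of the plain bootstrap-ensemble baseline that such PI methods build on, not the optimized PI construction proposed in the brief: several networks are trained on bootstrap resamples, and the spread of their predictions gives interval bounds. The dataset, ensemble size, and network settings below are illustrative assumptions, and the noise-variance term of a full prediction interval is omitted for brevity.

```python
# Hedged sketch of bootstrap prediction intervals for a neural-network
# regressor, using an ensemble trained on resampled data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic regression data (hypothetical stand-in for a real dataset).
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.2, size=300)

B = 25                                   # number of bootstrap replicates
x_grid = np.linspace(-3, 3, 100).reshape(-1, 1)
preds = []

for b in range(B):
    idx = rng.integers(0, len(X), size=len(X))        # bootstrap resample
    net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=b)
    net.fit(X[idx], y[idx])
    preds.append(net.predict(x_grid))

preds = np.stack(preds)                               # shape (B, 100)

# Model uncertainty from the ensemble spread; a full PI would also add an
# estimate of the observation-noise variance.
mean = preds.mean(axis=0)
lower = np.quantile(preds, 0.025, axis=0)
upper = np.quantile(preds, 0.975, axis=0)
print(mean[:3], lower[:3], upper[:3])
```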