Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges

B Bischl, M Binder, M Lang, T Pielok… - … : Data Mining and …, 2023 - Wiley Online Library
Most machine learning algorithms are configured by a set of hyperparameters whose values
must be carefully chosen and which often considerably impact performance. To avoid a time …

Neuroevolution in deep neural networks: Current trends and future challenges

E Galván, P Mooney - IEEE Transactions on Artificial …, 2021 - ieeexplore.ieee.org
A variety of methods have been applied to the architectural configuration and learning or
training of artificial deep neural networks (DNN). These methods play a crucial role in the …

AutoKeras: An AutoML library for deep learning

H Jin, F Chollet, Q Song, X Hu - Journal of Machine Learning Research, 2023 - jmlr.org
To use deep learning, one needs to be familiar with various software tools like TensorFlow
or Keras, as well as various model architecture and optimization best practices. Despite …

Neural architecture search: Insights from 1000 papers

C White, M Safari, R Sukthanker, B Ru, T Elsken… - arXiv preprint arXiv …, 2023 - arxiv.org
In the past decade, advances in deep learning have resulted in breakthroughs in a variety of
areas, including computer vision, natural language understanding, speech recognition, and …

BANANAS: Bayesian optimization with neural architectures for neural architecture search

C White, W Neiswanger, Y Savani - … of the AAAI Conference on Artificial …, 2021 - ojs.aaai.org
Over the past half-decade, many methods have been considered for neural architecture
search (NAS). Bayesian optimization (BO), which has long had success in hyperparameter …

Variability and reproducibility in deep learning for medical image segmentation

F Renard, S Guedria, ND Palma, N Vuillerme - Scientific Reports, 2020 - nature.com
Medical image segmentation is an important tool for current clinical applications. It is the
backbone of numerous clinical diagnosis methods, oncological treatments and computer …

NATS-Bench: Benchmarking NAS algorithms for architecture topology and size

X Dong, L Liu, K Musial… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Neural architecture search (NAS) has attracted a lot of attention and has been illustrated to
bring tangible benefits in a large number of applications in the past few years. Architecture …

Auto-PyTorch: Multi-fidelity meta-learning for efficient and robust AutoDL

L Zimmer, M Lindauer, F Hutter - IEEE Transactions on Pattern …, 2021 - ieeexplore.ieee.org
While early AutoML frameworks focused on optimizing traditional ML pipelines and their
hyperparameters, a recent trend in AutoML is to focus on neural architecture search. In this …

NAS-Bench-301 and the case for surrogate benchmarks for neural architecture search

J Siems, L Zimmer, A Zela, J Lukasik… - arXiv preprint arXiv …, 2020 - researchgate.net
Neural Architecture Search (NAS) is a logical next step in the automatic learning
of representations, but the development of NAS methods is slowed by high computational …

How powerful are performance predictors in neural architecture search?

C White, A Zela, R Ru, Y Liu… - Advances in Neural …, 2021 - proceedings.neurips.cc
Early methods in the rapidly developing field of neural architecture search (NAS) required
fully training thousands of neural networks. To reduce this extreme computational cost …