Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges
Most machine learning algorithms are configured by a set of hyperparameters whose values
must be carefully chosen and which often considerably impact performance. To avoid a time …
Neuroevolution in deep neural networks: Current trends and future challenges
A variety of methods have been applied to the architectural configuration and learning or
training of artificial deep neural networks (DNNs). These methods play a crucial role in the …
AutoKeras: An AutoML library for deep learning
To use deep learning, one needs to be familiar with various software tools like TensorFlow
or Keras, as well as various model architecture and optimization best practices. Despite …
Neural architecture search: Insights from 1000 papers
In the past decade, advances in deep learning have resulted in breakthroughs in a variety of
areas, including computer vision, natural language understanding, speech recognition, and …
BANANAS: Bayesian optimization with neural architectures for neural architecture search
Over the past half-decade, many methods have been considered for neural architecture
search (NAS). Bayesian optimization (BO), which has long had success in hyperparameter …
Variability and reproducibility in deep learning for medical image segmentation
Medical image segmentation is an important tool for current clinical applications. It is the
backbone of numerous clinical diagnosis methods, oncological treatments and computer …
NATS-Bench: Benchmarking NAS algorithms for architecture topology and size
Neural architecture search (NAS) has attracted a lot of attention and has been shown to
bring tangible benefits in a large number of applications in the past few years. Architecture …
Auto-PyTorch: Multi-fidelity metalearning for efficient and robust AutoDL
While early AutoML frameworks focused on optimizing traditional ML pipelines and their
hyperparameters, a recent trend in AutoML is to focus on neural architecture search. In this …
NAS-Bench-301 and the case for surrogate benchmarks for neural architecture search
Neural Architecture Search (NAS) is a logical next step in the automatic learning
of representations, but the development of NAS methods is slowed by high computational …
How powerful are performance predictors in neural architecture search?
Early methods in the rapidly developing field of neural architecture search (NAS) required
fully training thousands of neural networks. To reduce this extreme computational cost …