On hyperparameter optimization of machine learning algorithms: Theory and practice

L Yang, A Shami - Neurocomputing, 2020 - Elsevier
Machine learning algorithms have been used widely in various applications and
areas. To fit a machine learning model into different problems, its hyper-parameters must be …
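
A minimal sketch of one standard baseline in this area: random search over a hyper-parameter space with cross-validation. The estimator, dataset, and search ranges below are illustrative choices, not taken from the survey.

```python
# Illustrative random-search HPO sketch (not from the paper): sample SVM hyper-parameters
# from log-uniform ranges and score each configuration with 3-fold cross-validation.
from scipy.stats import loguniform
from sklearn.datasets import load_digits
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

# Hyper-parameters of the RBF-kernel SVM and the distributions they are sampled from.
param_distributions = {
    "C": loguniform(1e-2, 1e3),       # regularization strength
    "gamma": loguniform(1e-4, 1e-1),  # kernel width
}

search = RandomizedSearchCV(
    SVC(kernel="rbf"),
    param_distributions,
    n_iter=20,        # number of sampled configurations
    cv=3,             # folds of cross-validation per configuration
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```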

Artificial neural networks-based machine learning for wireless networks: A tutorial

M Chen, U Challita, W Saad, C Yin… - … Surveys & Tutorials, 2019 - ieeexplore.ieee.org
In order to effectively provide ultra reliable low latency communications and pervasive
connectivity for Internet of Things (IoT) devices, next-generation wireless networks can …

Transformers as statisticians: Provable in-context learning with in-context algorithm selection

Y Bai, F Chen, H Wang, C Xiong… - Advances in neural …, 2024 - proceedings.neurips.cc
Neural sequence models based on the transformer architecture have demonstrated
remarkable in-context learning (ICL) abilities, where they can perform new tasks …
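
A sketch of the in-context learning setup the abstract refers to, in illustrative notation (not necessarily the paper's): the model receives labelled examples and a query in its prompt and predicts the query label without any weight update, for instance by emulating ridge regression in context.

```latex
% Notation is mine; the paper studies transformers that implement such estimators in context.
\[
  \text{prompt} = (x_1, y_1, \dots, x_N, y_N, x_{N+1}),
  \qquad
  \hat{y}_{N+1} = \mathrm{TF}_\theta(\text{prompt}),
\]
\[
  \text{e.g.}\quad \hat{y}_{N+1} \approx x_{N+1}^{\top} \hat{w}_{\lambda},
  \qquad
  \hat{w}_{\lambda} = \arg\min_{w} \frac{1}{N} \sum_{i=1}^{N} \big(x_i^{\top} w - y_i\big)^2 + \lambda \|w\|_2^2 .
\]
```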

Training compute-optimal large language models

J Hoffmann, S Borgeaud, A Mensch… - arXiv preprint arXiv …, 2022 - arxiv.org
We investigate the optimal model size and number of tokens for training a transformer
language model under a given compute budget. We find that current large language models …
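
A minimal sketch, assuming the approximation C ≈ 6ND for training FLOPs and the paper's finding that parameter count and token count should be scaled in roughly equal proportion with compute (N_opt ∝ C^0.5, D_opt ∝ C^0.5). The calibration constant below is a hypothetical choice pinned to the reported roughly 70B-parameter, 1.4T-token configuration, not a value from the paper.

```python
# Illustrative compute-optimal split (not the paper's code): given a compute budget C in
# FLOPs, return an estimated parameter count N and token count D under C ≈ 6*N*D and
# equal-proportion scaling. The constant k is a hypothetical calibration so that
# C = 5.76e23 FLOPs maps to roughly N = 70e9 parameters and D = 1.4e12 tokens.
import math

def compute_optimal_split(C: float, k: float = 0.226) -> tuple[float, float]:
    """Return (parameters N, training tokens D) for a compute budget C in FLOPs."""
    N = k * math.sqrt(C / 6.0)   # N_opt grows roughly as C^0.5
    D = C / (6.0 * N)            # D then follows from C ≈ 6*N*D
    return N, D

if __name__ == "__main__":
    for C in (1e21, 1e22, 5.76e23):
        N, D = compute_optimal_split(C)
        print(f"C={C:.2e} FLOPs -> N≈{N:.2e} params, D≈{D:.2e} tokens")
```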

An empirical analysis of compute-optimal large language model training

J Hoffmann, S Borgeaud, A Mensch… - Advances in …, 2022 - proceedings.neurips.cc
We investigate the optimal model size and number of tokens for training a transformer
language model under a given compute budget. We find that current large language models …
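
A sketch of the parametric loss fit used in this line of work, with fitted constants omitted and notation that may differ from the paper: the final loss is modelled as a function of parameter count N and training tokens D, and minimizing it under a compute constraint yields roughly equal scaling exponents for N and D.

```latex
% Illustrative form only; E, A, B, \alpha, \beta are fitted constants not reproduced here.
\[
  L(N, D) \;=\; E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}},
  \qquad
  \min_{N, D} \; L(N, D) \quad \text{s.t.} \quad \mathrm{FLOPs}(N, D) \approx 6ND = C,
\]
\[
  N_{\mathrm{opt}}(C) \propto C^{a},
  \qquad
  D_{\mathrm{opt}}(C) \propto C^{b},
  \qquad
  a \approx b \approx 0.5 .
\]
```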

Federated multi-task learning under a mixture of distributions

O Marfoq, G Neglia, A Bellet… - Advances in Neural …, 2021 - proceedings.neurips.cc
The increasing size of data generated by smartphones and IoT devices motivated the
development of Federated Learning (FL), a framework for on-device collaborative training of …
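
A sketch of the modelling assumption suggested by the title, in notation of my own: each client's local data distribution is a mixture of a small number of shared underlying distributions, so a personalized predictor can be a client-weighted combination of shared component models.

```latex
% Notation is mine; component parameters \theta_1,\dots,\theta_M are learned collaboratively
% across clients, while the mixture weights \pi_i are personalized to client i.
\[
  \mathcal{D}_i \;=\; \sum_{m=1}^{M} \pi_{i,m}\, \tilde{\mathcal{D}}_m,
  \qquad
  \pi_{i,m} \ge 0, \quad \sum_{m=1}^{M} \pi_{i,m} = 1,
\]
\[
  \hat{y}_i(x) \;=\; \sum_{m=1}^{M} \pi_{i,m}\, h_{\theta_m}(x).
\]
```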

Personalized federated learning with moreau envelopes

CT Dinh, N Tran, J Nguyen - Advances in neural …, 2020 - proceedings.neurips.cc
Federated learning (FL) is a decentralized and privacy-preserving machine learning
technique in which a group of clients collaborate with a server to learn a global model …
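
A sketch of personalization via Moreau envelopes as the title suggests, in illustrative notation: each client keeps a personal model regularized toward the global model, and the server optimizes the resulting envelope objectives.

```latex
% Notation is mine; \lambda controls how strongly personal models are pulled toward the
% shared model w.
\[
  F_i(w) \;=\; \min_{\theta_i} \Big\{ f_i(\theta_i) + \frac{\lambda}{2}\, \|\theta_i - w\|^2 \Big\},
  \qquad
  \min_{w} \; \frac{1}{N} \sum_{i=1}^{N} F_i(w).
\]
```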

Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation

M Belkin - Acta Numerica, 2021 - cambridge.org
In the past decade the mathematical theory of machine learning has lagged far behind the
triumphs of deep neural networks on practical challenges. However, the gap between theory …
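
A small illustrative sketch of the interpolation regime the article surveys (the setup and numbers are mine, not the article's): with more features than samples, a linear model can fit noisy training data exactly, and the article examines why such interpolating solutions can nonetheless behave well on new data.

```python
# Minimum-norm ("ridgeless") interpolation in an overparameterized linear model.
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 200                       # fewer samples than features: interpolation is possible
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:5] = 1.0                     # sparse ground-truth signal
y = X @ w_true + 0.1 * rng.normal(size=n)   # noisy labels

w_hat = np.linalg.pinv(X) @ y        # minimum-norm solution among all exact interpolators
print("train MSE:", np.mean((X @ w_hat - y) ** 2))   # ~0: the noisy data are fit exactly

X_test = rng.normal(size=(1000, d))
y_test = X_test @ w_true
print("test MSE:", np.mean((X_test @ w_hat - y_test) ** 2))
```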

Nash learning from human feedback

R Munos, M Valko, D Calandriello, MG Azar… - arXiv preprint arXiv …, 2023 - ai-plans.com
Large language models (LLMs) (Anil et al., 2023; Glaese et al., 2022; OpenAI, 2023; Ouyang
et al., 2022) have made remarkable strides in enhancing natural language understanding …
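
A sketch of the objective suggested by the title, in notation of my own: rather than maximizing a scalar reward model, learn a preference model from human feedback and seek a policy that is a Nash equilibrium of the induced two-player preference game.

```latex
% Notation is mine; \mathcal{P}(y \succ y' \mid x) denotes a learned preference model.
\[
  \pi^{\star} \;=\; \arg\max_{\pi} \; \min_{\pi'} \;
  \mathbb{E}_{x, \; y \sim \pi(\cdot \mid x), \; y' \sim \pi'(\cdot \mid x)}
  \big[ \mathcal{P}(y \succ y' \mid x) \big].
\]
```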

Meta-learning with implicit gradients

A Rajeswaran, C Finn, SM Kakade… - Advances in neural …, 2019 - proceedings.neurips.cc
A core capability of intelligent systems is the ability to quickly learn new tasks by drawing on
prior experience. Gradient (or optimization) based meta-learning has recently emerged as …
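
A sketch of the implicit meta-gradient idea the title refers to, in illustrative notation: when inner-loop adaptation solves a regularized task objective, the outer (meta) gradient can be obtained via the implicit function theorem without backpropagating through the inner optimization steps.

```latex
% Notation is mine: \hat{\mathcal{L}}_i is the task-i training loss used for adaptation and
% \mathcal{L}_i the task-i validation loss used for the outer update.
\[
  \phi_i^{*}(\theta) \;=\; \arg\min_{\phi} \; \hat{\mathcal{L}}_i(\phi) + \frac{\lambda}{2} \|\phi - \theta\|^2,
\]
\[
  \frac{d\, \mathcal{L}_i\big(\phi_i^{*}(\theta)\big)}{d\theta}
  \;=\;
  \Big( I + \tfrac{1}{\lambda} \nabla^2_{\phi} \hat{\mathcal{L}}_i\big(\phi_i^{*}\big) \Big)^{-1}
  \nabla_{\phi} \mathcal{L}_i\big(\phi_i^{*}\big).
\]
```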