Unsupervised graph neural architecture search with disentangled self-supervision

Z Zhang, X Wang, Z Zhang, G Shen… - Advances in Neural …, 2024 - proceedings.neurips.cc
The existing graph neural architecture search (GNAS) methods heavily rely on supervised
labels during the search process, failing to handle ubiquitous scenarios where supervisions …

Advances in neural architecture search

X Wang, W Zhu - National Science Review, 2024 - academic.oup.com
Automated machine learning (AutoML) has achieved remarkable success in automating the
non-trivial process of designing machine learning models. Among the focal areas of AutoML …

NASiam: Efficient representation learning using neural architecture search for Siamese networks

A Heuillet, H Tabia, H Arioui - Procedia Computer Science, 2023 - Elsevier
Siamese networks are among the most popular methods for achieving self-supervised visual
representation learning (SSL). Since hand labeling is costly, SSL can play a crucial part by …

Minimizing parameter overhead in self-supervised models for target task

J Kishore, S Mukherjee - IEEE Transactions on Artificial …, 2023 - ieeexplore.ieee.org
Supervised deep learning models encounter two major challenges: the need for labeled training
datasets and parameter overhead, which leads to extensive GPU usage and other …