Unsupervised graph neural architecture search with disentangled self-supervision
The existing graph neural architecture search (GNAS) methods heavily rely on supervised
labels during the search process, failing to handle ubiquitous scenarios where supervisions …
Advances in neural architecture search
Automated machine learning (AutoML) has achieved remarkable success in automating the
non-trivial process of designing machine learning models. Among the focal areas of AutoML …
Nasiam: Efficient representation learning using neural architecture search for siamese networks
Siamese networks are one of the most trending methods to achieve self-supervised visual
representation learning (SSL). Since hand labeling is costly, SSL can play a crucial part by …
Minimizing parameter overhead in self supervised models for target task
J Kishore, S Mukherjee - IEEE Transactions on Artificial …, 2023 - ieeexplore.ieee.org
Supervised deep learning models encounter two major challenges: labeled datasets for
training and parameter overhead, which leads to extensive GPU usage and other …