A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT
Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks with different data modalities. A PFM (e.g., BERT, ChatGPT, and GPT-4) is …
A Survey on Self-supervised Learning: Algorithms, Applications, and Future Trends
Deep supervised learning algorithms typically require a large volume of labeled data to
achieve satisfactory performance. However, the process of collecting and labeling such data …
Self-supervised learning from images with a joint-embedding predictive architecture
This paper demonstrates an approach for learning highly semantic image representations
without relying on hand-crafted data-augmentations. We introduce the Image-based Joint …
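As a hedged illustration of the joint-embedding predictive idea named in the title above, here is a minimal sketch of a generic predictive training step: a predictor maps context representations to representations produced by a separate target encoder, and the loss is computed in representation space rather than pixel space. The module names and loss choice are assumptions for illustration; the sketch omits the paper's specific multi-block masking strategy and is not the authors' implementation.

```python
# Minimal sketch of a joint-embedding predictive training step (illustrative only).
# context_encoder, target_encoder, and predictor are assumed torch.nn.Module instances;
# x_context / x_target stand in for the context and target crops or patch blocks.
import torch
import torch.nn.functional as F

def jepa_style_step(context_encoder, target_encoder, predictor, x_context, x_target):
    with torch.no_grad():
        target_repr = target_encoder(x_target)        # targets are representations, not pixels
    pred = predictor(context_encoder(x_context))      # predict target representations from context
    return F.mse_loss(pred, target_repr)              # simple L2 regression loss in representation space
```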
A metaverse: Taxonomy, components, applications, and open challenges
SM Park, YG Kim - IEEE Access, 2022 - ieeexplore.ieee.org
Unlike previous studies on the Metaverse based on Second Life, the current Metaverse is
based on the social value of Generation Z, namely that online and offline selves are not different …
Learn from others and be yourself in heterogeneous federated learning
Federated learning has emerged as an important distributed learning paradigm, which
normally involves collaborative updating with others and local updating on private data …
Barlow Twins: Self-supervised learning via redundancy reduction
Self-supervised learning (SSL) is rapidly closing the gap with supervised methods on large
computer vision benchmarks. A successful approach to SSL is to learn embeddings which …
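To make the redundancy-reduction objective mentioned in this entry concrete, here is a minimal sketch of a Barlow Twins-style loss: the cross-correlation matrix between embeddings of two augmented views is pushed toward the identity, so the views agree on each dimension while different dimensions are decorrelated. The normalization details, epsilon, and off-diagonal weight are assumptions for illustration, not the authors' exact settings.

```python
# Minimal sketch of a Barlow Twins-style redundancy-reduction loss (illustrative only).
import torch

def barlow_twins_loss(z_a, z_b, lambda_offdiag=5e-3):
    """z_a, z_b: (N, D) embeddings of two augmented views of the same batch."""
    n, d = z_a.shape
    # Standardize each embedding dimension across the batch.
    z_a = (z_a - z_a.mean(0)) / (z_a.std(0) + 1e-6)
    z_b = (z_b - z_b.mean(0)) / (z_b.std(0) + 1e-6)
    # Cross-correlation matrix between the two views' embeddings.
    c = (z_a.T @ z_b) / n                                          # (D, D)
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()                 # push diagonal toward 1 (invariance)
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()    # decorrelate different dimensions
    return on_diag + lambda_offdiag * off_diag
```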
Pre-training molecular graph representation with 3D geometry
Molecular graph representation learning is a fundamental problem in modern drug and
material discovery. Molecular graphs are typically modeled by their 2D topological …
Self-supervised learning in remote sensing: A review
Y Wang, CM Albrecht, NAA Braham… - IEEE Geoscience and …, 2022 - ieeexplore.ieee.org
In deep learning research, self-supervised learning (SSL) has received great attention,
triggering interest within both the computer vision and remote sensing communities. While …
VICReg: Variance-invariance-covariance regularization for self-supervised learning
Recent self-supervised methods for image representation learning are based on maximizing
the agreement between embedding vectors from different views of the same image. A trivial …
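As a hedged sketch of how the variance, invariance, and covariance terms named in the title can avoid the trivial (collapsed) solution alluded to in this snippet, the following toy loss combines the three terms: an invariance term pulls the two views together, a variance term keeps each embedding dimension spread out, and a covariance term decorrelates dimensions. The coefficients, epsilon, and function name are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a VICReg-style objective (illustrative only).
import torch
import torch.nn.functional as F

def vicreg_style_loss(z_a, z_b, sim_coef=25.0, std_coef=25.0, cov_coef=1.0, eps=1e-4):
    """z_a, z_b: (N, D) embeddings of two views of the same batch."""
    n, d = z_a.shape
    # Invariance: pull embeddings of the two views together.
    inv = F.mse_loss(z_a, z_b)
    # Variance: keep each dimension's std above 1 to prevent collapse.
    std_a = torch.sqrt(z_a.var(dim=0) + eps)
    std_b = torch.sqrt(z_b.var(dim=0) + eps)
    var = torch.mean(F.relu(1 - std_a)) + torch.mean(F.relu(1 - std_b))
    # Covariance: penalize off-diagonal entries of each view's covariance matrix.
    z_a_c = z_a - z_a.mean(dim=0)
    z_b_c = z_b - z_b.mean(dim=0)
    cov_a = (z_a_c.T @ z_a_c) / (n - 1)
    cov_b = (z_b_c.T @ z_b_c) / (n - 1)
    off_diag = lambda m: (m - torch.diag(torch.diagonal(m))).pow(2).sum() / d
    cov = off_diag(cov_a) + off_diag(cov_b)
    return sim_coef * inv + std_coef * var + cov_coef * cov
```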
Contrastive and non-contrastive self-supervised learning recover global and local spectral embedding methods
R Balestriero, Y LeCun - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Self-Supervised Learning (SSL) surmises that inputs and pairwise positive
relationships are enough to learn meaningful representations. Although SSL has recently …