Deep learning modelling techniques: current progress, applications, advantages, and challenges
Deep learning (DL) is revolutionizing evidence-based decision-making techniques that can
be applied across various sectors. Specifically, it possesses the ability to utilize two or more …
Large-scale multi-modal pre-trained models: A comprehensive survey
With the urgent demand for generalized deep models, many large pre-trained models have been
proposed, such as Bidirectional Encoder Representations from Transformers (BERT), Vision Transformer (ViT) …
LaMDA: Language models for dialog applications
We present LaMDA: Language Models for Dialog Applications. LaMDA is a family of
Transformer-based neural language models specialized for dialog, which have up to 137B …
GLaM: Efficient scaling of language models with mixture-of-experts
Scaling language models with more data, compute and parameters has driven significant
progress in natural language processing. For example, thanks to scaling, GPT-3 was able to …
Text and code embeddings by contrastive pre-training
Text embeddings are useful features in many applications such as semantic search and
computing text similarity. Previous work typically trains models customized for different use …
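As a rough illustration of the downstream use such embeddings enable, the sketch below ranks a tiny corpus against a query by cosine similarity. The strings and random vectors are stand-ins chosen for illustration; in practice the vectors would come from a contrastively pre-trained text encoder.

```python
# Sketch: semantic search over text embeddings by cosine similarity.
# Random vectors stand in for encoder outputs so the example runs end to end.
import numpy as np

rng = np.random.default_rng(0)
corpus = ["how to reset a password", "best pasta recipes", "gpu out of memory error"]
corpus_emb = rng.normal(size=(len(corpus), 768))   # stand-in for encoded documents
query_emb = rng.normal(size=768)                   # stand-in for the encoded query

def cosine_scores(docs, query):
    # Cosine similarity = dot product of L2-normalised vectors.
    docs = docs / np.linalg.norm(docs, axis=-1, keepdims=True)
    query = query / np.linalg.norm(query)
    return docs @ query

scores = cosine_scores(corpus_emb, query_emb)      # one score per document
best = int(np.argmax(scores))
print(f"top match: {corpus[best]!r} (score={scores[best]:.3f})")
```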
One embedder, any task: Instruction-finetuned text embeddings
We introduce INSTRUCTOR, a new method for computing text embeddings given task
instructions: every text input is embedded together with instructions explaining the use case …
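A minimal sketch of the input format this describes, where the task instruction is encoded together with the text so the same sentence can yield different task-specific embeddings. The `encode` stub and instruction strings below are hypothetical placeholders, not the paper's model or prompt wording.

```python
# Sketch: instruction-conditioned embeddings. A real instruction-finetuned
# encoder would replace the stub; this only shows the (instruction, text)
# pairing that gets embedded.
from typing import List

def encode(instruction: str, text: str) -> List[float]:
    # Placeholder encoder: returns a dummy vector so the sketch runs.
    return [float(hash((instruction, text)) % 1000) / 1000.0]

text = "contrastive learning of sentence embeddings"
instructions = [
    "Represent the sentence for retrieving related research papers:",
    "Represent the sentence for clustering by topic:",
]
for instruction in instructions:
    # Same text, different instruction -> a different task-specific embedding.
    print(instruction, "->", encode(instruction, text))
```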
SimCSE: Simple contrastive learning of sentence embeddings
This paper presents SimCSE, a simple contrastive learning framework that greatly advances
state-of-the-art sentence embeddings. We first describe an unsupervised approach, which …
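The sketch below illustrates an in-batch contrastive (InfoNCE-style) objective of the kind such sentence-embedding methods optimise: each sentence's two views should be closer to each other than to the other sentences in the batch. Random stand-in vectors replace encoder outputs, so this illustrates the objective rather than the paper's implementation.

```python
# Sketch: in-batch contrastive (InfoNCE-style) loss over two views per sentence.
# Stand-in vectors replace encoder outputs; in SimCSE-style unsupervised training
# the two views come from encoding the same sentence twice under different dropout.
import numpy as np

rng = np.random.default_rng(0)
batch, dim, temperature = 4, 8, 0.05

z1 = rng.normal(size=(batch, dim))             # first view of each sentence
z2 = z1 + 0.1 * rng.normal(size=(batch, dim))  # slightly perturbed second view

def normalise(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Cosine similarity between every first view and every second view.
sim = (normalise(z1) @ normalise(z2).T) / temperature          # (batch, batch)

# The positive for row i is column i (its own second view); all other columns
# act as in-batch negatives. Row-wise cross-entropy gives the loss.
log_probs = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_probs))
print(f"contrastive loss: {loss:.4f}")
```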
ConSERT: A contrastive framework for self-supervised sentence representation transfer
Learning high-quality sentence representations benefits a wide range of natural language
processing tasks. Though BERT-based pre-trained language models achieve high …
An introduction to deep learning in natural language processing: Models, techniques, and tools
Abstract Natural Language Processing (NLP) is a branch of artificial intelligence that
involves the design and implementation of systems and algorithms able to interact through …
Learning transferable visual models from natural language supervision
State-of-the-art computer vision systems are trained to predict a fixed set of predetermined
object categories. This restricted form of supervision limits their generality and usability since …
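To contrast with a fixed set of predetermined categories, the sketch below shows zero-shot classification in a CLIP-style joint image-text embedding space: class names become natural-language prompts, and the image is assigned to the prompt it is most similar to. Random vectors stand in for real encoder outputs, so only the mechanism is illustrated.

```python
# Sketch: zero-shot classification via image-text embedding similarity.
# Random vectors stand in for the image and text encoders' outputs.
import numpy as np

rng = np.random.default_rng(0)
classes = ["dog", "cat", "airplane"]
prompts = [f"a photo of a {c}" for c in classes]   # natural-language class prompts

image_emb = rng.normal(size=512)                   # stand-in image embedding
text_embs = rng.normal(size=(len(prompts), 512))   # stand-in prompt embeddings

def normalise(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

logits = normalise(text_embs) @ (image_emb / np.linalg.norm(image_emb))
probs = np.exp(logits) / np.exp(logits).sum()      # softmax over class prompts
for c, p in zip(classes, probs):
    print(f"{c}: {p:.2f}")
```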