Pre-trained language models in biomedical domain: A systematic survey
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …
A comprehensive survey on word representation models: From classical to state-of-the-art word representation language models
Word representation has always been an important research area in the history of natural
language processing (NLP). Understanding such complex text data is imperative, given that …
COVIDSenti: A large-scale benchmark Twitter data set for COVID-19 sentiment analysis
Social media (and the world at large) have been awash with news of the COVID-19
pandemic. With the passage of time, news and awareness about COVID-19 spread like the …
A survey of large language models for healthcare: from data, technology, and applications to accountability and ethics
The utilization of large language models (LLMs) in the Healthcare domain has generated
both excitement and concern due to their ability to effectively respond to free-text queries with …
Neural natural language processing for unstructured data in electronic health records: a review
Electronic health records (EHRs), digital collections of patient healthcare events and
observations, are ubiquitous in medicine and critical to healthcare delivery, operations, and …
AMMU: a survey of transformer-based biomedical pretrained language models
KS Kalyan, A Rajasekharan, S Sangeetha - Journal of biomedical …, 2022 - Elsevier
Transformer-based pretrained language models (PLMs) have started a new era in modern
natural language processing (NLP). These models combine the power of transformers …
Foundation and large language models: fundamentals, challenges, opportunities, and social impacts
D Myers, R Mohawesh, VI Chellaboina, AL Sathvik… - Cluster …, 2024 - Springer
Abstract Foundation and Large Language Models (FLLMs) are models that are trained using
a massive amount of data with the intent to perform a variety of downstream tasks. FLLMs …
Vesper: A compact and effective pretrained model for speech emotion recognition
This paper presents a paradigm that adapts general large-scale pretrained models (PTMs)
to the speech emotion recognition task. Although PTMs shed new light on artificial general …
A comparative analysis of active learning for biomedical text mining
An enormous amount of clinical free-text information, such as pathology reports, progress
reports, clinical notes and discharge summaries, has been collected at hospitals and …
CyBERT: Cybersecurity claim classification by fine-tuning the BERT language model
We introduce CyBERT, a cybersecurity feature claims classifier based on Bidirectional
Encoder Representations from Transformers and a key component in our semi-automated …