Pre-trained language models in biomedical domain: A systematic survey

B Wang, Q Xie, J Pei, Z Chen, P Tiwari, Z Li… - ACM Computing …, 2023 - dl.acm.org
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …

A comprehensive survey on word representation models: From classical to state-of-the-art word representation language models

U Naseem, I Razzak, SK Khan, M Prasad - Transactions on Asian and …, 2021 - dl.acm.org
Word representation has always been an important research area in the history of natural
language processing (NLP). Understanding such complex text data is imperative, given that …

COVIDSenti: A large-scale benchmark Twitter data set for COVID-19 sentiment analysis

U Naseem, I Razzak, M Khushi… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Social media (and the world at large) have been awash with news of the COVID-19
pandemic. With the passage of time, news and awareness about COVID-19 spread like the …

A survey of large language models for healthcare: from data, technology, and applications to accountability and ethics

K He, R Mao, Q Lin, Y Ruan, X Lan, M Feng… - arXiv preprint arXiv …, 2023 - arxiv.org
The utilization of large language models (LLMs) in the healthcare domain has generated
both excitement and concern due to their ability to effectively respond to free-text queries with …

Neural natural language processing for unstructured data in electronic health records: a review

I Li, J Pan, J Goldwasser, N Verma, WP Wong… - Computer Science …, 2022 - Elsevier
Electronic health records (EHRs), digital collections of patient healthcare events and
observations, are ubiquitous in medicine and critical to healthcare delivery, operations, and …

AMMU: a survey of transformer-based biomedical pretrained language models

KS Kalyan, A Rajasekharan, S Sangeetha - Journal of biomedical …, 2022 - Elsevier
Transformer-based pretrained language models (PLMs) have started a new era in modern
natural language processing (NLP). These models combine the power of transformers …

Foundation and large language models: fundamentals, challenges, opportunities, and social impacts

D Myers, R Mohawesh, VI Chellaboina, AL Sathvik… - Cluster …, 2024 - Springer
Foundation and Large Language Models (FLLMs) are models that are trained using
a massive amount of data with the intent to perform a variety of downstream tasks. FLLMs …

Vesper: A compact and effective pretrained model for speech emotion recognition

W Chen, X Xing, P Chen, X Xu - IEEE Transactions on Affective …, 2024 - ieeexplore.ieee.org
This paper presents a paradigm that adapts general large-scale pretrained models (PTMs)
to the speech emotion recognition task. Although PTMs shed new light on artificial general …

A comparative analysis of active learning for biomedical text mining

U Naseem, M Khushi, SK Khan, K Shaukat… - Applied System …, 2021 - mdpi.com
An enormous amount of clinical free-text information, such as pathology reports, progress
reports, clinical notes and discharge summaries have been collected at hospitals and …

CyBERT: Cybersecurity claim classification by fine-tuning the BERT language model

K Ameri, M Hempel, H Sharif, J Lopez Jr… - Journal of Cybersecurity …, 2021 - mdpi.com
We introduce CyBERT, a cybersecurity feature claims classifier based on bidirectional
encoder representations from transformers and a key component in our semi-automated …