The shaky foundations of large language models and foundation models for electronic health records

M Wornow, Y Xu, R Thapa, B Patel, E Steinberg… - npj Digital …, 2023 - nature.com
The success of foundation models such as ChatGPT and AlphaFold has spurred significant
interest in building similar models for electronic medical records (EMRs) to improve patient …

Pre-trained language models in biomedical domain: A systematic survey

B Wang, Q Xie, J Pei, Z Chen, P Tiwari, Z Li… - ACM Computing …, 2023 - dl.acm.org
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …

A survey of large language models for healthcare: from data, technology, and applications to accountability and ethics

K He, R Mao, Q Lin, Y Ruan, X Lan, M Feng… - arXiv preprint arXiv …, 2023 - arxiv.org
The utilization of large language models (LLMs) in the healthcare domain has generated
both excitement and concern due to their ability to effectively respond to free-text queries with …

A comparative study of pretrained language models for long clinical text

Y Li, RM Wehbe, FS Ahmad, H Wang… - Journal of the American …, 2023 - academic.oup.com
Objective: Clinical knowledge-enriched transformer models (eg, ClinicalBERT) have state-of-
the-art results on clinical natural language processing (NLP) tasks. One of the core …

A review on large language models: Architectures, applications, taxonomies, open issues and challenges

MAK Raiaan, MSH Mukta, K Fatema, NM Fahad… - IEEE …, 2024 - ieeexplore.ieee.org
Large Language Models (LLMs) have recently demonstrated extraordinary capability in various
natural language processing (NLP) tasks, including language translation, text generation …

AMMU: a survey of transformer-based biomedical pretrained language models

KS Kalyan, A Rajasekharan, S Sangeetha - Journal of biomedical …, 2022 - Elsevier
Transformer-based pretrained language models (PLMs) have started a new era in modern
natural language processing (NLP). These models combine the power of transformers …

In-BoXBART: Get instructions into biomedical multi-task learning

M Parmar, S Mishra, M Purohit, M Luo… - arXiv preprint arXiv …, 2022 - arxiv.org
Single-task models have proven pivotal in solving specific tasks; however, they have
limitations in real-world applications where multi-tasking is necessary and domain shifts are …

Hierarchical label-wise attention transformer model for explainable ICD coding

L Liu, O Perez-Concha, A Nguyen, V Bennett… - Journal of biomedical …, 2022 - Elsevier
International Classification of Diseases (ICD) coding plays an important role in
systematically classifying morbidity and mortality data. In this study, we propose a …

Transformers in healthcare: A survey

S Nerella, S Bandyopadhyay, J Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
With Artificial Intelligence (AI) increasingly permeating various aspects of society, including
healthcare, the adoption of the Transformer neural network architecture is rapidly changing …

GPT-3 models are poor few-shot learners in the biomedical domain

M Moradi, K Blagec, F Haberl, M Samwald - arXiv preprint arXiv …, 2021 - arxiv.org
Deep neural language models have achieved breakthroughs in many Natural
Language Processing (NLP) tasks. Recent work has shown that deep transformer language …