The shaky foundations of large language models and foundation models for electronic health records
The success of foundation models such as ChatGPT and AlphaFold has spurred significant
interest in building similar models for electronic medical records (EMRs) to improve patient …
Pre-trained language models in biomedical domain: A systematic survey
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …
A survey of large language models for healthcare: from data, technology, and applications to accountability and ethics
The utilization of large language models (LLMs) in the Healthcare domain has generated
both excitement and concern due to their ability to effectively respond to free-text queries with …
A comparative study of pretrained language models for long clinical text
Objective: Clinical knowledge-enriched transformer models (e.g., ClinicalBERT) have state-of-
the-art results on clinical natural language processing (NLP) tasks. One of the core …
A review on large language models: Architectures, applications, taxonomies, open issues and challenges
Large Language Models (LLMs) recently demonstrated extraordinary capability in various
natural language processing (NLP) tasks including language translation, text generation …
AMMU: a survey of transformer-based biomedical pretrained language models
KS Kalyan, A Rajasekharan, S Sangeetha - Journal of biomedical …, 2022 - Elsevier
Transformer-based pretrained language models (PLMs) have started a new era in modern
natural language processing (NLP). These models combine the power of transformers …
In-BoXBART: Get instructions into biomedical multi-task learning
Single-task models have proven pivotal in solving specific tasks; however, they have
limitations in real-world applications where multi-tasking is necessary and domain shifts are …
Hierarchical label-wise attention transformer model for explainable ICD coding
Abstract: International Classification of Diseases (ICD) coding plays an important role in
systematically classifying morbidity and mortality data. In this study, we propose a …
Transformers in healthcare: A survey
With Artificial Intelligence (AI) increasingly permeating various aspects of society, including
healthcare, the adoption of the Transformers neural network architecture is rapidly changing …
GPT-3 models are poor few-shot learners in the biomedical domain
Deep neural language models have set new breakthroughs in many tasks of Natural
Language Processing (NLP). Recent work has shown that deep transformer language …