Large language models in medicine

AJ Thirunavukarasu, DSJ Ting, K Elangovan… - Nature Medicine, 2023 - nature.com
Large language models (LLMs) can respond to free-text queries without being specifically
trained in the task in question, causing excitement and concern about their use in healthcare …

ChatGPT in healthcare: a taxonomy and systematic review

J Li, A Dada, B Puladi, J Kleesiek, J Egger - Computer Methods and …, 2024 - Elsevier
The recent release of ChatGPT, a chatbot research project/product in natural language
processing (NLP) from OpenAI, has stirred up a sensation among both the general public and …

LLaVA-Med: Training a large language-and-vision assistant for biomedicine in one day

C Li, C Wong, S Zhang, N Usuyama… - Advances in …, 2024 - proceedings.neurips.cc
Conversational generative AI has demonstrated remarkable promise for empowering
biomedical practitioners, but current investigations focus on unimodal text. Multimodal …

ChatGPT for shaping the future of dentistry: the potential of multi-modal large language model

H Huang, O Zheng, D Wang, J Yin, Z Wang… - International Journal of …, 2023 - nature.com
ChatGPT, a lightweight and conversational variant of Generative Pretrained Transformer 4 (GPT-
4) developed by OpenAI, is one of the milestone large language models (LLMs) with …

Shifting machine learning for healthcare from development to deployment and from models to data

A Zhang, L Xing, J Zou, JC Wu - Nature Biomedical Engineering, 2022 - nature.com
In the past decade, the application of machine learning (ML) to healthcare has helped drive
the automation of physician tasks as well as enhancements in clinical capabilities and …

A study of generative large language model for medical research and healthcare

C Peng, X Yang, A Chen, KE Smith… - NPJ digital …, 2023 - nature.com
There is enormous enthusiasm, as well as concern, about applying large language models (LLMs) to
healthcare. Yet current assumptions are based on general-purpose LLMs such as ChatGPT …

A survey on vision transformer

K Han, Y Wang, H Chen, X Chen, J Guo… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
The transformer, first applied in the field of natural language processing, is a type of deep neural
network based mainly on the self-attention mechanism. Thanks to its strong representation …
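
For readers unfamiliar with the mechanism this survey centres on, here is a minimal sketch of single-head scaled dot-product self-attention in NumPy; the shapes, weight matrices, and random inputs are illustrative assumptions, not drawn from the paper.

```python
# A minimal, self-contained sketch of single-head scaled dot-product self-attention.
# Shapes, weights, and inputs are illustrative assumptions, not taken from the survey.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); returns a new (seq_len, d_model) representation."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                 # project tokens to queries/keys/values
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # pairwise token similarities, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # each token mixes information from all tokens

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                          # 4 tokens, model width 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)           # (4, 8)
```

Stacking such layers with multi-head projections and feed-forward blocks yields the transformer encoder that vision transformers adapt to image patches.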

Deep Learning applications for COVID-19

C Shorten, TM Khoshgoftaar, B Furht - Journal of Big Data, 2021 - Springer
This survey explores how Deep Learning has battled the COVID-19 pandemic and provides
directions for future research on COVID-19. We cover Deep Learning applications in Natural …

Don't stop pretraining: Adapt language models to domains and tasks

S Gururangan, A Marasović, S Swayamdipta… - arXiv preprint arXiv …, 2020 - arxiv.org
Language models pretrained on text from a wide variety of sources form the foundation of
today's NLP. In light of the success of these broad-coverage models, we investigate whether …
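
The core idea this paper investigates is continuing pretraining on unlabeled in-domain text before task fine-tuning (domain-adaptive pretraining). Below is a hedged sketch using the Hugging Face Trainer; the checkpoint name, corpus file, and hyperparameters are placeholder assumptions, not the authors' configuration.

```python
# A hedged sketch of continued ("domain-adaptive") masked-LM pretraining with Hugging Face.
# The checkpoint name, corpus file, and hyperparameters are placeholder assumptions.
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")

# Assumed local file of unlabeled in-domain text (e.g. biomedical abstracts), one document per line.
corpus = load_dataset("text", data_files={"train": "domain_corpus.txt"})["train"]
corpus = corpus.map(lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
                    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dapt-roberta",
                           per_device_train_batch_size=8,
                           num_train_epochs=1,
                           learning_rate=5e-5),
    train_dataset=corpus,
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15),
)
trainer.train()   # the adapted checkpoint can then be fine-tuned on the downstream task
```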

MentalBERT: Publicly available pretrained language models for mental healthcare

S Ji, T Zhang, L Ansari, J Fu, P Tiwari… - arXiv preprint arXiv …, 2021 - arxiv.org
Mental health is a critical issue in modern society, and mental disorders can sometimes
progress to suicidal ideation without adequate treatment. Early detection of mental disorders and …