Shifting machine learning for healthcare from development to deployment and from models to data

A Zhang, L Xing, J Zou, JC Wu - Nature Biomedical Engineering, 2022 - nature.com
In the past decade, the application of machine learning (ML) to healthcare has helped drive
the automation of physician tasks as well as enhancements in clinical capabilities and …

Deep learning modelling techniques: current progress, applications, advantages, and challenges

SF Ahmed, MSB Alam, M Hassan, MR Rozbu… - Artificial Intelligence …, 2023 - Springer
Deep learning (DL) is revolutionizing evidence-based decision-making techniques that can
be applied across various sectors. Specifically, it possesses the ability to utilize two or more …

Domain-specific language model pretraining for biomedical natural language processing

Y Gu, R Tinn, H Cheng, M Lucas, N Usuyama… - ACM Transactions on …, 2021 - dl.acm.org
Pretraining large neural language models, such as BERT, has led to impressive gains on
many natural language processing (NLP) tasks. However, most pretraining efforts focus on …

Publicly available clinical BERT embeddings

E Alsentzer, JR Murphy, W Boag, WH Weng… - arXiv preprint arXiv …, 2019 - arxiv.org
Contextual word embedding models such as ELMo (Peters et al., 2018) and BERT (Devlin et
al., 2018) have dramatically improved performance for many natural language processing …

Transfer learning in biomedical natural language processing: an evaluation of BERT and ELMo on ten benchmarking datasets

Y Peng, S Yan, Z Lu - arXiv preprint arXiv:1906.05474, 2019 - arxiv.org
Inspired by the success of the General Language Understanding Evaluation benchmark, we
introduce the Biomedical Language Understanding Evaluation (BLUE) benchmark to …

ClinicalBERT: Modeling clinical notes and predicting hospital readmission

K Huang, J Altosaar, R Ranganath - arXiv preprint arXiv:1904.05342, 2019 - arxiv.org
Clinical notes contain information about patients that goes beyond structured data like lab
values and medications. However, clinical notes have been underused relative to structured …

Pretrained language models for biomedical and clinical tasks: understanding and extending the state-of-the-art

P Lewis, M Ott, J Du, V Stoyanov - Proceedings of the 3rd clinical …, 2020 - aclanthology.org
A large array of pretrained models is available to the biomedical NLP (BioNLP) community.
Finding the best model for a particular task can be difficult and time-consuming. For many …

CharacterBERT: Reconciling ELMo and BERT for word-level open-vocabulary representations from characters

HE Boukkouri, O Ferret, T Lavergne, H Noji… - arXiv preprint arXiv …, 2020 - arxiv.org
Due to the compelling improvements brought by BERT, many recent representation models
adopted the Transformer architecture as their main building block, consequently inheriting …

Pre-trained language models in biomedical domain: A systematic survey

B Wang, Q Xie, J Pei, Z Chen, P Tiwari, Z Li… - ACM Computing …, 2023 - dl.acm.org
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …

Challenges and opportunities beyond structured data in analysis of electronic health records

M Tayefi, P Ngo, T Chomutare… - Wiley …, 2021 - Wiley Online Library
Electronic health records (EHR) contain a wealth of valuable information about individual
patients and the whole population. Besides structured data, unstructured data in EHRs can …