Pre-trained language models in biomedical domain: A systematic survey
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …
AMMU: a survey of transformer-based biomedical pretrained language models
KS Kalyan, A Rajasekharan, S Sangeetha - Journal of biomedical …, 2022 - Elsevier
Transformer-based pretrained language models (PLMs) have started a new era in modern
natural language processing (NLP). These models combine the power of transformers …
Self-alignment pretraining for biomedical entity representations
Despite the widespread success of self-supervised learning via masked language models
(MLM), accurately capturing fine-grained semantic relationships in the biomedical domain …
A survey of large language models for healthcare: from data, technology, and applications to accountability and ethics
The utilization of large language models (LLMs) in the Healthcare domain has generated
both excitement and concern due to their ability to effectively respond to free-text queries with …
BioBART: Pretraining and evaluation of a biomedical generative language model
Pretrained language models have served as important backbones for natural language
processing. Recently, in-domain pretraining has been shown to benefit various domain …
A survey on clinical natural language processing in the United Kingdom from 2007 to 2022
Much of the knowledge and information needed for enabling high-quality clinical research is
stored in free-text format. Natural language processing (NLP) has been used to extract …
Fast, effective, and self-supervised: Transforming masked language models into universal lexical and sentence encoders
Pretrained Masked Language Models (MLMs) have revolutionised NLP in recent years.
However, previous work has indicated that off-the-shelf MLMs are not effective as universal …
A comprehensive evaluation of large language models on benchmark biomedical text processing tasks
Recently, Large Language Models (LLMs) have demonstrated impressive
capability to solve a wide range of tasks. However, despite their success across various …
Does the magic of BERT apply to medical code assignment? A quantitative study
S Ji, M Hölttä, P Marttinen - Computers in biology and medicine, 2021 - Elsevier
Unsupervised pretraining is an integral part of many natural language processing systems,
and transfer learning with language models has achieved remarkable results in downstream …
An overview of biomedical entity linking throughout the years
E French, BT McInnes - Journal of biomedical informatics, 2023 - Elsevier
Biomedical Entity Linking (BEL) is the task of mapping spans of text within
biomedical documents to normalized, unique identifiers within an ontology. This is an …