A survey of knowledge enhanced pre-trained language models

L Hu, Z Liu, Z Zhao, L Hou, L Nie… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-supervised learning, have yielded promising performance on various tasks in …

A review on language models as knowledge bases

B AlKhamissi, M Li, A Celikyilmaz, M Diab… - arXiv preprint arXiv …, 2022 - arxiv.org
Recently, there has been a surge of interest in the NLP community in the use of pretrained Language Models (LMs) as Knowledge Bases (KBs). Researchers have shown that LMs …
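
To make the LM-as-KB idea concrete, below is a minimal sketch of a cloze-style probe in the spirit of this line of work; the model choice and the example fact are illustrative assumptions, not details from the review.

```python
# Probe a masked LM for relational knowledge with a cloze query.
# Requires the `transformers` library; bert-base-uncased is an
# illustrative choice of model, not one named in the review.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The LM acts as a "knowledge base": query a fact via its template slot.
for pred in fill("Paris is the capital of [MASK].", top_k=3):
    print(f"{pred['token_str']:>10}  {pred['score']:.3f}")
```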

Is ChatGPT a general-purpose natural language processing task solver?

C Qin, A Zhang, Z Zhang, J Chen, M Yasunaga… - arXiv preprint arXiv …, 2023 - arxiv.org
Spurred by advancements in scale, large language models (LLMs) have demonstrated the ability to perform a variety of natural language processing (NLP) tasks zero-shot, i.e., without …

A large language model for electronic health records

X Yang, A Chen, N PourNejatian, HC Shin… - NPJ digital …, 2022 - nature.com
There is an increasing interest in developing artificial intelligence (AI) systems to process
and interpret electronic health records (EHRs). Natural language processing (NLP) powered …

BioGPT: generative pre-trained transformer for biomedical text generation and mining

R Luo, L Sun, Y Xia, T Qin, S Zhang… - Briefings in …, 2022 - academic.oup.com
Pre-trained language models have attracted increasing attention in the biomedical domain,
inspired by their great success in the general natural language domain. Among the two main …

Unified named entity recognition as word-word relation classification

J Li, H Fei, J Liu, S Wu, M Zhang, C Teng… - Proceedings of the AAAI …, 2022 - ojs.aaai.org
So far, named entity recognition (NER) has involved three major types: flat, overlapped (a.k.a. nested), and discontinuous NER, which have mostly been studied …
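
As a rough illustration of the word-word relation formulation named in the title, the toy decoder below recovers an entity from a grid of word-pair labels. The NNW (next-neighboring-word) and THW (tail-head-word) label names follow the paper, but the grid and decoding logic are simplified assumptions.

```python
# Toy word-word relation grid for NER: NNW links a word to its successor
# inside the same entity; THW-<type> links an entity's tail word back to
# its head word and carries the entity type. Simplified sketch, not the
# paper's full model.
words = ["I", "visited", "New", "York", "City"]
grid = {(2, 3): "NNW", (3, 4): "NNW", (4, 2): "THW-LOC"}

def decode(grid, words):
    nnw = {i: j for (i, j), lab in grid.items() if lab == "NNW"}
    entities = []
    for (tail, head), lab in grid.items():
        if lab.startswith("THW-"):
            span, cur = [head], head
            while cur != tail:          # follow the NNW chain to the tail
                cur = nnw[cur]
                span.append(cur)
            entities.append((" ".join(words[k] for k in span), lab[4:]))
    return entities

print(decode(grid, words))  # [('New York City', 'LOC')]
```

Because NNW links need not connect adjacent words, the same decoding also covers discontinuous entities, which is what makes the word-pair view a unified treatment of all three NER types.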

PTR: Prompt tuning with rules for text classification

X Han, W Zhao, N Ding, Z Liu, M Sun - AI Open, 2022 - Elsevier
Recently, prompt tuning has been widely applied to elicit the rich knowledge in pre-trained language models (PLMs) for NLP tasks. Although prompt tuning has achieved …

Template-based named entity recognition using BART

L Cui, Y Wu, J Liu, S Yang, Y Zhang - arXiv preprint arXiv:2106.01760, 2021 - arxiv.org
There has been recent interest in investigating few-shot NER, where the low-resource target domain has a different label set from the resource-rich source domain. Existing …
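
The template idea can be sketched as scoring label-specific statements about a candidate span under a seq2seq model; the checkpoint, templates, and scoring below are illustrative assumptions rather than the paper's exact setup.

```python
# Score NER templates with BART: rank "<span> is a ... entity" statements
# by decoder likelihood and assign the best-scoring label. Illustrative
# sketch under assumed templates, not the paper's exact configuration.
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tok = BartTokenizer.from_pretrained("facebook/bart-base")
bart = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

def template_score(sentence: str, template: str) -> float:
    enc = tok(sentence, return_tensors="pt")
    dec = tok(template, return_tensors="pt")
    with torch.no_grad():
        out = bart(input_ids=enc.input_ids, labels=dec.input_ids)
    return -out.loss.item()  # higher = more likely template

sentence = "ACME Corp opened an office in Berlin."
span = "Berlin"
templates = {
    "LOC": f"{span} is a location entity.",
    "ORG": f"{span} is an organization entity.",
    "O":   f"{span} is not a named entity.",
}
print(span, "->", max(templates, key=lambda t: template_score(sentence, templates[t])))
```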

Knowprompt: Knowledge-aware prompt-tuning with synergistic optimization for relation extraction

X Chen, N Zhang, X Xie, S Deng, Y Yao, C Tan… - Proceedings of the …, 2022 - dl.acm.org
Recently, prompt-tuning has achieved promising results for specific few-shot classification tasks. The core idea of prompt-tuning is to insert text pieces (i.e., templates) into the input and …
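
Applied to relation extraction, that masked-slot idea looks roughly like the sketch below; KnowPrompt's learnable virtual template and answer words with injected knowledge are replaced by a fixed template and a toy verbalizer, so every name here is an assumption.

```python
# Prompt-style relation extraction: place a [MASK] between the two entity
# mentions and map candidate relation words to scores at that position.
# Model, template, and verbalizer are illustrative assumptions only.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
mlm = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
verbalizer = {"founder_of": "founded", "employee_of": "joined"}

def relation(sentence: str, head: str, tail: str) -> str:
    prompt = f"{sentence} {head} {tok.mask_token} {tail}."
    inputs = tok(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = mlm(**inputs).logits
    pos = (inputs["input_ids"] == tok.mask_token_id).nonzero()[0, 1]
    return max(verbalizer, key=lambda r: logits[
        0, pos, tok.convert_tokens_to_ids(verbalizer[r])].item())

print(relation("Steve Jobs started Apple in 1976.", "Steve Jobs", "Apple"))
```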

REBEL: Relation extraction by end-to-end language generation

PLH Cabot, R Navigli - Findings of the Association for …, 2021 - aclanthology.org
Extracting relation triplets from raw text is a crucial task in Information Extraction, enabling multiple applications such as populating or validating knowledge bases, fact-checking, and …
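
A minimal sketch of this generative formulation, assuming the publicly released Babelscape/rebel-large checkpoint and its linearized triplet markers (both are assumptions about the public release, not details from this snippet):

```python
# Relation extraction as seq2seq generation: the decoder emits triplets in
# a linearized form, roughly "<triplet> head <subj> tail <obj> relation".
# Checkpoint name and marker layout are assumptions about the public
# REBEL release.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("Babelscape/rebel-large")
model = AutoModelForSeq2SeqLM.from_pretrained("Babelscape/rebel-large")

text = "The Mona Lisa was painted by Leonardo da Vinci."
inputs = tok(text, return_tensors="pt")
with torch.no_grad():
    ids = model.generate(**inputs, max_length=128)

# Keep the triplet markers when decoding, then parse them back out.
decoded = tok.decode(ids[0], skip_special_tokens=False)
for noise in ("<s>", "</s>", "<pad>"):
    decoded = decoded.replace(noise, "")

for chunk in decoded.split("<triplet>")[1:]:
    head, _, rest = chunk.partition("<subj>")
    tail, _, rel = rest.partition("<obj>")
    print((head.strip(), rel.strip(), tail.strip()))
```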