A survey of human-in-the-loop for machine learning

X Wu, L Xiao, Y Sun, J Zhang, T Ma, L He - Future Generation Computer …, 2022 - Elsevier
Abstract Machine learning has become the state-of-the-art technique for many tasks
including computer vision, natural language processing, speech processing tasks, etc …

Pre-trained language models and their applications

H Wang, J Li, H Wu, E Hovy, Y Sun - Engineering, 2022 - Elsevier
Pre-trained language models have achieved striking success in natural language
processing (NLP), leading to a paradigm shift from supervised learning to pre-training …

The programmer's assistant: Conversational interaction with a large language model for software development

SI Ross, F Martinez, S Houde, M Muller… - Proceedings of the 28th …, 2023 - dl.acm.org
Large language models (LLMs) have recently been applied in software engineering to
perform tasks such as translating code between programming languages, generating code …

GPT (generative pre-trained transformer) – a comprehensive review on enabling technologies, potential applications, emerging challenges, and future directions

G Yenduri, M Ramalingam, GC Selvi, Y Supriya… - IEEE …, 2024 - ieeexplore.ieee.org
The Generative Pre-trained Transformer (GPT) represents a notable breakthrough in the
domain of natural language processing, which is propelling us toward the development of …

Large-scale multi-modal pre-trained models: A comprehensive survey

X Wang, G Chen, G Qian, P Gao, XY Wei… - Machine Intelligence …, 2023 - Springer
With the urgent demand for generalized deep models, many pre-trained big models are
proposed, such as bidirectional encoder representations (BERT), vision transformer (ViT) …

Pre-trained language models for text generation: A survey

J Li, T Tang, WX Zhao, JY Nie, JR Wen - ACM Computing Surveys, 2024 - dl.acm.org
Text Generation aims to produce plausible and readable text in human language from input
data. The resurgence of deep learning has greatly advanced this field, in particular, with the …

Rethinking explainability as a dialogue: A practitioner's perspective

H Lakkaraju, D Slack, Y Chen, C Tan… - arXiv preprint arXiv …, 2022 - arxiv.org
As practitioners increasingly deploy machine learning models in critical domains such as
health care, finance, and policy, it becomes vital to ensure that domain experts function …

BERT models for Arabic text classification: a systematic review

AS Alammary - Applied Sciences, 2022 - mdpi.com
Bidirectional Encoder Representations from Transformers (BERT) has gained increasing
attention from researchers and practitioners as it has proven to be an invaluable technique …

Taxonomic classification of DNA sequences beyond sequence similarity using deep neural networks

F Mock, F Kretschmer, A Kriese… - Proceedings of the …, 2022 - National Acad Sciences
Taxonomic classification, that is, the assignment to biological clades with shared ancestry, is
a common task in genetics, mainly based on a genome similarity search of large genome …

Conversational question answering: A survey

M Zaib, WE Zhang, QZ Sheng, A Mahmood… - … and Information Systems, 2022 - Springer
Question answering (QA) systems provide a way of querying the information available in
various formats including, but not limited to, unstructured and structured data in natural …