Deep transfer learning & beyond: Transformer language models in information systems research

R Gruetzemacher, D Paradice - ACM Computing Surveys (CSUR), 2022 - dl.acm.org
AI is widely thought to be poised to transform business, yet current perceptions of the scope
of this transformation may be myopic. Recent progress in natural language processing …

A survey on machine reading comprehension systems

R Baradaran, R Ghiasi, H Amirkhani - Natural Language Engineering, 2022 - cambridge.org
Machine Reading Comprehension (MRC) is a challenging task and a hot topic in Natural
Language Processing. The goal of this field is to develop systems for answering the …

Revisiting pre-trained models for Chinese natural language processing

Y Cui, W Che, T Liu, B Qin, S Wang, G Hu - arXiv preprint arXiv …, 2020 - arxiv.org
Bidirectional Encoder Representations from Transformers (BERT) has shown marvelous
improvements across various NLP tasks, and consecutive variants have been proposed to …

ALBERT: A lite BERT for self-supervised learning of language representations

Z Lan, M Chen, S Goodman, K Gimpel… - arXiv preprint arXiv …, 2019 - arxiv.org
Increasing model size when pretraining natural language representations often results in
improved performance on downstream tasks. However, at some point further model …

XLNet: Generalized autoregressive pretraining for language understanding

Z Yang, Z Dai, Y Yang, J Carbonell… - Advances in neural …, 2019 - proceedings.neurips.cc
With the capability of modeling bidirectional contexts, denoising autoencoding-based
pretraining like BERT achieves better performance than pretraining approaches based on …

Pre-training with whole word masking for Chinese BERT

Y Cui, W Che, T Liu, B Qin… - IEEE/ACM Transactions on …, 2021 - ieeexplore.ieee.org
Bidirectional Encoder Representations from Transformers (BERT) has shown marvelous
improvements across various NLP tasks, and its consecutive variants have been proposed …

Semantics-aware BERT for language understanding

Z Zhang, Y Wu, H Zhao, Z Li, S Zhang, X Zhou… - Proceedings of the …, 2020 - ojs.aaai.org
The latest work on language representations carefully integrates contextualized features into
language model training, which enables a series of successes, especially in various machine …

Cosmos QA: Machine reading comprehension with contextual commonsense reasoning

L Huang, RL Bras, C Bhagavatula, Y Choi - arXiv preprint arXiv …, 2019 - arxiv.org
Understanding narratives requires reading between the lines, which, in turn, requires
interpreting the likely causes and effects of events, even when they are not mentioned …

Retrospective reader for machine reading comprehension

Z Zhang, J Yang, H Zhao - Proceedings of the AAAI conference on …, 2021 - ojs.aaai.org
Machine reading comprehension (MRC) is an AI challenge that requires machines
to determine the correct answers to questions based on a given passage. MRC systems …

Meta-learning approaches for learning-to-learn in deep learning: A survey

Y Tian, X Zhao, W Huang - Neurocomputing, 2022 - Elsevier
Compared to traditional machine learning, deep learning can learn deeper abstract data
representations and understand scattered data properties. It has gained considerable …