Semantics-aware BERT for language understanding

Z Zhang, Y Wu, H Zhao, Z Li, S Zhang, X Zhou… - Proceedings of the …, 2020 - ojs.aaai.org
The latest work on language representations carefully integrates contextualized features into
language model training, which has enabled a series of successes, especially in various machine …

Simple BERT models for relation extraction and semantic role labeling

P Shi, J Lin - arXiv preprint arXiv:1904.05255, 2019 - arxiv.org
We present simple BERT-based models for relation extraction and semantic role labeling. In
recent years, state-of-the-art performance has been achieved using neural models by …

Natural language processing advancements by deep learning: A survey

A Torfi, RA Shirvani, Y Keneshloo, N Tavaf… - arXiv preprint arXiv …, 2020 - arxiv.org
Natural Language Processing (NLP) helps empower intelligent machines by enabling a
better understanding of human language for linguistic-based human-computer …

Retrospective reader for machine reading comprehension

Z Zhang, J Yang, H Zhao - Proceedings of the AAAI conference on …, 2021 - ojs.aaai.org
Machine reading comprehension (MRC) is an AI challenge that requires machines
to determine the correct answers to questions based on a given passage. MRC systems …

DeepStruct: Pretraining of language models for structure prediction

C Wang, X Liu, Z Chen, H Hong, J Tang… - arXiv preprint arXiv …, 2022 - arxiv.org
We introduce a method for improving the structural understanding abilities of language
models. Unlike previous approaches that finetune the models with task-specific …

Encoder-decoder based unified semantic role labeling with label-aware syntax

H Fei, F Li, B Li, D Ji - Proceedings of the AAAI conference on artificial …, 2021 - ojs.aaai.org
Currently, unified semantic role labeling (SRL), which achieves predicate identification and
argument role labeling in an end-to-end manner, has received growing interest. Recent …

Fusing heterogeneous factors with triaffine mechanism for nested named entity recognition

Z Yuan, C Tan, S Huang, F Huang - arXiv preprint arXiv:2110.07480, 2021 - arxiv.org
Nested entities are observed in many domains due to their compositionality, and they cannot
be easily recognized by the widely used sequence labeling framework. A natural solution is …

LIMIT-BERT: Linguistic informed multi-task BERT

J Zhou, Z Zhang, H Zhao, S Zhang - arXiv preprint arXiv:1910.14296, 2019 - arxiv.org
In this paper, we present a Linguistic Informed Multi-Task BERT (LIMIT-BERT) for learning
language representations across multiple linguistic tasks by Multi-Task Learning (MTL) …

Filling the gap of utterance-aware and speaker-aware representation for multi-turn dialogue

L Liu, Z Zhang, H Zhao, X Zhou, X Zhou - Proceedings of the AAAI …, 2021 - ojs.aaai.org
A multi-turn dialogue is composed of multiple utterances from two or more different speaker
roles. Thus, utterance- and speaker-aware clues are supposed to be well captured in models …

Span model for open information extraction on accurate corpus

J Zhan, H Zhao - Proceedings of the AAAI Conference on Artificial …, 2020 - aaai.org
Open Information Extraction (Open IE) is a challenging task, especially due to its
brittle data basis. Most Open IE systems have to be trained on an automatically built corpus …