Semantics-aware BERT for language understanding
The latest work on language representations carefully integrates contextualized features into
language model training, which has enabled a series of successes, especially in various machine …
Simple BERT models for relation extraction and semantic role labeling
We present simple BERT-based models for relation extraction and semantic role labeling. In
recent years, state-of-the-art performance has been achieved using neural models by …
Natural language processing advancements by deep learning: A survey
Natural Language Processing (NLP) helps empower intelligent machines by enabling a better
understanding of human language for linguistic-based human-computer …
Retrospective reader for machine reading comprehension
Machine reading comprehension (MRC) is an AI challenge that requires machines
to determine the correct answers to questions based on a given passage. MRC systems …
DeepStruct: Pretraining of language models for structure prediction
We introduce a method for improving the structural understanding abilities of language
models. Unlike previous approaches that finetune the models with task-specific …
Encoder-decoder based unified semantic role labeling with label-aware syntax
Currently, unified semantic role labeling (SRL), which achieves predicate identification and
argument role labeling in an end-to-end manner, has received growing interest. Recent …
Fusing heterogeneous factors with triaffine mechanism for nested named entity recognition
Nested entities are observed in many domains due to their compositionality, which cannot
be easily recognized by the widely-used sequence labeling framework. A natural solution is …
LIMIT-BERT: Linguistic informed multi-task BERT
In this paper, we present a Linguistic Informed Multi-Task BERT (LIMIT-BERT) for learning
language representations across multiple linguistic tasks by Multi-Task Learning (MTL) …
Filling the gap of utterance-aware and speaker-aware representation for multi-turn dialogue
A multi-turn dialogue is composed of multiple utterances from two or more different speaker
roles. Thus, utterance- and speaker-aware clues are supposed to be well captured in models …
Span model for open information extraction on accurate corpus
J Zhan, H Zhao - Proceedings of the AAAI Conference on Artificial …, 2020 - aaai.org
Open Information Extraction (Open IE) is a challenging task, especially due to its
brittle data basis. Most Open IE systems have to be trained on an automatically built corpus …