Multi-level attention-based neural networks for distant supervised relation extraction
Abstract
We propose a multi-level attention-based neural network for relation extraction based on the work of Lin et al. to alleviate the problem of wrong labelling in distant supervision. In this paper, we first adopt gated recurrent units to represent the semantic information. Then, we introduce a customized multi-level attention mechanism, which is expected to reduce the weights of noisy words and sentences. Experimental results on a real-world dataset show that our model achieves significant improvement on relation extraction tasks compared to both traditional feature-based models and existing neural network-based methods.
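The abstract's exact architecture is not specified here, but the two attention levels it describes — word-level attention over GRU hidden states to suppress noisy words, then sentence-level (selective) attention over sentence vectors to suppress noisy sentences in a bag, in the style of Lin et al. — can be sketched roughly as follows. All dimensions, the query vectors `q_w`/`q_r`, and the dot-product scoring are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

d = 4  # hypothetical GRU hidden size (the abstract does not give sizes)

def word_attention(H, q_w):
    """Word-level attention: weight GRU hidden states H (T x d) by their
    similarity to a query q_w (learned in practice), so that noisy words
    contribute less to the sentence vector."""
    alpha = softmax(H @ q_w)   # (T,) attention weights over words
    return alpha @ H           # (d,) sentence vector

def sentence_attention(S, q_r):
    """Sentence-level (selective) attention over sentence vectors S (n x d),
    down-weighting noisy sentences in the bag, as in Lin et al."""
    beta = softmax(S @ q_r)    # (n,) attention weights over sentences
    return beta @ S            # (d,) bag-level representation

rng = np.random.default_rng(0)
bag = [rng.normal(size=(T, d)) for T in (5, 7)]  # stand-ins for GRU outputs of 2 sentences
q_w = rng.normal(size=d)  # word-level query (a trained parameter in the real model)
q_r = rng.normal(size=d)  # relation query (a trained parameter in the real model)

S = np.stack([word_attention(H, q_w) for H in bag])  # (2, d)
bag_vec = sentence_attention(S, q_r)                  # (d,) fed to the relation classifier
print(bag_vec.shape)
```

In the trained model the bag vector would then be scored against each relation class; the sketch only shows how the two softmax-weighted sums compose.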
researchrepository.ucd.ie