Multi-level Attention-Based Neural Networks for Distant Supervised Relation Extraction

Authors: 

Linyi Yang, Tin Lok James Ng, Catherine Mooney, Ruihai Dong

Publication Type: 
Refereed Conference Meeting Proceeding
Abstract: 
We propose a multi-level attention-based neural network for relation extraction, building on the work of Lin et al., to alleviate the problem of wrong labelling in distant supervision. In this paper, we first adopt gated recurrent units to represent the semantic information. Then, we introduce a customized multi-level attention mechanism, which is expected to reduce the weights of noisy words and sentences. Experimental results on a real-world dataset show that our model achieves significant improvements on relation extraction tasks compared to both traditional feature-based models and existing neural network-based methods.
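The two-level attention described in the abstract can be sketched in miniature: word-level attention pools the hidden states of each sentence (e.g. GRU outputs) into a sentence vector, and sentence-level attention pools the sentence vectors of a bag into a single representation, down-weighting noisy words and sentences. The query vectors, dimensions, and dot-product scoring below are illustrative assumptions, not the authors' exact model.

```python
import math

def softmax(scores):
    # numerically stable softmax over a list of raw scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(vectors, query):
    # score each vector against the query by dot product,
    # then return the attention-weighted sum and the weights
    scores = [sum(v_i * q_i for v_i, q_i in zip(v, query)) for v in vectors]
    weights = softmax(scores)
    dim = len(vectors[0])
    pooled = [sum(w * v[d] for w, v in zip(weights, vectors)) for d in range(dim)]
    return pooled, weights

# Hypothetical learned query vectors (one per attention level).
word_query = [1.0, 0.0]
sent_query = [0.0, 1.0]

# A bag of two sentences, each a list of 2-d word hidden states
# (standing in for GRU outputs).
bag = [
    [[0.9, 0.1], [0.2, 0.8]],
    [[0.5, 0.5], [0.1, 0.9]],
]

# Level 1: word-level attention produces one vector per sentence.
sentence_vecs = [attend(words, word_query)[0] for words in bag]
# Level 2: sentence-level attention produces the bag representation.
bag_vec, sent_weights = attend(sentence_vecs, sent_query)
```

In a trained model the queries would be learned (often conditioned on the target relation), so sentences that poorly express the relation receive small weights and contribute little to `bag_vec`.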
Conference Name: 
Irish Conference on Artificial Intelligence and Cognitive Science, Dublin, December 7-8, 2017
Digital Object Identifier (DOI): 
10.XXX
Publication Date: 
08/12/2017
Conference Location: 
Ireland
Research Group: 
Institution: 
National University of Ireland, Dublin (UCD)
Open access repository: 
No