An Effective Gated Recurrent Unit Network Model for Chinese Relation Extraction
Abstract
Relation extraction is an important task in information extraction and is increasingly used to discover relations from text. However, it is difficult to combine contextual information for relation extraction, and important cues may appear multiple times in a single sentence. To address these problems, this paper presents a Gated Recurrent Unit network with attention (CHGRU) that captures the important information in a sentence. In this model, gated recurrent unit networks embed sentence semantics; word-level attention then highlights the informative words that carry accurate relation patterns. On real-world datasets, our experimental results show substantial improvements in relation extraction compared with other methods.
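The abstract gives only a high-level description of the architecture (a GRU encoder followed by word-level attention and relation classification), so the following is a minimal sketch of that pipeline in PyTorch, not the authors' implementation; the class name GRUAttentionRE and all hyperparameters (embed_dim, hidden_dim, num_relations) are hypothetical choices for illustration.

import torch
import torch.nn as nn

class GRUAttentionRE(nn.Module):
    """Sketch of a GRU + word-level attention relation classifier.

    All dimensions are assumptions; the paper does not specify them here.
    """

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_relations=10):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Bidirectional GRU embeds sentence semantics.
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Word-level attention: a learned query scores each hidden state.
        self.attn_query = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        h, _ = self.gru(self.embedding(token_ids))  # (batch, seq_len, 2*hidden)
        scores = self.attn_query(torch.tanh(h))     # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)      # attention weights over words
        sentence = (weights * h).sum(dim=1)         # weighted sum -> sentence vector
        return self.classifier(sentence)            # relation logits

# Usage: score a dummy batch of two padded sentences of length 20.
model = GRUAttentionRE(vocab_size=5000)
logits = model(torch.randint(1, 5000, (2, 20)))
print(logits.shape)  # torch.Size([2, 10])

The attention step is what lets repeated occurrences of an important word each contribute to the sentence representation, which matches the motivation stated in the abstract.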
Keywords
Relation extraction, Attention, Gated recurrent unit
DOI
10.12783/dtcse/wcne2017/19833