[1] ZHANG Yong, GAO Dalin, GONG Dunwei, et al. Attention graph long short-term memory neural network for relation extraction[J]. CAAI Transactions on Intelligent Systems, 2021, 16(3): 518-527. [doi:10.11992/tis.202008036]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785 / CN 23-1538/TP]
Volume: 16
Issue: 2021(3)
Pages: 518-527
Column: Academic Papers - Knowledge Engineering
Publication date: 2021-05-05
Title: Attention graph long short-term memory neural network for relation extraction
Author(s): ZHANG Yong; GAO Dalin; GONG Dunwei; TAO Yifan
Affiliation: School of Information and Control Engineering, China University of Mining and Technology, Xuzhou 221116, China
Keywords: relation extraction; sentence structure tree; syntactic diagram; graph neural network; AGLSTM; soft pruning strategy; attention mechanism; LSTM
CLC: TP311
DOI: 10.11992/tis.202008036
Abstract:
Relation extraction is a key technology in information acquisition. Sentence structure trees, which can capture long-distance dependencies between words, have been widely used in relation extraction tasks. However, existing methods rely too heavily on the information in the sentence structure tree and ignore external information. This paper proposes a new graph neural network architecture, the attention graph long short-term memory neural network (AGLSTM). The model adopts a soft pruning strategy to automatically learn the sentence structure information that is useful for the relation extraction task; it then introduces an attention mechanism and combines it with the syntactic graph information to learn the structural features of the sentence; finally, a new type of graph long short-term memory neural network is designed to better fuse the syntactic graph information with the temporal information of the sentence. Experiments comparing the model against 10 typical relation extraction methods verify its excellent performance.
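The soft pruning strategy mentioned in the abstract can be illustrated with a minimal sketch (the function name, shapes, and scoring scheme below are illustrative assumptions, not the paper's actual implementation): instead of hard-pruning the dependency tree to a fixed subtree, every word pair is scored with scaled dot-product attention, yielding a fully connected, softly weighted adjacency matrix from which the model can learn which syntactic edges matter.

```python
import math
import random

def soft_pruned_adjacency(H):
    """Hypothetical sketch of attention-based soft pruning.

    H: list of per-word hidden-state vectors (n words, d dims each).
    Returns an n x n row-stochastic matrix: row i is a softmax
    distribution of attention weights from word i to every word j,
    replacing a hard-pruned dependency tree with soft edge weights.
    """
    n, d = len(H), len(H[0])
    A = []
    for i in range(n):
        # scaled dot-product attention logits between word i and each word j
        logits = [sum(H[i][k] * H[j][k] for k in range(d)) / math.sqrt(d)
                  for j in range(n)]
        m = max(logits)                          # subtract max for stability
        exps = [math.exp(x - m) for x in logits]
        s = sum(exps)
        A.append([e / s for e in exps])          # row-wise softmax
    return A

# Toy usage: 4 words with 8-dimensional hidden states.
random.seed(0)
H = [[random.gauss(0, 1) for _ in range(8)] for _ in range(4)]
A = soft_pruned_adjacency(H)
```

In a full model, such a soft adjacency matrix would weight message passing in the graph LSTM layers, so no dependency edge is discarded outright; the attention weights decide how much each edge contributes.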