[1] CHE Feihu, ZHANG Dawei, SHAO Pengpeng, et al. Script event prediction based on a quaternion-gated graph neural network[J]. CAAI Transactions on Intelligent Systems, 2023, 18(1): 138-143. [doi:10.11992/tis.202203042]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785 / CN 23-1538/TP]
Volume: 18
Issue: 2023, No. 1
Pages: 138-143
Column: Academic Papers: Natural Language Processing and Understanding
Publication date: 2023-01-05
- Title: Script event prediction based on a quaternion-gated graph neural network
- Author(s): CHE Feihu1,2; ZHANG Dawei1; SHAO Pengpeng1,2; YANG Guohua1; LIU Tong1; TAO Jianhua1,2,3
- Affiliation(s):
1. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China;
2. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China;
3. CAS Center for Excellence in Brain Science and Intelligence Technology, Beijing 100190, China
- Keywords: quaternion; gated graph neural network; event representation; script event prediction; attention mechanism; event evolutionary graph; graph convolution networks; event interaction
- CLC: TP183
- DOI: 10.11992/tis.202203042
- Abstract: Two types of information are essential for script event prediction: the correlations between events and the interactions within a single event. For the first, we use a gated graph neural network to model the correlations between events. For the interactions within an event, we represent each event as a quaternion and use the Hamilton product to capture the interactions among its four components. We propose learning event representations by combining quaternions with a gated graph neural network, thereby accounting for both the interactions in the external event graph and the dependencies within an event. After obtaining the event representations, we apply an attention mechanism to learn the relative weight of each context event, compute the weighted sum of the context event embeddings, and then calculate the Euclidean distance between this sum and each candidate event embedding. Finally, we choose the candidate event with the smallest distance as the predicted event. Experiments on the New York Times corpus, evaluated with the multiple-choice narrative cloze test, show that the proposed model outperforms existing state-of-the-art baseline models.
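The two core operations the abstract describes, the Hamilton product over four-component quaternion embeddings and the attention-weighted context sum scored by Euclidean distance, can be sketched as below. This is a minimal illustration, not the paper's implementation: the mapping of event parts to quaternion components and all function/argument names are assumptions.

```python
import numpy as np

def hamilton_product(q1, q2):
    """Hamilton product of two quaternion embeddings.

    q1, q2: arrays of shape (..., 4) holding (real, i, j, k) components.
    Every output component mixes all four input components, which is how
    a quaternion representation can capture interactions among the four
    parts of an event (the exact decomposition is an assumption here).
    """
    a1, b1, c1, d1 = np.moveaxis(q1, -1, 0)
    a2, b2, c2, d2 = np.moveaxis(q2, -1, 0)
    return np.stack([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,  # real part
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,  # i part
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,  # j part
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,  # k part
    ], axis=-1)

def score_candidate(context_embs, weights, candidate_emb):
    """Weighted sum of context event embeddings, then Euclidean distance
    to one candidate event embedding; the candidate with the smallest
    distance would be chosen as the prediction."""
    context_sum = (weights[:, None] * context_embs).sum(axis=0)
    return np.linalg.norm(context_sum - candidate_emb)
```

As a sanity check, the Hamilton product reproduces the quaternion identity i * j = k, and a candidate embedding equal to the weighted context sum gets distance zero.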