[1]CHE Feihu,ZHANG Dawei,SHAO Pengpeng,et al.Script event prediction based on a quaternion-gated graph neural network[J].CAAI Transactions on Intelligent Systems,2023,18(1):138-143.[doi:10.11992/tis.202203042]

Script event prediction based on a quaternion-gated graph neural network

References:
[1] LI Zhongyang, DING Xiao, LIU Ting. Constructing narrative event evolutionary graph for script event prediction[EB/OL]. (2018-05-14)[2022-03-23]. https://arxiv.org/abs/1805.05081.
[2] GEERAERTS D. Cognitive Linguistics: Basic Readings[M]. Berlin: De Gruyter Mouton, 2008: 373-400.
[3] RUMELHART D E. Notes on a schema for stories[M]//Representation and Understanding. Amsterdam: Elsevier, 1975: 211-236.
[4] WANG Zhongqing, ZHANG Yue, CHANG C Y. Integrating order information and event relation for script event prediction[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen: Association for Computational Linguistics, 2017: 57–67.
[5] MIZOGUCHI T, YAMADA I. Hypercomplex tensor completion with Cayley-Dickson singular value decomposition[C]//2018 IEEE International Conference on Acoustics, Speech and Signal Processing. Calgary: IEEE, 2018: 3979-3983.
[6] XIANG Min, KANNA S, MANDIC D P. Performance analysis of quaternion-valued adaptive filters in nonstationary environments[J]. IEEE transactions on signal processing, 2018, 66(6): 1566–1579.
[7] ORTOLANI F, COMMINIELLO D, SCARPINITI M, et al. Frequency domain quaternion adaptive filters: Algorithms and convergence performance[J]. Signal processing, 2017, 136: 69–80.
[8] PARCOLLET T, MORCHID M, BOUSQUET P M, et al. Quaternion neural networks for spoken language understanding[C]//2016 IEEE Spoken Language Technology Workshop. San Diego: IEEE, 2016: 362-368.
[9] PARCOLLET T, RAVANELLI M, MORCHID M, et al. Quaternion recurrent neural networks[EB/OL]. (2018-06-12)[2022-03-23]. https://arxiv.org/abs/1806.04418.
[10] GAUDET C J, MAIDA A S. Deep quaternion networks[C]//2018 International Joint Conference on Neural Networks. Rio de Janeiro: IEEE, 2018: 1-8.
[11] PEROZZI B, AL-RFOU R, SKIENA S. DeepWalk: online learning of social representations[C]// Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining. New York: Association for Computing Machinery, 2014: 701–710.
[12] KIPF T, WELLING M. Semi-supervised classification with graph convolutional networks[EB/OL]. (2016-09-09)[2022-03-23]. https://arxiv.org/abs/1609.02907.
[13] WANG Tingwu, ZHOU Yuhao, FIDLER S, et al. Neural graph evolution: towards efficient automatic robot design[EB/OL]. (2019-06-12)[2022-03-23]. https://arxiv.org/abs/1906.05370.
[14] LI Yujia, TARLOW D, BROCKSCHMIDT M, et al. Gated graph sequence neural networks[EB/OL]. (2015-11-17)[2022-03-23]. https://arxiv.org/abs/1511.05493.
[15] CHAMBERS N, JURAFSKY D. Unsupervised learning of narrative event chains[C]//Proceedings of ACL-08: HLT. Columbus: Association for Computational Linguistics, 2008: 789-797.
[16] XU Dongpo, ZHANG Lina, ZHANG Huisheng. Learning algorithms in quaternion neural networks using GHR calculus[J]. Neural network world, 2017, 27(3): 271-282.
[17] CHO K, VAN MERRIENBOER B, BAHDANAU D, et al. On the properties of neural machine translation: encoder-decoder approaches[EB/OL]. (2014-09-03)[2022-03-23]. https://arxiv.org/abs/1409.1259.
[18] ZHOU Bo, CHEN Yubo, LIU Kang, et al. Multi-task self-supervised learning for script event prediction[C]//CIKM’21: Proceedings of the 30th ACM International Conference on Information & Knowledge Management. New York: ACM, 2021: 3662-3666.
[19] SCHLICHTKRULL M, KIPF T N, BLOEM P, et al. Modeling relational data with graph convolutional networks[M]//The Semantic Web. Cham: Springer International Publishing, 2018: 593-607.
[20] CHO K, VAN MERRIENBOER B, GULCEHRE C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[EB/OL]. (2014-06-03)[2022-03-23]. https://arxiv.org/abs/1406.1078.
[21] BAHDANAU D, CHO K, BENGIO Y. Neural machine translation by jointly learning to align and translate[EB/OL]. (2014-09-01)[2022-03-23]. https://arxiv.org/abs/1409.0473.
[22] CURRAN J R, CLARK S, BOS J. Linguistically motivated large-scale NLP with C&C and boxer[C]//ACL’07: Proceedings of the 45th Annual Meeting of the ACL on Interactive Poster and Demonstration Sessions. New York: ACM, 2007: 33-36.
[23] JANS B, BETHARD S, VULIĆ I, et al. Skip n-grams and ranking functions for predicting script events[C]//EACL’12: Proceedings of the 13th Conference of the European Chapter of the Association for Computational Linguistics. New York: ACM, 2012: 336-344.
[24] MIKOLOV T, SUTSKEVER I, CHEN Kai, et al. Distributed representations of words and phrases and their compositionality[EB/OL]. (2013-10-16)[2022-03-23]. https://arxiv.org/abs/1310.4546.
[25] GRANROTH-WILDING M, CLARK S. What happens next? event prediction using a compositional neural network model[C]// Proceedings of the 30th AAAI Conference on Artificial Intelligence and the 28th Innovative Applications of Artificial Intelligence Conference. Phoenix: AAAI Press, 2016, 30(1).

Copyright © CAAI Transactions on Intelligent Systems