[1]李云洁,王丹阳,刘海涛,等.图推理嵌入动态自注意力网络的文档级关系抽取[J].智能系统学报,2025,20(1):52-63.[doi:10.11992/tis.202311021]
 LI Yunjie,WANG Danyang,LIU Haitao,et al.Document-level relation extraction of a graph reasoning embedded dynamic self-attention network[J].CAAI Transactions on Intelligent Systems,2025,20(1):52-63.[doi:10.11992/tis.202311021]

Document-level relation extraction of a graph reasoning embedded dynamic self-attention network

参考文献/References:
[1] YU Mo, YIN Wenpeng, HASAN K S, et al. Improved neural relation detection for knowledge base question answering[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Vancouver: Association for Computational Linguistics, 2017: 571-581.
[2] DAS R, MUNKHDALAI T, YUAN Xingdi, et al. Building dynamic knowledge graphs from text using machine reading comprehension[EB/OL]. (2018-10-12) [2023-10-01]. http://arxiv.org/abs/1810.05682v1.
[3] QIN Pengda, XU Weiran, WANG W Y. Robust distant supervision relation extraction via deep reinforcement learning[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Melbourne: Association for Computational Linguistics, 2018: 2137-2147.
[4] ZENG Daojian, LIU Kang, LAI Siwei, et al. Relation classification via convolutional deep neural network[C]//Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics. Dublin: ACL, 2014: 2335-2344.
[5] ZENG Xiangrong, HE Shizhu, LIU Kang, et al. Large scaled relation extraction with reinforcement learning[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Palo Alto: AAAI, 2018: 5658-5665.
[6] ZHOU Wenxuan, CHEN Muhao. An improved baseline for sentence-level relation extraction[C]//Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing. Stroudsburg: ACL, 2022: 161-168.
[7] HAN Xu, LIU Zhiyuan, SUN Maosong. Neural knowledge acquisition via mutual attention between knowledge graph and text[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Palo Alto: AAAI, 2018: 4832-4839.
[8] YAO Yuan, YE Deming, LI Peng, et al. DocRED: a large-scale document-level relation extraction dataset[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence: Association for Computational Linguistics, 2019: 764-777.
[9] SHEN Tao, ZHOU Tianyi, LONG Guodong, et al. DiSAN: directional self-attention network for RNN/CNN-free language understanding[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Palo Alto: AAAI, 2018: 5446-5455.
[10] VERGA P, STRUBELL E, MCCALLUM A. Simultaneously self-attending to all mentions for full-abstract biological relation extraction[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. New Orleans: Association for Computational Linguistics, 2018: 872-884.
[11] SAHU S K, CHRISTOPOULOU F, MIWA M, et al. Inter-sentence relation extraction with document-level graph convolutional neural network[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence: Association for Computational Linguistics, 2019: 4309-4316.
[12] XU Wang, CHEN Kehai, MOU Lili, et al. Document-level relation extraction with sentences importance estimation and focusing[C]//Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg: Association for Computational Linguistics, 2022: 2920-2929.
[13] 吴婷, 孔芳. 基于图注意力卷积神经网络的文档级关系抽取[J]. 中文信息学报, 2021, 35(10): 73-80.
WU Ting, KONG Fang. Document-level relation extraction based on graph attention convolutional neural network[J]. Journal of Chinese information processing, 2021, 35(10): 73-80.
[14] PENG Nanyun, POON H, QUIRK C, et al. Cross-sentence N-ary relation extraction with graph LSTMs[J]. Transactions of the association for computational linguistics, 2017, 5: 101-115.
[15] QUIRK C, POON H. Distant supervision for relation extraction beyond the sentence boundary[C]//Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers. Stroudsburg: Association for Computational Linguistics, 2017: 1171-1182.
[16] GUPTA P, RAJARAM S, SCHÜTZE H, et al. Neural relation extraction within and across sentence boundaries[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Palo Alto: AAAI, 2019: 6513-6520.
[17] CHRISTOPOULOU F, MIWA M, ANANIADOU S. Connecting the dots: document-level neural relation extraction with edge-oriented graphs[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2019: 4925-4936.
[18] ZHUANG Yimeng, WANG Huadong. Token-level dynamic self-attention network for multi-passage reading comprehension[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2019: 2252-2262.
[19] SONG Linfeng, WANG Zhiguo, YU Mo, et al. Exploring graph-structured passage representation for multi-hop reading comprehension with graph neural networks[EB/OL]. (2018-09-06)[2023-10-01]. http://arxiv.org/abs/1809.02040v1.
[20] TANG Hengzhu, CAO Yanan, ZHANG Zhenyu, et al. HIN: hierarchical inference network for document-level relation extraction[M]//Lecture Notes in Computer Science. Cham: Springer International Publishing, 2020: 197-209.
[21] LIU Hongfei, KANG Zhao, ZHANG Lizong, et al. Document-level relation extraction with cross-sentence reasoning graph[M]//Lecture Notes in Computer Science. Cham: Springer Nature Switzerland, 2023: 316-328.
[22] NAN Guoshun, GUO Zhijiang, SEKULIC I, et al. Reasoning with latent structure refinement for document-level relation extraction[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2020: 1546-1557.
[23] ZENG Shuang, XU Runxin, CHANG Baobao, et al. Double graph based reasoning for document-level relation extraction[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2020: 1630-1640.
[24] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Cambridge: MIT Press, 2017: 6000-6010.
[25] 李祥宇, 隋璘, 熊伟丽. 基于自注意力机制与卷积 ONLSTM 网络的软测量算法[J]. 智能系统学报, 2023, 18(5): 957-965.
LI Xiangyu, SUI Lin, XIONG Weili. Soft sensor algorithm based on self-attention mechanism and convolutional ONLSTM network[J]. CAAI transactions on intelligent systems, 2023, 18(5): 957-965.
[26] DEVLIN J, CHANG Mingwei, LEE K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of NAACL-HLT. Stroudsburg: ACL, 2019: 4171-4186.
[27] HU Minghao, PENG Yuxing, HUANG Zhen, et al. Reinforced mnemonic reader for machine reading comprehension[C]//Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence. Stockholm: International Joint Conferences on Artificial Intelligence Organization, 2018: 4099-4106.
[28] SHAW P, USZKOREIT J, VASWANI A. Self-attention with relative position representations[EB/OL]. (2018-03-06)[2023-12-01]. http://arxiv.org/abs/1803.02155v2.
[29] YANG Baosong, WANG Longyue, WONG D, et al. Convolutional self-attention network[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg: ACL, 2019: 4040-4045.
[30] TAY Y, DEHGHANI M, BAHRI D, et al. Efficient transformers: a survey[J]. ACM computing surveys, 2023, 55(6): 1-28.
[31] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[EB/OL]. (2017-10-30)[2023-10-01]. http://arxiv.org/abs/1710.10903v3.
[32] WANG Difeng, HU Wei, CAO Ermei, et al. Global-to-local neural networks for document-level relation extraction[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2020: 3711-3721.
[33] LI Jiao, SUN Yueping, JOHNSON R J, et al. BioCreative V CDR task corpus: a resource for chemical disease relation extraction[J]. Database, 2016, 2016: baw068.
[34] LOSHCHILOV I, HUTTER F. Decoupled weight decay regularization[EB/OL]. (2017-11-14)[2023-10-01]. http://arxiv.org/abs/1711.05101v3.
[35] LEE J, YOON W, KIM S, et al. BioBERT: a pre-trained biomedical language representation model for biomedical text mining[J]. Bioinformatics, 2020, 36(4): 1234-1240.
[36] ZHANG Yuhao, QI Peng, MANNING C D. Graph convolution over pruned dependency trees improves relation extraction[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2018: 2205-2215.
[37] ZHOU Huiwei, DENG Huijie, CHEN Long, et al. Exploiting syntactic and semantics information for chemical-disease relation extraction[J]. Database, 2016, 2016: baw048.
[38] GU Jinghang, SUN Fuqing, QIAN Longhua, et al. Chemical-induced disease relation extraction via convolutional neural network[J]. Database, 2017, 2017: bax024.
[39] PANYAM N C, VERSPOOR K, COHN T, et al. Exploiting graph kernels for high performance biomedical relation extraction[J]. Journal of biomedical semantics, 2018, 9(1): 7.
[40] ZHENG Wei, LIN Hongfei, LI Zhiheng, et al. An effective neural model extracting document level chemical-induced disease relations from biomedical literature[J]. Journal of biomedical informatics, 2018, 83: 1-9.
[41] LI Jing, WANG Yequan, ZHANG Shuai, et al. Rethinking document-level relation extraction: a reality check[C]//Findings of the Association for Computational Linguistics: ACL 2023. Stroudsburg: Association for Computational Linguistics, 2023: 5715-5730.
[42] GUO Zhijiang, ZHANG Yan, LU Wei. Attention guided graph convolutional networks for relation extraction[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2019: 241-251.
[43] 陈容珊, 高淑萍, 齐小刚. 注意力机制和图卷积神经网络引导的谱聚类方法[J]. 智能系统学报, 2023, 18(5): 936-944.
CHEN Rongshan, GAO Shuping, QI Xiaogang. A spectral clustering method based on GCNs and attention mechanism[J]. CAAI transactions on intelligent systems, 2023, 18(5): 936-944.
[44] 朱金霞, 孟祥福, 邢长征, 等. 融合图卷积注意力机制的协同过滤推荐方法[J]. 智能系统学报, 2023, 18(6): 1295-1304.
ZHU Jinxia, MENG Xiangfu, XING Changzheng, et al. Collaborative filter recommendation approach fused with graph convolutional attention mechanism[J]. CAAI transactions on intelligent systems, 2023, 18(6): 1295-1304.
[45] 赵鹏武, 李志义, 林小琦. 基于注意力机制和卷积神经网络的中文人物关系抽取与识别[J]. 数据分析与知识发现, 2022, 6(8): 41-51.
ZHAO Pengwu, LI Zhiyi, LIN Xiaoqi. Chinese character relation extraction and recognition based on attention mechanism and convolution neural network[J]. Data analysis and knowledge discovery, 2022, 6(8): 41-51.
[46] 张鲁, 段友祥, 刘娟, 等. 基于RoBERTa和加权图卷积网络的中文地质实体关系抽取[J]. 计算机科学, 2024, 51(8): 297-303.
ZHANG Lu, DUAN Youxiang, LIU Juan, et al. Chinese geological entity relation extraction based on RoBERTa and weighted GCNs[J]. Computer science, 2024, 51(8): 297-303.
[47] 袁泉, 陈昌平, 陈泽, 等. 基于BERT的两次注意力机制远程监督关系抽取[J]. 计算机应用, 2024, 44(4): 1080-1085.
YUAN Quan, CHEN Changping, CHEN Ze, et al. Twice attention mechanism distantly supervised relation extraction based on BERT[J]. Journal of computer applications, 2024, 44(4): 1080-1085.
[48] GIORGI J, BADER G D, WANG Bo. A sequence-to-sequence approach for document-level relation extraction[EB/OL]. (2022-04-03)[2023-10-01]. http://arxiv.org/abs/2204.01098.
[49] 文坤建, 陈艳平, 黄瑞章, 等. 基于提示学习的生物医学关系抽取方法[J]. 计算机科学, 2023, 50(10): 223-229.
WEN Jiankun, CHEN Yanping, HUANG Ruizhang, et al. Biomedical relationship extraction method based on prompt learning[J]. Computer science, 2023, 50(10): 223-229.
[50] 赵晋斌, 王琦, 马黎雨, 等. 基于知识图谱的远程监督关系抽取降噪方法[J]. 火力与指挥控制, 2023, 48(10): 160-169.
ZHAO Jinbin, WANG Qi, MA Liyu, et al. A noise reduction method for distant supervision relation extraction based on knowledge graph[J]. Fire control and command control, 2023, 48(10): 160-169.
[51] 曾碧卿, 李砚龙, 蔡剑. 基于外部知识增强的远程监督关系抽取模型[J]. 计算机系统应用, 2023, 32(5): 253-261.
ZENG Biqing, LI Yanlong, CAI Jian. Distantly-supervised relation extraction model via external knowledge enhancement[J]. Computer systems and applications, 2023, 32(5): 253-261.
[52] WANG Hong, FOCKE C, SYLVESTER R, et al. Fine-tune BERT for DocRED with two-step process[EB/OL]. (2019-09-26)[2023-10-01]. http://arxiv.org/abs/1909.11898v1.
[53] YE Deming, LIN Yankai, DU Jiaju, et al. Coreferential reasoning learning for language representation[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2020: 7170-7186.
[54] YU Jiaxin, YANG Deqing, TIAN Shuyu. Relation-specific attentions over entity mentions for enhanced document-level relation extraction[EB/OL]. (2022-05-28)[2023-10-01]. http://arxiv.org/abs/2205.14393v1.

备注/Memo

Received: 2023-11-17.
Funding: National Natural Science Foundation of China (61350003); Key Research Project of the Basic Scientific Research Program of Higher Education Institutions of the Department of Education of Liaoning Province (LJKZZ20220047); Basic Research Fund for Central Public-interest Scientific Institutions (1630072023005).
About the authors: LI Yunjie, master's student; her research interests include natural language processing and machine learning. E-mail: 2621991259@qq.com. WANG Danyang, research assistant; her research interest is intelligent information processing. E-mail: danyang.wang@catas.cn. LIU Haitao, associate professor, Ph.D.; his research interests include natural language processing, machine learning, and factor space theory. He is a director of the Fuzzy Information and Engineering Branch of the Operations Research Society of China, and has received one municipal first prize for scientific and technological progress, as well as one special prize and one first prize for municipal natural science academic achievements. He has published more than 30 academic papers as first author; his monograph Factor Space and Artificial Intelligence was supported by the National Publication Foundation and selected as a major publishing project of the "13th Five-Year" national key book publishing plan. E-mail: haitao641@163.com.
Corresponding author: LIU Haitao. E-mail: haitao641@163.com.

更新日期/Last Update: 2025-01-05