[1]郭一楠,王斌,巩敦卫,等.实体结构与语义融合的多层注意力知识表示学习[J].智能系统学报,2023,18(3):577-588.[doi:10.11992/tis.202204026]
GUO Yinan,WANG Bin,GONG Dunwei,et al.Multi-layer attention knowledge representation learning by integrating entity structure with semantics[J].CAAI Transactions on Intelligent Systems,2023,18(3):577-588.[doi:10.11992/tis.202204026]
CAAI Transactions on Intelligent Systems (《智能系统学报》) [ISSN 1673-4785 / CN 23-1538/TP]
Volume: 18
Issue: 2023, No. 3
Pages: 577-588
Section: Academic papers - Knowledge engineering
Publication date: 2023-07-05
- Title:
Multi-layer attention knowledge representation learning by integrating entity structure with semantics
- Author(s):
GUO Yinan (郭一楠)1,2, WANG Bin (王斌)1, GONG Dunwei (巩敦卫)1, YU Zekuan (于泽宽)3
1. School of Information and Control Engineering, China University of Mining and Technology, Xuzhou 221116, China;
2. School of Mechatronics and Information Engineering, China University of Mining and Technology (Beijing), Beijing 100083, China;
3. Institute of Engineering and Applied Technology, Fudan University, Shanghai 200433, China
- Keywords:
knowledge representation learning; entity structure embedding; semantic information; attention mechanism; knowledge graph; knowledge reasoning; complex entity description; Transformer
- CLC number:
TP305
- DOI:
10.11992/tis.202204026
- Abstract:
Knowledge representation learning on a knowledge graph can obtain structure and relation embeddings of entities, but it makes little use of the semantic information contained in entity description texts. Moreover, as the knowledge graph grows, the categories and numbers of entities and relations, as well as the content and sources of entity descriptions, increase accordingly, and the correspondence between entity descriptions and triple structure information becomes even harder to obtain. To address this, this paper proposes a multi-layer attention knowledge representation learning method that integrates entity structure with semantics. A multi-layer attention mechanism is constructed in which the structure embeddings of entities enhance the semantic expression of the entity descriptions; a Transformer model then extracts the semantic relation from the descriptions, which is enhanced and integrated using the structure embedding of the relation; finally, the integrated semantic relation is used to enrich and consolidate the set of relation embeddings. In particular, a loss function is designed for the multi-layer attention mechanism that fuses entity structure with semantics. Experimental results show that the proposed method can effectively infer hidden links between entities with complex entity descriptions and achieves higher classification accuracy than comparable methods on triple classification tasks.
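As a rough illustration of the pipeline summarized in the abstract, the following is a minimal PyTorch-style sketch, not the authors' implementation: the entity's structure embedding attends over the description tokens, a Transformer encoder extracts a semantic relation, the relation's structure embedding attends over that semantic relation, and a TransE-style margin loss stands in for the paper's structure-semantics loss. All module names, dimensions, pooling choices, and the loss form are assumptions made for illustration only.

```python
import torch
import torch.nn as nn


class StructureSemanticsFusion(nn.Module):
    """Illustrative sketch: multi-layer attention fusing entity structure
    embeddings with the semantics of entity-description text (assumed design)."""

    def __init__(self, dim=128, n_heads=4, n_layers=2, vocab_size=10000):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, dim)  # tokens of the entity description
        # Attention layer 1: the entity's structure embedding (query) attends over
        # the description tokens, giving a structure-enhanced semantic summary.
        self.struct_attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        # Transformer encoder extracts the semantic relation carried by the
        # structure-enhanced description.
        enc_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=n_heads, dim_feedforward=4 * dim, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Attention layer 2: the relation's structure embedding (query) enhances
        # and integrates the extracted semantic relation.
        self.rel_attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)

    def forward(self, desc_tokens, ent_struct, rel_struct):
        # desc_tokens: (B, L) token ids of the entity description
        # ent_struct, rel_struct: (B, dim) structure embeddings of entity / relation
        tokens = self.token_emb(desc_tokens)                      # (B, L, dim)
        q = ent_struct.unsqueeze(1)                               # (B, 1, dim)
        enhanced, _ = self.struct_attn(q, tokens, tokens)         # (B, 1, dim)
        sem = self.encoder(torch.cat([enhanced, tokens], dim=1))  # (B, L+1, dim)
        sem_rel = sem[:, :1, :]             # CLS-style pooled semantic relation (assumption)
        fused, _ = self.rel_attn(rel_struct.unsqueeze(1), sem_rel, sem_rel)
        return fused.squeeze(1)             # semantics-integrated relation, (B, dim)


# Toy usage with a TransE-style margin loss as a stand-in for the paper's loss:
# true triples (h, r, t) should score lower than corrupted triples (h, r, t').
model = StructureSemanticsFusion()
desc = torch.randint(0, 10000, (2, 16))              # two dummy descriptions
h, r, t = (torch.randn(2, 128) for _ in range(3))    # dummy structure embeddings
rel = model(desc, h, r)                              # semantics-integrated relation
pos = (h + rel - t).norm(dim=1)                      # scores of true triples
neg = (h + rel - t.roll(1, dims=0)).norm(dim=1)      # scores of corrupted triples
loss = torch.relu(1.0 + pos - neg).mean()
loss.backward()
```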
Memo
Received: 2022-04-16.
Funding: National Natural Science Foundation of China (61973305, 52121003); project funded by Hengjiu (Xuzhou) Intelligent Technology Co., Ltd. (2021360001).
About the authors: GUO Yinan, professor; main research interests: intelligent data perception and analysis, and swarm intelligence optimization and control. Principal investigator of 3 NSFC General Program and Young Scientists Fund projects and 1 subtask of a National Key R&D Program project. Awards include a Second Prize in Natural Science of the Higher Education Outstanding Scientific Research Achievement Award and 3 Second Prizes of the Jiangsu Science and Technology Award; 20 granted invention patents; 43 papers in CAS Zone 1 and 2 journals, including TEVC, TCYB, TNNLS and TME, 2 of which are ESI top-1% highly cited papers.
WANG Bin, master's student; main research interests: knowledge graphs and natural language processing.
GONG Dunwei, professor; main research interests: intelligent optimization and control. Principal investigator of 1 National Key R&D Program project, 1 NSFC key project, 6 NSFC General Program and Young Scientists Fund projects, 1 subtask of the National Key Basic Research Program (973 Program), and 1 subtask of a National Key R&D Program project. Awards include a Second Prize in Natural Science of the Higher Education Outstanding Scientific Research Achievement Award and 3 Second Prizes of the Jiangsu Science and Technology Award (ranked first on all); 26 granted invention patents; 86 papers in CAS Zone 1 and 2 journals, including IEEE TSE, TEVC, TCYB, TNNLS, TASE, TR, ACM TOSEM and ECJ, 15 of which are ESI top-1% highly cited papers.
Corresponding author: GONG Dunwei. E-mail: dwgong@vip.163.com