[1] GUO Yinan, WANG Bin, GONG Dunwei, et al. Multi-layer attention knowledge representation learning by integrating entity structure with semantics[J]. CAAI Transactions on Intelligent Systems, 2023, 18(3): 577-588. [doi:10.11992/tis.202204026]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785/CN 23-1538/TP]
- Volume:
-
18
- Issue:
-
2023(3)
- Pages:
-
577-588
- Column:
-
Academic Papers - Knowledge Engineering
- Publication date:
-
2023-07-05
- Title:
-
Multi-layer attention knowledge representation learning by integrating entity structure with semantics
- Author(s):
-
GUO Yinan1,2; WANG Bin1; GONG Dunwei1; YU Zekuan3
-
1. School of Information and Control Engineering, China University of Mining and Technology, Jiangsu 221116, China;
2. School of Mechatronics and Information Engineering, China University of Mining and Technology (Beijing), Beijing 100083, China;
3. Institute of Engineering and Applied Technology, Fudan University, Shanghai 200433, China
-
- Keywords:
-
knowledge representation learning; entity structure embedding; semantic information; attention mechanism; knowledge graph; knowledge reasoning; complex entity description; Transformer
- CLC:
-
TP305
- DOI:
-
10.11992/tis.202204026
- Abstract:
-
Knowledge representation learning based on a knowledge graph can obtain the structure and relation embeddings of entities, but it makes little use of the semantic information in entity description texts. Moreover, as the scale of a knowledge graph grows, the categories and quantities of entities and their relations, as well as the contents and sources of entity descriptions, increase accordingly, making it harder to align entity text descriptions with triple structure information. Therefore, this paper presents a multi-layer attention knowledge representation learning method that integrates entity structure with semantics. A multi-layer attention mechanism is constructed in which the structural embeddings of entities enhance the semantic expression of entity descriptions; the semantic relation of an entity description is then obtained by a Transformer model and further enhanced and fused with the structural embeddings of relations. Finally, the fused semantic relation is used to enrich the relation embedding set. Specifically, a loss function is constructed for the multi-layer attention mechanism that integrates entity structure with semantic information. Experimental results show that the proposed method can effectively infer hidden links between entities with complex entity descriptions, and achieves higher accuracy than other similar methods on triple classification tasks.
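As a rough illustration only (not the authors' implementation, whose details are in the paper itself), the first fusion step the abstract describes could be sketched as follows: an entity's structural embedding acts as the attention query over the token embeddings of its description, and the attended semantic summary is then combined with the relation's structural embedding. All names, dimensions, and the gated-sum fusion below are illustrative assumptions; the paper uses a Transformer-based fusion instead.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, tokens):
    """Scaled dot-product attention with a single query vector.
    query: (d,), tokens: (n, d) -> attended summary of shape (d,)."""
    d = query.shape[0]
    scores = tokens @ query / np.sqrt(d)   # (n,) similarity scores
    weights = softmax(scores)              # attention weights over description tokens
    return weights @ tokens                # weighted sum of token embeddings

rng = np.random.default_rng(0)
d = 8
head_struct = rng.normal(size=d)           # structural embedding of the head entity (hypothetical)
desc_tokens = rng.normal(size=(5, d))      # token embeddings of its description text (hypothetical)
rel_struct = rng.normal(size=d)            # structural embedding of the relation (hypothetical)

# Step 1: structure-guided attention over the entity description,
# so the structural embedding "enhances" the description semantics.
head_sem = attend(head_struct, desc_tokens)

# Step 2: fuse description semantics with the relation's structural embedding.
# A simple sigmoid-gated sum stands in for the paper's Transformer-based fusion.
gate = 1.0 / (1.0 + np.exp(-(head_sem @ rel_struct) / np.sqrt(d)))
fused_rel = gate * rel_struct + (1.0 - gate) * head_sem

print(fused_rel.shape)
```

The fused vector `fused_rel` corresponds to the "integrated semantic relation" used to enrich the relation embedding set; a training loss over such fused representations is what the paper's multi-layer attention objective optimizes.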