Hierarchical graph attention network for temporal knowledge graph reasoning
Author: Shao PP (邵朋朋)
Journal: Neurocomputing
Year: 2023
Pages: 126390
Abstract:

Temporal knowledge graph (TKG) reasoning has attracted increasing research interest in recent years. However, most existing TKG reasoning models aim to learn dynamic entity representations by binding timestamp information to the entities, and neglect to learn adaptive entity representations for the query from relevant historical facts. To this end, we propose a Hierarchical Graph Attention neTwork (HGAT) for the TKG reasoning task. Specifically, we design a hierarchical neighbor encoder to model the time-oriented and task-oriented roles of entities. A time-aware mechanism in the first layer differentiates the contributions of query-relevant historical facts at different timestamps, and a relation-aware attention in the second layer discerns the contributions of an entity's structural neighbors. Through this hierarchical encoder, our model effectively absorbs valuable knowledge from relevant historical facts and thus learns a more expressive, adaptive entity representation for the query. Finally, we evaluate our model on four TKG datasets and demonstrate its superiority over various state-of-the-art baselines.
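The two-level design described above (relation-aware attention over an entity's structural neighbors and time-aware attention over historical timestamps) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: all module and parameter names, the concatenation-based scoring functions, and the composition order (neighbors aggregated within each snapshot first, then snapshots weighted across time) are assumptions made only for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationAwareAttention(nn.Module):
    # Illustrative second layer: aggregate an entity's structural neighbors,
    # scoring each neighbor by the relation that links it to the entity.
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(3 * dim, 1)  # [entity ; relation ; neighbor] -> scalar

    def forward(self, ent, rel_emb, nbr_emb):
        # ent: (dim,); rel_emb, nbr_emb: (num_neighbors, dim)
        ent_rep = ent.unsqueeze(0).expand_as(nbr_emb)
        alpha = F.softmax(self.score(torch.cat([ent_rep, rel_emb, nbr_emb], dim=-1)), dim=0)
        return (alpha * nbr_emb).sum(dim=0)  # per-timestamp snapshot summary, (dim,)

class TimeAwareAttention(nn.Module):
    # Illustrative first layer: weight per-timestamp summaries by their
    # relevance to the query.
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(2 * dim, 1)  # [query ; snapshot summary] -> scalar

    def forward(self, query, snapshots):
        # query: (dim,); snapshots: (num_timestamps, dim)
        q = query.unsqueeze(0).expand_as(snapshots)
        beta = F.softmax(self.score(torch.cat([q, snapshots], dim=-1)), dim=0)
        return (beta * snapshots).sum(dim=0)  # adaptive entity representation, (dim,)

class HierarchicalEncoder(nn.Module):
    # One plausible composition (an assumption, not confirmed by the abstract):
    # aggregate neighbors within each historical snapshot, then aggregate the
    # resulting summaries across timestamps with respect to the query.
    def __init__(self, dim):
        super().__init__()
        self.relation_att = RelationAwareAttention(dim)
        self.time_att = TimeAwareAttention(dim)

    def forward(self, ent, query, history):
        # history: list of (rel_emb, nbr_emb) pairs, one pair per timestamp
        snapshots = torch.stack([self.relation_att(ent, r, n) for r, n in history])
        return self.time_att(query, snapshots)

# Example usage with random embeddings (dimensions are illustrative):
dim = 32
enc = HierarchicalEncoder(dim)
history = [(torch.randn(5, dim), torch.randn(5, dim)),   # 5 neighbor facts at t1
           (torch.randn(3, dim), torch.randn(3, dim))]   # 3 neighbor facts at t2
adaptive_rep = enc(torch.randn(dim), torch.randn(dim), history)  # shape: (dim,)

The sketch keeps the two attention levels separate so that each softmax normalizes over a single source of variation: relations within a timestamp, and timestamps within the history.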

Language: English
Content type: Journal article
Source URL: http://ir.ia.ac.cn/handle/173211/52295
Collection: Institute of Automation_National Laboratory of Pattern Recognition_Pattern Analysis and Learning Group
Author affiliations:
1. Department of Automation, Tsinghua University
2. The State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences
Recommended citation:
GB/T 7714: Shao PP. Hierarchical graph attention network for temporal knowledge graph reasoning[J]. Neurocomputing, 2023: 126390.
APA: Shao PP. (2023). Hierarchical graph attention network for temporal knowledge graph reasoning. Neurocomputing, 126390.
MLA: Shao PP. "Hierarchical graph attention network for temporal knowledge graph reasoning." Neurocomputing (2023): 126390.