HGATE: Heterogeneous Graph Attention Auto-Encoders

Wei WANG, Xiangyu WEI, Xiaoyang SUO, Bin WANG, Hao WANG, Hong-ning DAI, Xiangliang ZHANG

Research output: Journal Publications > Journal Article (refereed) > peer-review

14 Citations (Scopus)

Abstract

The graph auto-encoder is a framework for unsupervised learning on graph-structured data that represents graphs in a low-dimensional space, and it has proven very powerful for graph analytics. In the real world, complex relationships among various entities can be represented by heterogeneous graphs, which carry richer semantic information than homogeneous graphs. In general, graph auto-encoders designed for homogeneous graphs are not applicable to heterogeneous graphs. In addition, little work has evaluated the effect of different semantics on node embedding in heterogeneous graphs for unsupervised graph representation learning. In this work, we propose a novel Heterogeneous Graph Attention Auto-Encoder (HGATE) for unsupervised representation learning on heterogeneous graph-structured data. Taking semantic information into account, the HGATE architecture reconstructs not only the edges of the heterogeneous graph but also the node attributes, through stacked encoder/decoder layers. Hierarchical attention is used to learn both the relevance between a node and its meta-path based neighbors and the relevance among different meta-paths. HGATE is applicable to transductive as well as inductive learning. Node classification and link prediction experiments on real-world heterogeneous graph datasets demonstrate the effectiveness of HGATE for both transductive and inductive tasks.
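The abstract describes a hierarchical-attention auto-encoder: node-level attention over a node's meta-path based neighbors, semantic-level attention across meta-paths, and decoders that reconstruct both node attributes and edges. The PyTorch sketch below is a minimal illustration of that idea only; it is not the authors' HGATE implementation, and the class names, layer sizes, and combined loss are assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NodeLevelAttention(nn.Module):
    """GAT-style attention of a node over its meta-path based neighbors."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: (N, in_dim) node attributes; adj: (N, N) 0/1 meta-path adjacency
        h = self.proj(x)
        n = h.size(0)
        pair = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                          h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(self.attn(pair).squeeze(-1))      # raw attention scores
        e = e.masked_fill(adj == 0, float("-inf"))         # keep neighbors only
        alpha = torch.softmax(e, dim=-1)
        alpha = torch.nan_to_num(alpha)                    # guard isolated nodes
        return F.elu(alpha @ h)                            # (N, out_dim)


class SemanticLevelAttention(nn.Module):
    """Learns the relevance of each meta-path and fuses their embeddings."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                   nn.Linear(hidden, 1, bias=False))

    def forward(self, z_list):
        z = torch.stack(z_list, dim=1)                      # (N, M, dim)
        beta = torch.softmax(self.score(z).mean(0), dim=0)  # (M, 1) weights
        return (beta.unsqueeze(0) * z).sum(dim=1)           # (N, dim)


class HierarchicalAttentionAE(nn.Module):
    """Encoder with hierarchical attention; decoders for attributes and edges."""

    def __init__(self, in_dim, hid_dim, num_metapaths):
        super().__init__()
        self.node_att = nn.ModuleList(
            [NodeLevelAttention(in_dim, hid_dim) for _ in range(num_metapaths)])
        self.sem_att = SemanticLevelAttention(hid_dim)
        self.attr_decoder = nn.Linear(hid_dim, in_dim)      # attribute reconstruction

    def forward(self, x, adjs):
        z = self.sem_att([att(x, a) for att, a in zip(self.node_att, adjs)])
        x_hat = self.attr_decoder(z)                        # reconstructed attributes
        a_hat = torch.sigmoid(z @ z.t())                    # reconstructed edges
        return z, x_hat, a_hat


# Toy usage: 6 nodes with 8-dim attributes and two meta-path adjacency matrices.
x = torch.randn(6, 8)
adjs = [((torch.rand(6, 6) < 0.4).float() + torch.eye(6)).clamp(max=1)
        for _ in range(2)]
model = HierarchicalAttentionAE(in_dim=8, hid_dim=16, num_metapaths=2)
z, x_hat, a_hat = model(x, adjs)
# Joint loss: attribute reconstruction + edge reconstruction on one meta-path.
loss = F.mse_loss(x_hat, x) + F.binary_cross_entropy(a_hat, adjs[0])
loss.backward()
```

The inner-product edge decoder and the MSE attribute loss used here are common choices for graph auto-encoders; HGATE's actual decoder layers and training objective may differ from this sketch.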
Original language: English
Pages (from-to): 3938-3951
Number of pages: 14
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 35
Issue number: 4
Early online date: 28 Dec 2021
DOIs
Publication status: Published - 1 Apr 2023

Bibliographical note

Publisher Copyright:
© 1989-2012 IEEE.

Funding

The work was supported in part by the National Key R&D Program of China under Grant 2020YFB2103802, in part by the National Natural Science Foundation of China under Grant U21A20463, in part by the Fundamental Research Funds for the Central Universities under Grant KKJB320001536, in part by the Macao Science and Technology Development Fund (Macao Funding Scheme for Key R&D Projects) under Grant 0025/2019/AKP, and in part by the Research Initiation Project of Zhejiang Lab under Grant 113012-PI2013.

Keywords

  • Graph embedding representation
  • heterogeneous graphs
  • hierarchical attention
  • inductive learning
  • transductive learning
