Journal of Systems Engineering and Electronics ›› 2023, Vol. 34 ›› Issue (2): 350-359. doi: 10.23919/JSEE.2023.000057

• SYSTEMS ENGINEERING •

DHSEGATs: distance and hop-wise structures encoding enhanced graph attention networks

Zhiguo HUANG1,2,*()   

  1. Sci-Tech Academy, Zhejiang University, Hangzhou 310030, China
    2. Post-Doctoral Research Center, Hundsun Incorporated, Hangzhou 310038, China
  • Received: 2022-08-15 Online: 2023-04-18 Published: 2023-04-18
  • Contact: Zhiguo HUANG E-mail: hzg0601@163.com
  • About author:
    HUANG Zhiguo was born in 1989. He received his Ph.D. degree in actuarial science from Nankai University, Tianjin, China, in 2019. Since 2019, he has been a post-doctoral researcher at the Sci-Tech Academy of Zhejiang University and the Post-Doctoral Research Center of Hundsun Incorporated, Hangzhou, Zhejiang, China. He is the author of more than 10 articles. His research interests include graph neural network algorithms, network embedding, and graph neural networks for risk management. E-mail: hzg0601@163.com

Abstract:

Numerous works show that existing neighbor-averaging graph neural networks (GNNs) cannot efficiently capture structural features, and many works demonstrate that injecting structure, distance, position, or spatial features can significantly improve GNN performance. However, injecting high-level structure and distance information into GNNs is an intuitive but largely unexplored idea. This work addresses the issue and proposes a scheme to enhance graph attention networks (GATs) by encoding distance and hop-wise structure statistics. Firstly, hop-wise structure and distributional distance information are extracted from several hop-wise ego-nets of each target node. Secondly, the derived structure information, distance information, and intrinsic node features are encoded into the same vector space and summed to obtain initial embedding vectors. Thirdly, these embedding vectors are fed into GAT-style models, such as GAT and the adaptive graph diffusion network (AGDN), to produce soft labels. Fourthly, the soft labels are fed into correct and smooth (C&S), which conducts label propagation and yields the final predictions. Experiments show that the distance and hop-wise structures encoding enhanced graph attention networks (DHSEGATs) achieve competitive results.
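The first two steps of the pipeline can be sketched in a minimal form. The snippet below is an illustrative approximation, not the paper's implementation: it uses the k-hop ego-net size as a stand-in for the hop-wise structure statistics, a normalized version of those counts as a crude proxy for the distributional distance information, and randomly initialized linear maps (stand-ins for learned encoders) to project each information source into a shared vector space before summing. All function names and the toy graph are hypothetical.

```python
import numpy as np

def hop_wise_structure_stats(adj, num_hops=2):
    """Count nodes reachable within each hop (a simple ego-net size statistic)."""
    n = adj.shape[0]
    reach = np.eye(n)  # each node reaches itself at hop 0
    stats = []
    for _ in range(num_hops):
        # expand reachability by one hop, then binarize
        reach = ((reach + reach @ adj) > 0).astype(float)
        stats.append(reach.sum(axis=1))  # ego-net size at this hop
    return np.stack(stats, axis=1)  # shape (n, num_hops)

def encode_and_sum(features, structure, distance, dim, rng):
    """Project each information source into the same dim-dimensional space
    with (randomly initialized, stand-in) linear encoders, then add them
    to form the initial embedding vectors."""
    def proj(x):
        w = rng.standard_normal((x.shape[1], dim)) / np.sqrt(x.shape[1])
        return x @ w
    return proj(features) + proj(structure) + proj(distance)

# toy 4-node path graph: 0-1-2-3
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 3))        # intrinsic node features
struct = hop_wise_structure_stats(adj, 2)  # hop-wise ego-net sizes
dist = struct / struct.sum(axis=1, keepdims=True)  # proxy distance distribution
emb = encode_and_sum(feats, struct, dist, dim=8, rng=rng)
print(emb.shape)  # (4, 8): one initial embedding vector per node
```

In the full method these embedding vectors would then be passed to a GAT or AGDN backbone, whose soft-label outputs are post-processed with C&S.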

Key words: graph attention network (GAT), graph structure information, label propagation