SNEQ: Semi-Supervised Attributed Network Embedding with Attention-Based Quantisation

  • Tao He Monash University
  • Lianli Gao University of Electronic Science and Technology of China
  • Jingkuan Song University of Electronic Science and Technology of China
  • Xin Wang Tianjin University
  • Kejie Huang Zhejiang University
  • Yuanfang Li Monash University

Abstract

Learning accurate low-dimensional embeddings for a network is a crucial task, as it facilitates many network analytics tasks. Moreover, trained embeddings often require a significant amount of space to store, making storage and processing a challenge, especially as large-scale networks become more prevalent. In this paper, we present a novel semi-supervised network embedding and compression method, SNEQ, that is competitive with state-of-the-art embedding methods while being far more space- and time-efficient. SNEQ incorporates a novel quantisation method based on a self-attention layer that is trained in an end-to-end fashion and dramatically compresses the size of the trained embeddings, thus reducing the storage footprint and accelerating retrieval. Our evaluation on four real-world networks of diverse characteristics shows that SNEQ outperforms a number of state-of-the-art embedding methods in link prediction, node classification and node recommendation. Moreover, the quantised embeddings offer substantial advantages in storage and retrieval time over both continuous embeddings and hashing methods.
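To illustrate the general idea of attention-based quantisation described above, the following is a minimal NumPy sketch, not the authors' exact architecture: each embedding is split into sub-vectors, each sub-vector attends over a learned codebook (softmax over similarity scores), and the soft, differentiable output used during end-to-end training is an attention-weighted mixture of codewords. At inference, only the argmax code indices are stored, compressing each node to a few integers. All names and shapes here are illustrative assumptions.

```python
import numpy as np

def attention_quantise(embeddings, codebooks):
    """Sketch of attention-style soft quantisation (illustrative, not SNEQ's exact layer).

    embeddings: (n, M, d) -- n nodes, each split into M sub-vectors of dim d
    codebooks:  (M, K, d) -- one codebook of K codewords per sub-vector
    Returns a differentiable soft reconstruction and discrete codes.
    """
    # Attention scores between each sub-vector and its codebook's codewords: (n, M, K)
    scores = np.einsum('nmd,mkd->nmk', embeddings, codebooks)
    # Softmax over the K codewords (numerically stabilised)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Soft quantised output used during training: attention-weighted codewords, (n, M, d)
    soft = np.einsum('nmk,mkd->nmd', weights, codebooks)
    # Discrete codes stored at inference time: (n, M) integers in [0, K)
    codes = scores.argmax(axis=-1)
    return soft, codes
```

With M codebooks of K codewords each, a node is stored as M indices (M·log2(K) bits) instead of a dense float vector, which is where the storage and retrieval-time savings come from.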

Published
2020-04-03
Section
AAAI Technical Track: Machine Learning