AAAI Publications, Thirty-First AAAI Conference on Artificial Intelligence

Scalable Graph Embedding for Asymmetric Proximity
Chang Zhou, Yuqiong Liu, Xiaofei Liu, Zhongyi Liu, Jun Gao

Last modified: 2017-02-13

Abstract


Graph embedding methods map each vertex into a low-dimensional vector space that preserves certain structural relationships among the vertices of the original graph. Recently, several methods have been proposed that learn embeddings from sampled paths, e.g., DeepWalk, LINE, and Node2Vec. However, these methods preserve only symmetric proximities, which can be insufficient in many applications, even when the underlying graph is undirected. Moreover, they lack a theoretical analysis of exactly which relationships they preserve in the embedding space. In this paper, we propose an asymmetric proximity preserving (APP) graph embedding method based on random walk with restart, which captures both asymmetric and high-order similarities between node pairs. We give a theoretical analysis showing that our method implicitly preserves the Rooted PageRank score for any two vertices. We conduct extensive experiments on link prediction and node recommendation tasks with open-source datasets, as well as on online recommendation services at Alibaba Group, where the training graph has over 290 million vertices and 18 billion edges, showing our method to be highly scalable and effective.
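To illustrate the sampling idea the abstract describes, the following is a minimal sketch (not the authors' implementation) of generating asymmetric training pairs via random walk with restart: from each start vertex the walk stops at each step with a restart probability, and the stopping vertex becomes the target of an ordered (source, target) pair. The graph representation, the restart probability `alpha`, and the per-node sample count are all illustrative assumptions.

```python
import random

def sample_pairs(graph, alpha=0.15, samples_per_node=10, seed=0):
    """Return ordered (source, target) pairs via random walk with restart.

    graph: dict mapping vertex -> list of out-neighbors (directed edges).
    A walk from s stops at each step with probability alpha; the stopping
    vertex t yields the ordered pair (s, t). Since (s, t) and (t, s) are
    sampled independently, the resulting proximity is asymmetric.
    """
    rng = random.Random(seed)
    pairs = []
    for s in graph:
        for _ in range(samples_per_node):
            cur = s
            while True:
                neighbors = graph.get(cur)
                if not neighbors:
                    break  # dead end: stop the walk here
                cur = rng.choice(neighbors)
                if rng.random() < alpha:
                    break  # restart event: current vertex is the target
            if cur != s:
                pairs.append((s, cur))
    return pairs

# Toy directed graph: a->b, a->c, b->c, c->a.
g = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
pairs = sample_pairs(g)
```

In a full method along these lines, each vertex would get two embedding vectors (one as source, one as target), trained on such pairs with negative sampling; the stopping distribution of this walk is what connects the samples to Rooted PageRank.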

Keywords


Graph Embedding, Dimensionality Reduction, Recommendation
