Neural Graph Embedding for Neural Architecture Search

Authors

  • Wei Li Queen Mary University of London
  • Shaogang Gong Queen Mary University of London
  • Xiatian Zhu University of Surrey

DOI:

https://doi.org/10.1609/aaai.v34i04.5903

Abstract

Existing neural architecture search (NAS) methods often operate directly in discrete or continuous spaces, ignoring the graphical topology knowledge of neural networks. This leads to suboptimal search performance and efficiency, given the fact that neural networks are essentially directed acyclic graphs (DAGs). In this work, we address this limitation by introducing a novel idea of neural graph embedding (NGE). Specifically, we represent the building block (i.e., the cell) of neural networks with a neural DAG, and learn it by leveraging a Graph Convolutional Network to propagate and model the intrinsic topology information of network architectures. This results in a generic neural network representation integrable with different existing NAS frameworks. Extensive experiments show the superiority of NGE over the state-of-the-art methods on image classification and semantic segmentation.
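The core operation described above — propagating topology information over a cell's DAG with a Graph Convolutional Network — can be illustrated with a minimal sketch. This is a generic GCN layer in the style of Kipf and Welling applied to a toy 4-node cell, not the authors' actual NGE implementation; the adjacency matrix, feature dimensions, and weight initialization here are illustrative assumptions.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: ReLU(D^-1/2 (A+I) D^-1/2 · H · W).
    Illustrative sketch only; the paper's NGE module may differ in detail."""
    A_tilde = A + np.eye(A.shape[0])          # add self-loops
    d = A_tilde.sum(axis=1)                   # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # symmetric normalization
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt
    return np.maximum(A_hat @ H @ W, 0.0)     # linear transform + ReLU

# Toy cell DAG: two input nodes feeding two operation nodes (hypothetical).
A = np.array([[0, 0, 1, 1],
              [0, 0, 1, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

rng = np.random.default_rng(0)
H = np.eye(4)                          # one-hot initial node (op) features
W = rng.standard_normal((4, 8))        # learnable projection to 8-dim embeddings
emb = gcn_layer(A, H, W)
print(emb.shape)  # (4, 8): one topology-aware embedding per cell node
```

Stacking several such layers lets each node's embedding absorb information from multi-hop neighbors in the cell graph, which is what makes the resulting architecture representation topology-aware.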

Published

2020-04-03

How to Cite

Li, W., Gong, S., & Zhu, X. (2020). Neural Graph Embedding for Neural Architecture Search. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 4707-4714. https://doi.org/10.1609/aaai.v34i04.5903

Section

AAAI Technical Track: Machine Learning