Title

Neural Graph Embedding for Neural Architecture Search

Author

Wei Li, Shaogang Gong, Xiatian Zhu

Abstract

Existing neural architecture search (NAS) methods often operate in discrete or continuous spaces directly, which ignores the graphical topology knowledge of neural networks. This leads to suboptimal search performance and efficiency, given the fact that neural networks are essentially directed acyclic graphs (DAGs). In this work, we address this limitation by introducing a novel idea of neural graph embedding (NGE). Specifically, we represent the building block (i.e., the cell) of neural networks with a neural DAG, and learn it by leveraging a Graph Convolutional Network to propagate and model the intrinsic topology information of network architectures. This results in a generic neural network representation integrable with different existing NAS frameworks. Extensive experiments show the superiority of NGE over the state-of-the-art methods on image classification and semantic segmentation.
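
The abstract's core idea, a cell encoded as a DAG whose node features are propagated by a Graph Convolutional Network into a single architecture embedding, can be sketched roughly as follows. This is a minimal illustration under assumed details (toy operation vocabulary, one-hot node features, mean pooling); none of the names or sizes come from the paper.

```python
import numpy as np

# Toy cell: nodes carry candidate operations, edges follow the cell's DAG.
rng = np.random.default_rng(0)
op_vocab = ["conv3x3", "conv5x5", "maxpool", "skip"]   # illustrative operation set
num_nodes = 4

# One-hot operation features per node (N x F).
ops = rng.integers(0, len(op_vocab), size=num_nodes)
X = np.eye(len(op_vocab))[ops]

# Strictly upper-triangular adjacency keeps the cell a DAG (N x N).
A = np.triu(rng.integers(0, 2, size=(num_nodes, num_nodes)), k=1).astype(float)

def gcn_layer(A, H, W):
    """One graph-convolution step: add self-loops, normalize, linear map, ReLU."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(0.0, d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W)

# Two propagation steps, then mean-pool node states into one cell embedding
# that a NAS controller or performance predictor could consume.
W1 = 0.1 * rng.standard_normal((len(op_vocab), 16))
W2 = 0.1 * rng.standard_normal((16, 16))
H = gcn_layer(A, X, W1)
H = gcn_layer(A, H, W2)
cell_embedding = H.mean(axis=0)

print(cell_embedding.shape)  # (16,)
```

The pooled vector plays the role of the "generic neural network representation" mentioned in the abstract: it is what a downstream NAS framework would score or search over.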

Bib

@article{Li_Gong_Zhu_2020,
  title        = {Neural Graph Embedding for Neural Architecture Search},
  author       = {Li, Wei and Gong, Shaogang and Zhu, Xiatian},
  journal      = {Proceedings of the AAAI Conference on Artificial Intelligence},
  volume       = {34},
  number       = {04},
  pages        = {4707--4714},
  year         = {2020},
  month        = {Apr.},
  url          = {https://ojs.aaai.org/index.php/AAAI/article/view/5903},
  doi          = {10.1609/aaai.v34i04.5903},
  abstractNote = {Existing neural architecture search (NAS) methods often operate in discrete or continuous spaces directly, which ignores the graphical topology knowledge of neural networks. This leads to suboptimal search performance and efficiency, given the fact that neural networks are essentially directed acyclic graphs (DAGs). In this work, we address this limitation by introducing a novel idea of neural graph embedding (NGE). Specifically, we represent the building block (i.e., the cell) of neural networks with a neural DAG, and learn it by leveraging a Graph Convolutional Network to propagate and model the intrinsic topology information of network architectures. This results in a generic neural network representation integrable with different existing NAS frameworks. Extensive experiments show the superiority of NGE over the state-of-the-art methods on image classification and semantic segmentation.}
}