This document organizes the following three graph surveys. It mainly covers the taxonomy of GNN models, applications of GNNs in computer vision, and my own understanding of various details. There may be errors; corrections and suggestions are welcome!

  1. Deep Learning on Graphs: A Survey
  2. Graph Neural Networks: A Review of Methods and Applications
  3. A Comprehensive Survey on Graph Neural Networks

Summary resources

Summaries on GitHub that introduce related papers, blogs, and researchers:

https://github.com/sungyongs/graph-based-nn
https://github.com/thunlp/GNNPapers
A list of spatio-temporal modeling papers (mainly related to graph convolution): https://mp.weixin.qq.com/s/xgf7A3GFh1cIM2QhaCyyoA

Survey papers ★

  1. Deep Learning on Graphs: A Survey
    [新智元 commentary]
  2. Graph Neural Networks: A Review of Methods and Applications
    [新智元 commentary]
  3. A Comprehensive Survey on Graph Neural Networks
  4. Relational inductive biases, deep learning, and graph networks
  5. Geometric Deep Learning: Going beyond Euclidean data
  6. Computational Capabilities of Graph Neural Networks
  7. Neural Message Passing for Quantum Chemistry
  8. Non-local Neural Networks
  9. The Graph Neural Network Model

Libraries

  • PyTorch Geometric, a geometric deep learning library [github] in PyTorch, which implements several graph neural networks including ChebNet, 1stChebNet, GraphSAGE, MPNNs, GAT and SplineCNN.
  • Deep Graph Library (DGL) [website] [github] provides fast implementations of many graph neural networks, with a set of functions built on top of popular deep learning platforms such as PyTorch and MXNet.
  • graph_nets [github]

Development of spectral graph convolution: Spectral-Based Graph Convolutional Networks

  • The following four papers are in chronological order; each improves on the previous one ★
  1. The Emerging Field of Signal Processing on Graphs
  2. Spectral Networks and Locally Connected Networks on Graphs
  3. Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering, [PyTorch Code] [TF Code]
  4. Semi-Supervised Classification with Graph Convolutional Networks, [Code], [Blog]
  • The following three papers are the other three mentioned in the survey "A Comprehensive Survey on Graph Neural Networks"
  1. Deep convolutional networks on graph-structured data
  2. Adaptive graph convolutional neural networks (AAAI 2018): accepts graphs of arbitrary structure and size as input
  3. Cayleynets: Graph convolutional neural networks with complex rational spectral filters
  • Drawbacks of spectral graph convolutional networks:
  • by "A Comprehensive Survey on Graph Neural Networks"
  • spectral methods usually handle the whole graph simultaneously and are difficult to parallelize or scale to large graphs (P2)
  • more drawbacks: -- (P7 4.1.3 summary of spectral methods)
  1. Any perturbation of the graph results in a perturbation of the eigenbasis U (the eigenvectors)
  2. The learned filters are domain dependent and cannot be applied to a graph with a different structure
  3. Eigendecomposition requires substantial computation and memory
  4. Although the filters defined by ChebNet and 1stChebNet are localized in space and shared across all positions (nodes) of the graph, both models need to load the whole graph to compute the graph convolution, which is inefficient on big graphs. by yaya: with X'=AXW, computing the update X' requires the entire X as input; see the sketch below
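A minimal sketch (assumed PyTorch, illustrative names) of the X'=AXW update, showing why a full-graph model cannot update one node in isolation: computing any row of X' consumes the entire feature matrix X.

```python
import torch

def gcn_layer(A_hat, X, W):
    # X' = A_hat @ X @ W: one full-graph propagation step.
    # A_hat: (N, N) normalized adjacency, X: (N, F) node features,
    # W: (F, F_out) learnable weights. Even a single row of the result
    # depends on the whole of X, so the entire graph must be loaded,
    # which is the inefficiency described in point 4 above.
    return torch.relu(A_hat @ X @ W)

N, F, F_out = 5, 8, 16
A_hat = torch.rand(N, N)        # stand-in for a normalized adjacency matrix
X = torch.rand(N, F)
W = torch.rand(F, F_out, requires_grad=True)
X_new = gcn_layer(A_hat, X, W)  # shape (N, F_out)
```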

Spatial graph convolution: Spatial-Based Graph Convolutional Networks

  • by "A Comprehensive Survey on Graph Neural Networks"
  1. Graph Attention Network (GAT) (ICLR 2018) [tf code]
  2. Inductive representation learning on large graphs(GraphSAGE) [tf code]
    Instead of updating states over all nodes, GraphSAGE proposes a batch-training algorithm (sub-graph training), which improves scalability for large graphs. The learning process: P9 in "A Comprehensive Survey on Graph Neural Networks"
  3. Neural Message Passing for Quantum Chemistry (MPNNs)
  4. Learning convolutional neural networks for graphs (PATCHY-SAN)
  5. Geometric deep learning on graphs and manifolds using mixture model cnns (CVPR 2017)
  6. Large-scale learnable graph convolutional networks (LGCN) [tf code]
  7. Diffusion-convolutional neural networks (NeurIPS 2016) [tf code]
  8. etc.: the tables on P5 and P7 of "A Comprehensive Survey on Graph Neural Networks" list further spatial-based GCNs
  • Together with sampling strategies, the computation can be performed on a batch of nodes instead of the whole graph (GraphSAGE and LGCN); see the sketch below
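A hedged sketch of the sample-and-aggregate pattern behind GraphSAGE/LGCN as summarized above: each batch touches only sampled sub-graphs rather than the whole graph. A mean aggregator with fixed fan-out is assumed; all names are illustrative, not the papers' reference code.

```python
import random
import torch

def sample_neighbors(adj_list, node, fanout):
    # fixed-size neighborhood sample (with replacement), so the cost per
    # node is bounded regardless of its true degree
    neigh = adj_list[node]
    return random.choices(neigh, k=fanout) if neigh else [node]

def sage_layer(adj_list, X, W_self, W_neigh, batch, fanout=5):
    out = []
    for v in batch:
        neigh = sample_neighbors(adj_list, v, fanout)
        h_neigh = X[neigh].mean(dim=0)         # mean aggregator
        h = X[v] @ W_self + h_neigh @ W_neigh  # combine self and neighborhood
        out.append(torch.relu(h))
    return torch.stack(out)                    # (len(batch), F_out)

adj_list = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}  # toy 4-node graph
X = torch.rand(4, 8)
W_self, W_neigh = torch.rand(8, 16), torch.rand(8, 16)
H = sage_layer(adj_list, X, W_self, W_neigh, batch=[0, 2])
```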

Comparing spectral and spatial GCNs: Comparison Between Spectral and Spatial Models

  • by "A Comprehensive Survey on Graph Neural Networks"
  • bridges: The graph convolution defined by 1stChebNet (semi-supervised GCN) is localized in space. It bridges the gap between spectral-based methods and spatial-based methods (its propagation rule is written out after this list). -- (P2)
  • Drawbacks of spectral-based models, illustrated below from three aspects: efficiency, generality and flexibility. -- (P11)
  1. Efficiency
    Spectral-based models either need to compute eigenvectors or handle the whole graph at once, so their computational cost grows dramatically with the graph size.
    Spatial-based models perform the convolution directly in the graph domain by aggregating the features of neighboring nodes, so they have the potential to handle large graphs. Moreover, the computation can be performed on a batch of nodes rather than the whole graph, and as the number of neighbors grows, sampling strategies can improve efficiency; see "Training Methods" below.
  2. Generality
    Spectral-based models assume a fixed graph during training and generalize poorly to new or different graphs.
    Spatial-based models perform the graph convolution per node, so the learned weights can easily be shared across other nodes and graphs.
  3. Flexibility
    Spectral-based models are limited to undirected graphs; there is no clear definition of the Laplacian matrix on directed graphs, so applying spectral methods to a directed graph requires first converting it to an undirected one.
    Spatial-based models handle multi-source inputs more flexibly, e.g. edge features, edge directions, etc.
    For edge features, see "Input Allow Edge Features" below.
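Concretely, the bridging layer of 1stChebNet (the semi-supervised GCN of Kipf & Welling) is the propagation rule below: although derived as a first-order spectral filter, each layer reduces to a one-hop spatial aggregation, i.e. the X'=AXW form used throughout these notes, with A the renormalized adjacency.

```latex
H^{(l+1)} = \sigma\left( \tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}} H^{(l)} W^{(l)} \right),
\qquad \tilde{A} = A + I_N, \qquad \tilde{D}_{ii} = \textstyle\sum_j \tilde{A}_{ij}
```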

Improving the training of GCNs: Training Methods ★

  • by "A Comprehensive Survey on Graph Neural Networks"
  • Comparison Between Spectral and Spatial Models -- (P11)
  • 1stChebNet (semi-supervised GCN): the main drawback of 1stChebNet is that the computation cost increases exponentially with the number of layers during batch training, since each node in the last layer has to expand its neighborhood recursively across previous layers.
  1. Fastgcn: fast learning with graph convolutional networks via importance sampling (ICLR 2018)
    assumes the rescaled adjacency matrix A comes from a sampling distribution.
  2. Stochastic training of graph convolutional networks with variance reduction (ICML 2018)
    reduces the receptive field of the graph convolution to an arbitrarily small scale by sampling neighborhoods and using historical hidden representations.
  3. Adaptive sampling towards fast graph representation learning (NeurIPS 2018)
    proposes an adaptive layer-wise sampling approach to accelerate the training of 1stChebNet, where sampling for the lower layer is conditioned on the layer above.
  • by "Graph Neural Networks: A Review of Methods and Applications"
  • Training Methods -- (P9)
  • GCN requires the full graph Laplacian, which is computationally expensive for large graphs. Furthermore, the embedding of a node at layer L is computed recursively from the embeddings of all its neighbors at layer L − 1, so the receptive field of a single node grows exponentially with the number of layers and computing the gradient for a single node is costly. Finally, GCN is trained independently on a fixed graph, which lacks the ability for inductive learning.
  1. Inductive representation learning on large graphs (NeurIPS 2017)
  2. Fastgcn: fast learning with graph convolutional networks via importance sampling (ICLR 2018)
    directly samples the receptive field for each layer.
  3. Adaptive sampling towards fast graph representation learning (NeurIPS 2018)
    introduces a parameterized and trainable sampler to perform layerwise sampling conditioned on the former layer.
  4. Stochastic training of graph convolutional networks with variance reduction (ICML 2018)
    proposes a control-variate-based stochastic approximation algorithm for GCN, utilizing the historical activations of nodes as a control variate.
  5. Deeper insights into graph convolutional networks for semi-supervised learning (arXiv:1801.07606, 2018)
  • by "Deep Learning on Graphs: A Survey"
  • Accelerating by Sampling -- (P8)
  1. Inductive representation learning on large graphs (NeurIPS 2017)
  2. Graph convolutional neural networks for web-scale recommender systems
  3. Fastgcn: fast learning with graph convolutional networks via importance sampling (ICLR 2018)
  4. Stochastic training of graph convolutional networks with variance reduction (ICML 2018)
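A rough sketch of the layer-wise sampling idea shared by FastGCN and the adaptive-sampling paper above: instead of expanding each node's neighborhood recursively, every layer aggregates from a fixed-size set of sampled nodes. A uniform proposal distribution is used here for simplicity (FastGCN derives an importance distribution from the rescaled adjacency matrix); all names are illustrative.

```python
import torch

def sampled_gcn_layer(A_hat, H, W, sample_size):
    # Monte-Carlo estimate of A_hat @ H @ W using only `sample_size`
    # nodes of the previous layer, so the per-layer receptive field
    # stays fixed instead of growing exponentially with depth.
    N = A_hat.shape[0]
    idx = torch.randint(0, N, (sample_size,))  # uniform proposal q(u) = 1/N
    # importance-weighted, unbiased estimator: scale by N / sample_size
    H_est = (N / sample_size) * (A_hat[:, idx] @ H[idx])
    return torch.relu(H_est @ W)

A_hat = torch.rand(100, 100)  # stand-in for the rescaled adjacency matrix
H = torch.rand(100, 8)
W = torch.rand(8, 16)
H_next = sampled_gcn_layer(A_hat, H, W, sample_size=10)  # (100, 16)
```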

Graph Attention Networks

  • by "A Comprehensive Survey on Graph Neural Networks"
  1. Graph Attention Network (GAT) (ICLR 2018) [tf code]
  2. GaAN: Gated attention networks for learning on large and spatiotemporal graphs
  3. Graph classification using structural attention(ACM SIGKDD 2018)
  4. Watch your step: Learning node embeddings via graph attention(NeurIPS 2018)

Gated Graph Neural Network

  • by "Graph Neural Networks: A Review of Methods and Applications"
  1. Gated graph sequence neural networks (arXiv 2016)
  2. Improved semantic representations from tree-structured long short-term memory networks (IJCNLP 2015)
  3. Conversation modeling on reddit using a graph-structured lstm (TACL 2018)
  4. Sentence-state lstm for text representation (ACL 2018)
  5. Semantic object parsing with graph lstm (ECCV 2016)

Residual and Jumping Connections/Skip Connections

  • by yaya: given that residual networks let CNNs improve performance by adding layers, residual connections are tried here likewise, so that performance improves as the network gets deeper. See "Go deeper?" below.
  • by "Deep Learning on Graphs: A Survey" -- (P7 Residual and Jumping Connections)
  1. Semi-supervised classification with graph convolutional networks (ICLR 2017)
  2. Column networks for collective classification (AAAI 2017)
  3. Representation learning on graphs with jumping knowledge networks (ICML 2018)
  • by "Graph Neural Networks: A Review of Methods and Applications" -- P9 Skip Connections
  1. Semi-supervised user geolocation via graph convolutional networks (ACL 2018)
  2. Representation learning on graphs with jumping knowledge networks (ICML 2018)

Graph Auto-encoders

  • by "A Comprehensive Survey on Graph Neural Networks"
  • Network embedding algorithms can be categorized into: 1. matrix factorization, 2. random walks, 3. deep learning; graph auto-encoders are a deep-learning approach. -- (P2)
  • Network embedding aims to map nodes into a low-dimensional vector space while preserving the network topology and node content information, so that subsequent graph analysis tasks (e.g. classification, clustering and recommendation) can be handled with existing machine learning methods (e.g. SVM for classification).
  1. Variational graph auto-encoders (GAE) [tkipf/code] [tf code]
    used for the link prediction task on citation networks
    the encoder updates the node embeddings and the decoder reconstructs A (the adjacency matrix); see the sketch after this list
  2. Adversarially regularized graph autoencoder for graph embedding (ARGA) [tf code]
  3. Learning deep network representations with adversarially regularized autoencoders (NetRA)
  4. Deep neural networks for learning graph representations (DNGR) [matlab code]
  5. Structural deep network embedding (SDNE) [python code]
  6. Deep recursive network embedding with regular equivalence (DRNE) [code: https://github.com/tadpole/DRNE]
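For item 1, a minimal (non-variational, for brevity) sketch of the auto-encoder pattern described above: a two-layer GCN encoder produces node embeddings Z, and an inner-product decoder reconstructs the adjacency matrix. Assumed PyTorch; names are illustrative.

```python
import torch

def encode(A_hat, X, W1, W2):
    # two-layer GCN encoder producing node embeddings Z
    H = torch.relu(A_hat @ X @ W1)
    return A_hat @ H @ W2

def decode(Z):
    # inner-product decoder: reconstructed adjacency sigmoid(Z Z^T)
    return torch.sigmoid(Z @ Z.t())

A_hat = torch.rand(6, 6)  # stand-in for a normalized adjacency matrix
X = torch.rand(6, 8)
W1, W2 = torch.rand(8, 16), torch.rand(16, 4)
Z = encode(A_hat, X, W1, W2)  # (6, 4) embeddings, e.g. for link prediction
A_rec = decode(Z)             # (6, 6), trained to match the true adjacency
```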

Graph Generative Networks ★

  • by "A Comprehensive Survey on Graph Neural Networks"
  • factor the generation process into forming nodes and edges alternately
  1. Graphrnn: A deep generative model for graphs (ICML2018) [tf code]
  2. Learning deep generative models of graphs (ICML2018)
  • employ generative adversarial training
  1. Molgan: An implicit generative model for small molecular graphs (arXiv:1805.11973 2018)
  2. NetGAN: Generating graphs via random walks (ICML 2018)

GCN Based Graph Spatial-Temporal Networks ★

  • by "A Comprehensive Survey on Graph Neural Networks"
  1. Diffusion convolutional recurrent neural network: Data-driven traffic forecasting (DCRNN) (ICLR 2018)
    applies the idea of graph convolution in DCRNN for spatial and temporal traffic-flow forecasting, achieving strong results
  2. Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting (CNN-GNN) (IJCAI 2017) [tf code]
  3. Spatial temporal graph convolutional networks for skeleton-based action recognition (ST-GCN) (AAAI 2018) [pytorch code]
  4. Structural-rnn:Deep learning on spatio-temporal graphs (Structural-RNN) (CVPR 2016) [theano code]
  • by yaya
  • Both of the following are on skeleton-based action recognition ★
  1. Skeleton-Based Action Recognition with Spatial Reasoning and Temporal Stack Learning (ECCV 2018)
  2. Spatial temporal graph convolutional networks for skeleton-based action recognition (ST-GCN) (AAAI 2018) [pytorch code]

Graph Recurrent Neural Networks

  • by "Deep Learning on Graphs: A Survey"
  1. Graphrnn: Generating realistic graphs with deep auto-regressive models (ICML 2018)
  2. Dynamic graph neural networks (arXiv preprint 2018)
  3. Geometric matrix completion with recurrent multi-graph neural networks (NeurIPS 2017)
  4. Dynamic graph convolutional networks (arXiv preprint 2017)
    Dynamic GCN applies an LSTM to gather the outputs of GCNs over different time slices of a dynamic network, aiming to capture both spatial and temporal graph information.
  • by yaya
  1. Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks
  2. Structured Sequence Modeling with Graph Convolutional Recurrent Networks

Graph Reinforcement Learning

  • by "Deep Learning on Graphs: A Survey"
  1. Graph convolutional policy network for goal-directed molecular graph generation (NeurIPS 2018)
  2. Molgan: An implicit generative model for small molecular graphs (arXiv preprint 2018)

GNNs whose input includes edge features: Input Allow Edge Features ★

  • by "A Comprehensive Survey on Graph Neural Networks"
  1. The graph neural network model(GNN) (2009)
  2. Neural message passing for quantum chemistry(MPNN) (2017)
  3. Diffusion-convolutional neural networks(DCNN) (2016)
  4. Learning convolutional neural networks for graphs(PATCHY-SAN) (2016)
  • by "Deep Learning on Graphs: A Survey"
  1. Geniepath:Graph neural networks with adaptive receptive paths
  2. Dual graph convolutional networks for graph-based semi-supervised classification
  3. Signed graph convolutional network
  • by yaya
  1. Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling
  2. Exploring Visual Relationship for Image Captioning

Graph-level representations: Graph Level Representation/Readout Operations ★

Order invariance A critical requirement for the graph readout operation is that it should be invariant to the order of nodes, i.e. if we change the indices of nodes and edges using a bijective function between two vertex sets, the representation of the whole graph should not change.

1. Statistics

  • by "Deep Learning on Graphs: A Survey"
  • The most basic order-invariant operations are simple statistics such as sum, average or max-pooling; a small sketch follows at the end of this subsection
  1. Convolutional networks on graphs for learning molecular fingerprints
  2. Diffusion-convolutional neural networks
  • other
  1. Molecular graph convolutions: moving beyond fingerprints
  2. Spectral networks and locally connected networks on graphs
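A tiny sketch of the order-invariant statistics just listed: sum, average and max over node embeddings give the same graph-level vector under any permutation of the node indices.

```python
import torch

def readout(H, mode="sum"):
    # H: (N, F) node embeddings -> a single (F,) graph-level vector,
    # invariant to any reordering of the N rows
    if mode == "sum":
        return H.sum(dim=0)
    if mode == "mean":
        return H.mean(dim=0)
    return H.max(dim=0).values  # max-pooling

H = torch.rand(7, 16)
perm = torch.randperm(7)
for mode in ("sum", "mean", "max"):
    assert torch.allclose(readout(H, mode), readout(H[perm], mode))
```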

2. Hierarchical Clustering

  • by "Deep Learning on Graphs: A Survey"
  1. Spectral networks and locally connected networks on graphs
  2. Deep convolutional networks on graph-structured data
  3. Hierarchical Graph Representation Learning with Differentiable Pooling [code]

3. Graph Pooling Modules

  • by "A Comprehensive Survey on Graph Neural Networks"
  1. Convolutional neural networks on graphs with fast localized spectral filtering (NeurIPS 2016)
  2. Deep convolutional networks on graph-structured data
  3. An end-to-end deep learning architecture for graph classification (AAAI 2018) [code] [pytorch code]
  4. Hierarchical graph representation learning with differentiable pooling (NeurIPS 2018) [code] (pooling step sketched below)
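For item 4, a compact sketch of the differentiable-pooling step: a learned soft assignment S maps N nodes onto K clusters, coarsening both features and adjacency. This follows the pooling equations of the paper in simplified form (single weight matrices instead of full GNNs for the embedding and assignment branches); names are illustrative.

```python
import torch

def diff_pool(A, X, W_embed, W_assign):
    S = torch.softmax(A @ X @ W_assign, dim=1)  # (N, K) soft cluster assignment
    Z = torch.relu(A @ X @ W_embed)             # (N, F_out) node embeddings
    X_coarse = S.t() @ Z                        # (K, F_out) pooled features
    A_coarse = S.t() @ A @ S                    # (K, K) pooled adjacency
    return A_coarse, X_coarse

A = torch.rand(10, 10)
X = torch.rand(10, 8)
A2, X2 = diff_pool(A, X, torch.rand(8, 16), torch.rand(8, 3))  # 10 nodes -> 3 clusters
```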

Applications of GCNs

Computer vision ★

  • Overview
  • by "Graph Neural Networks: A Review of Methods and Applications"

Scene graph generation

  • by "A Comprehensive Survey on Graph Neural Networks"
  • detect and recognize objects and predict semantic relationships between pairs of objects
  1. Scene graph generation by iterative message passing (CVPR 2017)
  2. Graph r-cnn for scene graph generation (ECCV 2018)
  3. Factorizable net: an efficient subgraph-based framework for scene graph generation (ECCV 2018)
  • generating realistic images given scene graphs
  • by "A Comprehensive Survey on Graph Neural Networks"
  1. Image generation from scene graphs (arXiv preprint, 2018)

Point clouds classification and segmentation

  • by "A Comprehensive Survey on Graph Neural Networks"
  1. Dynamic graph cnn for learning on point clouds(arXiv preprint 2018)
  2. Large-scale point cloud semantic segmentation with superpoint graphs (CVPR 2018)
  3. Rgcnn: Regularized graph cnn for point cloud segmentation (arXiv preprint 2018)

Action recognition ★

  • by "A Comprehensive Survey on Graph Neural Networks"
  • detects the locations of human joints in video clips
  1. Spatial temporal graph convolutional networks for skeleton-based action recognition (ST-GCN) (AAAI 2018) [pytorch code]
  2. Structural-rnn:Deep learning on spatio-temporal graphs (Structural-RNN) (CVPR 2016) [theano code]
  • by yaya
  1. Skeleton-Based Action Recognition with Spatial Reasoning and Temporal Stack Learning (ECCV 2018)

Image classification ★

  • by "Graph Neural Networks: A Review of Methods and Applications"
  1. Few-shot learning with graph neural networks (ICLR 2018) [code]
  2. Zero-shot recognition via semantic embeddings and knowledge graphs (CVPR 2018)
  3. Multi-label zero-shot learning with structured knowledge graphs (arXiv preprint 2017)
  4. Rethinking knowledge graph propagation for zero-shot learning(arXiv preprint 2018)
  5. The more you know: Using knowledge graphs for image classification (arXiv preprint 2016)
  • by yaya
  1. Learning to Propagate Labels: Transductive Propagation Network for Few-shot Learning [tf code]

Few-shot

  • by "A Comprehensive Survey on Graph Neural Networks"
  • image classification
  1. Few-shot learning with graph neural networks (ICLR 2018) [code]
  • 3d action recognition
  1. Neural graph matching networks for fewshot 3d action recognition (ECCV 2018)
  • by yaya
  • image classification
  1. Learning to Propagate Labels: Transductive Propagation Network for Few-shot Learning [tf code]

Zero-shot

  1. Zero-shot recognition via semantic embeddings and knowledge graphs (CVPR 2018)
  2. Multi-label zero-shot learning with structured knowledge graphs (arXiv preprint 2017)
  3. Rethinking knowledge graph propagation for zero-shot learning(arXiv preprint 2018)

Semantic segmentation

  • by "A Comprehensive Survey on Graph Neural Networks"
  1. 3d graph neural networks for rgbd semantic segmentation (CVPR 2017)
  2. Syncspeccnn: Synchronized spectral cnn for 3d shape segmentation (CVPR 2017)
  • by "Graph Neural Networks: A Review of Methods and Applications"
  1. Semantic object parsing with graph lstm (ECCV 2016)
  2. Interpretable structure-evolving lstm (CVPR 2017)
  3. Large-scale point cloud semantic segmentation with superpoint graphs(arXiv preprint 2017)
  4. Dynamic graph cnn for learning on point clouds(arXiv preprint 2018)
  5. 3d graph neural networks for rgbd semantic segmentation (CVPR 2017)

Visual question answer ★

  • by "Graph Neural Networks: A Review of Methods and Applications"
  1. A simple neural network module for relational reasoning. Adam Santoro, David Raposo, David G.T. Barrett, Mateusz Malinowski, Razvan Pascanu, Peter Battaglia, Timothy Lillicrap. NeurIPS 2017. paper
  2. Graph-Structured Representations for Visual Question Answering. Damien Teney, Lingqiao Liu, Anton van den Hengel. CVPR 2017. paper
  3. Out of the Box: Reasoning with Graph Convolution Nets for Factual Visual Question Answering. Medhini Narasimhan, Svetlana Lazebnik, Alexander Schwing. NeurIPS 2018. paper
  4. Learning Conditioned Graph Structures for Interpretable Visual Question Answering. Will Norcliffe-Brown, Efstathios Vafeias, Sarah Parisot. NeurIPS 2018. paper [code]
  5. Deep reasoning with knowledge graph for social relationship understanding.

Object detection

  • by "Graph Neural Networks: A Review of Methods and Applications"
  1. Relation networks for object detection (CVPR 2018)
  2. Learning region features for object detection (arXiv preprint 2018)

Interaction detection

  • by "Graph Neural Networks: A Review of Methods and Applications"
  1. Learning human-object interactions by graph parsing neural networks (arXiv preprint 2018)
  2. Structural-rnn:Deep learning on spatio-temporal graphs (CVPR 2016)

Region classification

  • by "Graph Neural Networks: A Review of Methods and Applications"
  1. Iterative visual reasoning beyond convolutions (arXiv preprint 2018)

Social Relationship Understanding

  1. Deep reasoning with knowledge graph for social relationship understanding

Natural language processing

  • Overview
  • by "Graph Neural Networks: A Review of Methods and Applications"
  1. Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling
  1. The GCN formula for the semantic graph differs from the form X' = AXW; it can only be read as the generic GNN form in which a node's features are updated by aggregating the features of its neighbors, and not, as the paper claims, something more "formal" than the semi-supervised GCN (or perhaps I have misread what "formally" means there)
  2. Regarding the formula: the semantic graph is built from the syntactic dependency tree that StanfordCoreNLP produces for the given sentence; the graph is constructed from this tree

Parsing the formula: each node is a word of the sentence, and the edges come from the syntactic dependency tree; an edge connects two words that stand in a syntactic dependency and also carries a label (dependency label / syntactic function) such as 'nsubj' or 'advmod'. In the paper's example, the W in the formula depends on the neighboring node (edge direction), while A depends on the label of the edge; a hedged sketch follows.
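A sketch of the update just described, in the spirit of that paper's syntactic GCN: the edge direction selects the weight matrix and the dependency label selects an additive term. The exact parameterization and all names here are illustrative, not the paper's code.

```python
import torch

def syntactic_gcn_node(in_edges, W_dir, b_label):
    # in_edges: list of (neighbor_vector, direction, dependency_label);
    # the direction picks the weight matrix, the label picks a bias term
    msgs = [h_u @ W_dir[d] + b_label[lab] for h_u, d, lab in in_edges]
    return torch.relu(torch.stack(msgs).sum(dim=0))

F = 8
W_dir = {"in": torch.rand(F, F), "out": torch.rand(F, F), "self": torch.eye(F)}
b_label = {"self": torch.zeros(F), "nsubj": torch.rand(F), "advmod": torch.rand(F)}
h_v, h_u = torch.rand(F), torch.rand(F)  # a word and, e.g., its 'nsubj' dependent
h_v_new = syntactic_gcn_node(
    [(h_v, "self", "self"), (h_u, "in", "nsubj")], W_dir, b_label)
```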

Other application

  • by "Graph Neural Networks: A Review of Methods and Applications"

by yaya: Papers I've read on applications of graphs in CV and NLP ★★

  • Action recognition
  1. Non-local Neural Networks
  2. Nonlocal Neural Networks, Nonlocal Diffusion and Nonlocal Modeling (an upgraded version of non-local; not yet read; it may not include a concrete application)
  3. Videos as Space-Time Region Graphs
  • Few-shot image classification
  1. Few-Shot Learning with Graph Neural Networks
  • Image captioning
  1. Exploring Visual Relationship for Image Captioning
  • Semantic role labeling
  1. Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling

Open problems and future directions ★

1. Go deeper?

  • by "Graph Neural Networks: A Review of Methods and Applications" & "A Comprehensive Survey on Graph Neural Networks"
    (1)当前的gnn的层数大都很浅,这是因为,随着网络层数的增加,representation of nodes将趋于平滑,换句话说,图卷积本质上是使相邻节点的表达更加接近,从而在理论上来说,在无限次卷积的情况下,所有节点的表达都将会收敛于一个稳定的点,节点特征的可区分性与信息的丰富性将会损失。在图结构数据上的网络增加层数是否是一个好的策略仍然是一个开放性的问题。[Deeper insights into graph convolutional networks for semi-supervised learning]
    (2)when tack k layers, each node will aggregate more information from neighborhoods k hops away. 若临近节点有噪声,将会随着层数的增加,噪声信息也会指数级增加。 P9 by "Graph Neural Networks: A Review of Methods and Applications"--skip connection
    受到传统deep neural networks在增加网络深度上取得的显著结果,一些研究者也尝试解决GNN中的网络层数难以加深的问题:
  • solution:
  • by "A Comprehensive Survey on Graph Neural Networks"
  1. Gated graph sequence neural networks (arXiv 2016)
  2. Deeper insights into graph convolutional networks for semi-supervised learning (arXiv preprint 2018)
  • by "Graph Neural Networks: A Review of Methods and Applications"
  1. Semi-supervised user geolocation via graph convolutional networks (ACL 2018)
  2. Representation learning on graphs with jumping knowledge networks (ICML 2018)
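A small numeric demonstration of the over-smoothing argument in (1). Learned weights and nonlinearities are stripped away to isolate the smoothing effect: repeatedly applying a row-normalized propagation step drives all node representations toward a common vector.

```python
import torch

# toy graph: a 4-node chain with self-loops, random-walk normalization
A = torch.tensor([[0., 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]])
A_tilde = A + torch.eye(4)
A_hat = A_tilde / A_tilde.sum(dim=1, keepdim=True)  # D^-1 (A + I)

H = torch.rand(4, 3)
for k in range(1, 51):
    H = A_hat @ H  # one propagation step ("layer"), no weights
    if k in (1, 5, 50):
        spread = (H - H.mean(dim=0)).norm().item()
        # spread shrinks toward 0 as depth grows: over-smoothing
        print(f"after {k:2d} layers, node-representation spread = {spread:.4f}")
```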

2. Non-structural scenarios -> generating graphs from raw data

  • by "Graph Neural Networks: A Review of Methods and Applications"
  • Although applications of graphs in non-structural scenarios (images, text) were discussed above, there is currently no optimal method for generating a graph from raw data. In the image domain, some works use a CNN to obtain feature maps and take sampled superpixels as nodes, while others extract objects as nodes. In the text domain, some works use syntactic trees as syntactic graphs, while others simply adopt fully connected graphs.
    Finding the best graph-generation approach would therefore widen the range of fields in which GNNs can contribute.

3. Dynamic graphs

  • by "Deep Learning on Graphs: A Survey" 在社交网络中,存在新的人加入,或者已存在的人退出社交网络,这样的graph是动态的,而当前提出的方法都是建立在 static graph.
    How to model the evolving characteristics of dynamic graphs and support incrementally updating model parameters largely remains open in the literature.
  • Some preliminary works try to tackle this problem using Graph RNN architectures with encouraging results
  • solution:
  1. Dynamic graph neural networks (arXiv preprint 2018)
  2. Dynamic graph convolutional networks (arXiv preprint 2017)

4. Different types of graphs

  • by "Deep Learning on Graphs: A Survey"
  • solution:
  • Heterogeneous graphs: Heterogeneous network embedding via deep architectures
  • Signed networks: Signed graph convolutional network
  • Hypergraphs: Structural deep embedding for hyper-networks (AAAI 2018)

5. Interpretability

  • by "Deep Learning on Graphs: A Survey"
  • Since graphs are often associated with other disciplines, interpreting deep learning models on graphs is critical for decision-making problems; for example, in drug- or disease-related problems, interpretability is essential for translating computational experiments into clinical use. However, because the nodes and edges of a graph are closely interconnected, interpretability for graph-based deep learning is even more challenging than for other black-box models.

6. Compositionality

  • by "Deep Learning on Graphs: A Survey"
  • Many existing methods can be composed together, for example using a GCN as a layer inside GAEs or Graph RNNs. Besides designing new building blocks, how to compose existing architectures in a principled way is also an interesting direction. A recent work, Graph Networks ("Relational inductive biases, deep learning, and graph networks"), takes a step in this direction, focusing on a general framework of GNNs and GCNs for relational reasoning problems.

7. Scalability -> Can GNNs handle large graphs?

  • by "Graph Neural Networks: A Review of Methods and Applications"
    Scaling up GNNs is difficult because many of the core steps are computationally expensive in a big-data environment: 1. graphs are not regular Euclidean spaces, and each node has its own receptive field (neighborhood structure), so it is hard to train nodes in batches; 2. computing the graph Laplacian is also hard when the graph is large.

  • by yaya: I think this claim is imprecise; as the analysis above shows, only spectral methods need to compute the graph Laplacian.

  • by "A Comprehensive Survey on Graph Neural Networks"
    When multiple GCN layers are stacked, the final state of a node is determined by the states of a large number of neighbors ((1~k)-hop neighbors), which makes backpropagation expensive. Two classes of methods, fast sampling and sub-graph training, have been proposed to improve efficiency, but they are still not scalable enough to handle deep architectures with large graphs.

  • solution:
    fast sampling

  1. Fastgcn: fast learning with graph convolutional networks via importance sampling (ICLR 2018)
  2. Stochastic training of graph convolutional networks with variance reduction (ICML 2018)
    sub-graph training
  3. Inductive representation learning on large graphs (NeurIPS 2017)
  4. Large-scale learnable graph convolutional networks (ACM 2018)
  • by yaya: I think this framing considers only deep GNNs and does not make clear whether shallow GNNs can be applied to large graphs.

  • yaya conclusion: a GCN based on the formula X'=AXW must feed the entire graph into the network and cannot batch the computation per node, so the cost is high; for networks designed around sub-graph training, the limitation may be that nodes have many neighbors, and if the network is also deep, the computation will still be heavy.

8. Receptive Field

  • by "A Comprehensive Survey on Graph Neural Networks"
  • The "receptive field" here refers to the "Accelerating by Sampling" section of "Deep Learning on Graphs: A Survey"; the goal, likewise, is to speed up training
  • A node's receptive field is the node itself plus its neighbors, but the number of neighbors varies widely, from one to thousands, following a power-law distribution. Sampling strategies have therefore been proposed; how to select a representative receptive field for a node remains to be explored
  • solution:
  1. Inductive representation learning on large graphs (NeurIPS 2017)
  2. Learning convolutional neural networks for graphs (ICML 2016)
  3. Large-scale learnable graph convolutional networks (ACM SIGKDD 2018)

Papers not yet covered ★