Graph Convolutional Network Based on Manifold Similarity Learning

Deep graph-based convolutional neural networks have been widely applied to large-scale graph data representation and semi-supervised learning. However, a typical graph convolutional network (GCN) aggregates information from neighbor nodes based on binary neighborhood similarity (the adjacency matrix). It treats all neighbors of a node equally and therefore cannot suppress the influence of dissimilar neighbor nodes. In this paper, we investigate a GCN built on a similarity matrix instead of the adjacency matrix of the graph nodes. We first adopt the Gaussian heat-kernel similarity in Euclidean space, yielding a model named EGCN. We then train a biologically inspired manifold similarity in a reproducing kernel Hilbert space (RKHS), on which a manifold GCN (named MGCN) is built for graph data representation and semi-supervised learning with four different kernel types. The proposed method is evaluated with extensive experiments on four benchmark document citation network datasets. The objective function of manifold similarity learning converges quickly on all datasets for the various kernel functions, and the method is competitive with state-of-the-art approaches in graph node recognition accuracy. In particular, MGCN (Gaussian kernel) and MGCN (polynomial kernel) outperform the typical GCN by about 3.8% on the Cora dataset, 3.5% on the Citeseer dataset, 1.3% on the Pubmed dataset, and 4% on the Cora_ML dataset, respectively. Although the proposed MGCN is relatively simple and easy to implement, it can discover local manifold structure through manifold similarity learning and suppress the influence of dissimilar neighbor nodes, which demonstrates its effectiveness.
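To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of replacing the binary adjacency matrix with a Gaussian heat-kernel similarity restricted to graph edges, followed by one symmetrically normalized GCN propagation step. Function names, the bandwidth parameter `sigma`, and the use of plain NumPy are illustrative assumptions.

```python
import numpy as np

def gaussian_similarity_matrix(X, A, sigma=1.0):
    """Replace the binary adjacency A with a Gaussian heat-kernel
    similarity, kept only on existing edges plus self-loops.
    X: (n, f) node feature matrix; A: (n, n) binary adjacency."""
    # Pairwise squared Euclidean distances between node features
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    S = np.exp(-d2 / (2.0 * sigma ** 2))
    # Mask out non-edges so dissimilar non-neighbors contribute nothing,
    # and keep self-loops as in the standard GCN formulation
    mask = A + np.eye(A.shape[0])
    return S * (mask > 0)

def gcn_layer(S, X, W):
    """One GCN propagation step with symmetric normalization:
    H = ReLU(D^{-1/2} S D^{-1/2} X W), where D is the degree
    (row-sum) matrix of the similarity matrix S."""
    d = S.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S_hat = D_inv_sqrt @ S @ D_inv_sqrt
    return np.maximum(S_hat @ X @ W, 0.0)
```

Because the kernel weights edges by feature similarity, a dissimilar neighbor receives a small aggregation weight instead of the uniform weight implied by a binary adjacency matrix.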
