TPNE: topology preserving network embedding. (English) Zbl 1453.68144

Summary: Network embedding aims at mapping nodes into a vectorial feature space while maximally preserving the topological relations of nodes in a network, so as to facilitate complex network analysis. Existing work on network embedding focuses on preserving local or global topological information within limited step sizes, which can be insufficient in many applications, even when the underlying network is undirected and unweighted. The complex and rich topological information in networks exerts a profound influence on their formation and can reveal high-order relevance among different nodes. In this paper, we propose a novel network embedding framework based on deep neural networks, named Topology Preserving Network Embedding (TPNE), which is suitable for arbitrary types of information networks: directed or undirected, weighted or unweighted. In this framework, we devise a closeness matrix to capture a more comprehensive global topology of the network, and combine global closeness reconstruction and local neighborhood preservation in a single loss function. To validate our model, we conduct extensive experiments on node classification, link prediction and node visualization tasks, using the learned embedding vectors. The results demonstrate that the proposed approach performs well and outperforms existing state-of-the-art approaches on several networks in these tasks.
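The two ingredients named in the summary, a multi-step closeness matrix and a loss combining global reconstruction with local neighborhood preservation, can be sketched as follows. This is only an illustrative approximation, not the paper's exact formulation: the decayed power-series closeness, the linear decoder `W_dec`, and the weighting parameters `beta`, `k` and `alpha` are all assumptions made here for concreteness.

```python
import numpy as np

def closeness_matrix(A, beta=0.5, k=3):
    """Hypothetical proxy for TPNE's closeness matrix: a decayed sum of
    powers of the row-normalized adjacency matrix, capturing multi-step
    topological relations rather than only one-hop neighbors."""
    P = A / np.maximum(A.sum(axis=1, keepdims=True), 1e-12)
    C = np.zeros_like(P)
    Pk = np.eye(len(A))
    for step in range(1, k + 1):
        Pk = Pk @ P                 # step-length transition probabilities
        C += (beta ** step) * Pk    # longer paths contribute less
    return C

def tpne_style_loss(C, A, Z, W_dec, alpha=1.0):
    """Combined objective (sketch): global closeness reconstruction error
    plus a Laplacian penalty that pulls embeddings of neighbors together."""
    recon = np.linalg.norm(C - Z @ W_dec) ** 2   # global term
    D = np.diag(A.sum(axis=1))
    local = np.trace(Z.T @ (D - A) @ Z)          # local neighborhood term
    return recon + alpha * local
```

Since the Laplacian D - A is positive semi-definite for an undirected network, both terms are non-negative, so minimizing the sum trades off fitting the global closeness structure against keeping neighboring nodes close in the embedding space.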

MSC:

68R10 Graph theory (including graph drawing) in computer science
05C82 Small world graphs, complex networks (graph-theoretic aspects)
68T07 Artificial neural networks and deep learning
