A comprehensive survey on deep graph representation learning methods. (English) Zbl 07781075

Summary: Graph representation learning has seen a great deal of activity in recent years. Its aim is to produce representation vectors that accurately capture the structure and characteristics of large graphs. This is crucial, since the quality of these vectors determines how well they perform in downstream tasks such as anomaly detection, link prediction, and node classification. Recently, advances in deep learning have increasingly been applied to graph-structured data. This study organizes graph-based learning settings into a taxonomy and reviews all of them, exploring the learning problem both theoretically and empirically. It briefly introduces and summarizes graph neural architecture search (G-NAS), outlines several drawbacks of graph neural networks, and suggests strategies to mitigate these challenges. Lastly, the study discusses several potential avenues for future research that remain to be explored.
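
Since the summary stays at this high level, the short sketch below is a purely illustrative example and not taken from the surveyed paper: it performs one GCN-style propagation step (symmetrically normalized adjacency times features times a weight matrix, followed by a ReLU) on a hypothetical four-node toy graph, producing the kind of node representation vectors that downstream tasks such as node classification consume. The graph, features, and weight matrix are all invented for illustration.

# Illustrative sketch only (not the survey's method): one GCN-style
# propagation step producing node representation vectors on a toy graph.
# The graph, features, and weights below are hypothetical.
import numpy as np

# Toy undirected path graph on 4 nodes: edges (0-1), (1-2), (2-3)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

X = np.eye(4)                      # one-hot node features
A_hat = A + np.eye(4)              # add self-loops
d = A_hat.sum(axis=1)              # degrees including self-loops
A_norm = np.diag(d ** -0.5) @ A_hat @ np.diag(d ** -0.5)  # symmetric normalization

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))        # untrained projection to 2-d embeddings

Z = np.maximum(A_norm @ X @ W, 0)  # one propagation step + ReLU
print(Z)                           # row i = representation vector of node i

Stacking several such propagation layers and learning W from labeled nodes yields the message-passing graph neural networks that the survey reviews.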

MSC:

68T07 Artificial neural networks and deep learning
Full Text: DOI

References:

[1] Abu-El-Haija, S., Kapoor, A., Perozzi, B., & Lee, J. (2020). N-gcn: Multi-scale graph convolution for semi-supervised node classification. Uncertainty in Artificial Intelligence, 841-851.
[2] Adhikari, B., Zhang, Y., Ramakrishnan, N., & Prakash, B. A. (2018). Sub2vec: Feature learning for subgraphs. Pacific-Asia Conference on Knowledge Discovery and Data Mining, 170-182.
[3] Agafonov, A., & Myasnikov, V. (2021). Short-term Traffic Flow Prediction in a Partially Connected Vehicle Environment. 2021 3rd International Conference on Control Systems, Mathematical Modeling, Automation and Energy Efficiency (SUMMA), 968-972.
[4] Agrawal, A., Ali, A., Boyd, S., & others. (2021). Minimum-distortion embedding. Foundations and Trends® in Machine Learning, 14(3), 211-378.
[5] Ahmad, A., Ullah, A., Feng, C., Khan, M., Ashraf, S., Adnan, M., Nazir, S., & Khan, H. U. (2020). Towards an improved energy efficient and end-to-end secure protocol for iot healthcare applications. Security and Communication Networks, 2020, 1-10.
[6] Allab, K., Labiod, L., & Nadif, M. (2016). A semi-NMF-PCA unified framework for data clustering. IEEE Transactions on Knowledge and Data Engineering, 29(1), 2-16.
[7] Alon, U., & Yahav, E. (2020). On the bottleneck of graph neural networks and its practical implications. ArXiv Preprint ArXiv:2006.05205.
[8] Ashraf, S., Ahmed, T., & Saleem, S. (2021). NRSM: Node redeployment shrewd mechanism for wireless sensor network. Iran Journal of Computer Science, 4(3), 171-183.
[9] Ashraf, S., Saleem, S., Chohan, A. H., Aslam, Z., & Raza, A. (2020). Challenging strategic trends in green supply chain management. Int. J. Res. Eng. Appl. Sci. JREAS, 5(2), 71-74.
[10] Azizian, W., & Lelarge, M. (2020). Expressive power of invariant and equivariant graph neural networks. ArXiv Preprint ArXiv:2006.15646.
[11] Bai, J., Zhu, J., Song, Y., Zhao, L., Hou, Z., Du, R., & Li, H. (2021). A3t-gcn: Attention temporal graph convolutional network for traffic forecasting. ISPRS International Journal of Geo-Information, 10(7), 485.
[12] Balcilar, M., Héroux, P., Gauzere, B., Vasseur, P., Adam, S., & Honeine, P. (2021). Breaking the limits of message passing graph neural networks. International Conference on Machine Learning, 599-608.
[13] Bastings, J., Titov, I., Aziz, W., Marcheggiani, D., & Sima’an, K. (2017). Graph convolutional encoders for syntax-aware neural machine translation. ArXiv Preprint ArXiv:1704.04675.
[14] Belkin, M., & Niyogi, P. (2001). Laplacian eigenmaps and spectral techniques for embedding and clustering. Advances in Neural Information Processing Systems, 14.
[15] Belkin, M., & Niyogi, P. (2003). Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6), 1373-1396. · Zbl 1085.68119
[16] Belkin, M., Niyogi, P., & Sindhwani, V. (2006). Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. Journal of Machine Learning Research, 7(11). · Zbl 1222.68144
[17] Berthelot, D., Carlini, N., Goodfellow, I., Papernot, N., Oliver, A., & Raffel, C. A. (2019). Mixmatch: A holistic approach to semi-supervised learning. Advances in Neural Information Processing Systems, 32.
[18] Berton, L., de Paulo Faleiros, T., Valejo, A., Valverde-Rebaza, J., & de Andrade Lopes, A. (2017). Rgcli: Robust graph that considers labeled instances for semi-supervised learning. Neurocomputing, 226, 238-248.
[19] Berton, L., & Lopes, A. D. A. (2014). Graph construction based on labeled instances for semi-supervised learning. 2014 22nd International Conference on Pattern Recognition, 2477-2482.
[20] Besta, M., Peter, E., Gerstenberger, R., Fischer, M., Podstawski, M., Barthels, C., Alonso, G., & Hoefler, T. (2019). Demystifying graph databases: Analysis and taxonomy of data organization, system designs, and graph queries. ArXiv Preprint ArXiv:1910.09017.
[21] Bing, H., Zhifeng, X., Yangjie, X., Jinxing, H., & Zhanwu, M. (2020). Integrating semantic zoning information with the prediction of road link speed based on taxi GPS data. Complexity, 2020.
[22] Bogaerts, T., Masegosa, A. D., Angarita-Zapata, J. S., Onieva, E., & Hellinckx, P. (2020). A graph CNN-LSTM neural network for short and long-term traffic forecasting based on trajectory data. Transportation Research Part C: Emerging Technologies, 112, 62-77.
[23] Bojchevski, A., & Günnemann, S. (2017). Deep gaussian embedding of graphs: Unsupervised inductive learning via ranking. ArXiv Preprint ArXiv:1707.03815.
[24] Bojchevski, A., Klicpera, J., Perozzi, B., Kapoor, A., Blais, M., Rózemberczki, B., Lukasik, M., & Günnemann, S. (2020). Scaling graph neural networks with approximate pagerank. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2464-2473.
[25] Bordes, A., Usunier, N., Garcia-Duran, A., Weston, J., & Yakhnenko, O. (2013). Translating embeddings for modeling multi-relational data. Advances in Neural Information Processing Systems, 26.
[26] Borgwardt, K. M., & Kriegel, H.-P. (2005). Shortest-path kernels on graphs. Fifth IEEE International Conference on Data Mining (ICDM’05), 8 pp.
[27] Brody, S., Alon, U., & Yahav, E. (2021). How attentive are graph attention networks? ArXiv Preprint ArXiv:2105.14491.
[28] Cai, L., Janowicz, K., Mai, G., Yan, B., & Zhu, R. (2020). Traffic transformer: Capturing the continuity and periodicity of time series for traffic forecasting. Transactions in GIS, 24(3), 736-755.
[29] Cai, S., Li, L., Deng, J., Zhang, B., Zha, Z.-J., Su, L., & Huang, Q. (2021). Rethinking graph neural architecture search from message-passing. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 6657-6666.
[30] Cao, J., Lin, X., Guo, S., Liu, L., Liu, T., & Wang, B. (2021). Bipartite graph embedding via mutual information maximization. Proceedings of the 14th ACM International Conference on Web Search and Data Mining, 635-643.
[31] Cao, S., Lu, W., & Xu, Q. (2015). Grarep: Learning graph representations with global structural information. Proceedings of the 24th ACM International on Conference on Information and Knowledge Management, 891-900.
[32] Cao, S., Lu, W., & Xu, Q. (2016). Deep neural networks for learning graph representations. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1).
[33] Caron, M., Bojanowski, P., Joulin, A., & Douze, M. (2018). Deep clustering for unsupervised learning of visual features. Proceedings of the European Conference on Computer Vision (ECCV), 132-149.
[34] Chamberlain, B., Rowbottom, J., Gorinova, M. I., Bronstein, M., Webb, S., & Rossi, E. (2021). Grand: Graph neural diffusion. International Conference on Machine Learning, 1407-1418.
[35] Chami, I., Abu-El-Haija, S., Perozzi, B., Ré, C., & Murphy, K. (2022). Machine learning on graphs: A model and comprehensive taxonomy. The Journal of Machine Learning Research, 23(1), 3840-3903.
[36] Che, F., Yang, G., Zhang, D., Tao, J., & Liu, T. (2021). Self-supervised graph representation learning via bootstrapping. Neurocomputing, 456, 88-96.
[37] Chen, C., Tao, Y., & Lin, H. (2019). Dynamic network embeddings for network evolution analysis. ArXiv Preprint ArXiv:1906.09860.
[38] Chen, C., Wu, Y., Dai, Q., Zhou, H.-Y., Xu, M., Yang, S., Han, X., & Yu, Y. (2022). A survey on graph neural networks and graph transformers in computer vision: a task-oriented perspective. ArXiv Preprint ArXiv:2209.13232.
[39] Chen, F., Wang, Y.-C., Wang, B., & Kuo, C.-C. J. (2020). Graph representation learning: a survey. APSIPA Transactions on Signal and Information Processing, 9, e15.
[40] Chen, H., Perozzi, B., Hu, Y., & Skiena, S. (2018). Harp: Hierarchical representation learning for networks. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1).
[41] Chen, J., Liao, S., Hou, J., Wang, K., & Wen, J. (2020). GST-GCN: A Geographic-Semantic-Temporal Graph Convolutional Network for Context-aware Traffic Flow Prediction on Graph Sequences. 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 1604-1609.
[42] Chen, J., Ma, T., & Xiao, C. (2018). Fastgcn: fast learning with graph convolutional networks via importance sampling. ArXiv Preprint ArXiv:1801.10247.
[43] Chen, L.-Z., Lin, Z., Wang, Z., Yang, Y.-L., & Cheng, M.-M. (2021). Spatial information guided convolution for real-time rgbd semantic segmentation. IEEE Transactions on Image Processing, 30, 2313-2324.
[44] Chen, M., Wei, Z., Ding, B., Li, Y., Yuan, Y., Du, X., & Wen, J.-R. (2020). Scalable graph neural networks via bidirectional propagation. Advances in Neural Information Processing Systems, 33, 14556-14566.
[45] Chen, M., Wei, Z., Huang, Z., Ding, B., & Li, Y. (2020). Simple and deep graph convolutional networks. International Conference on Machine Learning, 1725-1735.
[46] Chen, T., Zhou, K., Duan, K., Zheng, W., Wang, P., Hu, X., & Wang, Z. (2022). Bag of tricks for training deeper graph neural networks: A comprehensive benchmark study. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(3), 2769-2781.
[47] Chen, X., Zhang, Y., Du, L., Fang, Z., Ren, Y., Bian, K., & Xie, K. (2020). Tssrgcn: Temporal spectral spatial retrieval graph convolutional network for traffic flow forecasting. 2020 IEEE International Conference on Data Mining (ICDM), 954-959.
[48] Chen, Z., Villar, S., Chen, L., & Bruna, J. (2019). On the equivalence between graph isomorphism testing and function approximation with gnns. Advances in Neural Information Processing Systems, 32.
[49] Chiang, W.-L., Liu, X., Si, S., Li, Y., Bengio, S., & Hsieh, C.-J. (2019). Cluster-gcn: An efficient algorithm for training deep and large graph convolutional networks. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 257-266.
[50] Chien, E., Peng, J., Li, P., & Milenkovic, O. (2020). Adaptive universal generalized pagerank graph neural network. ArXiv Preprint ArXiv:2006.07988.
[51] Choudhary, N., Rao, N., Katariya, S., Subbian, K., & Reddy, C. K. (2021). Self-supervised hyperboloid representations from logical queries over knowledge graphs. Proceedings of the Web Conference 2021, 1373-1384.
[52] Cui, G., Zhou, J., Yang, C., & Liu, Z. (2020). Adaptive graph encoder for attributed graph embedding. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 976-985.
[53] Cui, S., Yu, B., Liu, T., Zhang, Z., Wang, X., & Shi, J. (2020). Edge-enhanced graph convolution networks for event detection with syntactic relation. ArXiv Preprint ArXiv:2002.10757.
[54] Dash, T., Chitlangia, S., Ahuja, A., & Srinivasan, A. (2022). A review of some techniques for inclusion of domain-knowledge into deep neural networks. Scientific Reports, 12(1), 1-15.
[55] Dasoulas, G., Santos, L. Dos, Scaman, K., & Virmaux, A. (2019). Coloring graph neural networks for node disambiguation. ArXiv Preprint ArXiv:1912.06058.
[56] Daud, N. N., Ab Hamid, S. H., Saadoon, M., Sahran, F., & Anuar, N. B. (2020). Applications of link prediction in social networks: A review. Journal of Network and Computer Applications, 166, 102716.
[57] Defferrard, M., Bresson, X., & Vandergheynst, P. (2016). Convolutional neural networks on graphs with fast localized spectral filtering. Advances in Neural Information Processing Systems, 29.
[58] Deng, W., Zhang, B., Zou, W., Zhang, X., Cheng, X., Guan, L., Lin, Y., Lao, G., Ye, B., Li, X., & others. (2019). Abnormal degree centrality associated with cognitive dysfunctions in early bipolar disorder. Frontiers in Psychiatry, 10, 140.
[59] Dhillon, P. S., Talukdar, P., & Crammer, K. (2010). Learning better data representation using inference-driven metric learning. Proceedings of the Acl 2010 Conference Short Papers, 377-381.
[60] Dhingra, N., Ritter, F., & Kunz, A. (2021). BGT-Net: Bidirectional GRU transformer network for scene graph generation. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2150-2159.
[61] Ding, M., Kong, K., Li, J., Zhu, C., Dickerson, J., Huang, F., & Goldstein, T. (2021). VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization. Advances in Neural Information Processing Systems, 34, 6733-6746.
[62] Ding, Y., Yao, Q., Zhao, H., & Zhang, T. (2021). Diffmg: Differentiable meta graph search for heterogeneous graph neural networks. Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 279-288.
[63] Dong, Y., Chawla, N. V, & Swami, A. (2017). metapath2vec: Scalable representation learning for heterogeneous networks. Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 135-144.
[64] Dong, Y., Ding, K., Jalaian, B., Ji, S., & Li, J. (2021). AdaGNN: Graph Neural Networks with Adaptive Frequency Response Filter. Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 392-401.
[65] Du, L., Wang, Y., Song, G., Lu, Z., & Wang, J. (2018). Dynamic Network Embedding: An Extended Approach for Skip-gram based Network Embedding. IJCAI, 2018, 2086-2092.
[66] Eliasof, M., Haber, E., & Treister, E. (2021). Pde-gcn: Novel architectures for graph neural networks motivated by partial differential equations. Advances in Neural Information Processing Systems, 34, 3836-3849.
[67] Fang, S., Pan, X., Xiang, S., & Pan, C. (2020). Meta-msnet: Meta-learning based multi-source data fusion for traffic flow prediction. IEEE Signal Processing Letters, 28, 6-10.
[68] Fang, X., Huang, J., Wang, F., Zeng, L., Liang, H., & Wang, H. (2020). Constgat: Contextual spatial-temporal graph attention network for travel time estimation at baidu maps. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2697-2705.
[69] Fey, M., Lenssen, J. E., Weichert, F., & Leskovec, J. (2021). Gnnautoscale: Scalable and expressive graph neural networks via historical embeddings. International Conference on Machine Learning, 3294-3304.
[70] Fornito, A., Zalesky, A., & Breakspear, M. (2013). Graph analysis of the human connectome: promise, progress, and pitfalls. Neuroimage, 80, 426-444.
[71] Fouss, F., Pirotte, A., Renders, J.-M., & Saerens, M. (2007). Random-walk computation of similarities between nodes of a graph with application to collaborative recommendation. IEEE Transactions on Knowledge and Data Engineering, 19(3), 355-369.
[72] Gao, H., Chen, Y., & Ji, S. (2019). Learning graph pooling and hybrid convolutional operations for text representations. The World Wide Web Conference, 2743-2749.
[73] Gao, Y., Yang, H., Zhang, P., Zhou, C., & Hu, Y. (2019). Graphnas: Graph neural architecture search with reinforcement learning. ArXiv Preprint ArXiv:1904.09981.
[74] Gao, Y., Zhang, P., Yang, H., Zhou, C., Tian, Z., Hu, Y., Li, Z., & Zhou, J. (2022). GraphNAS++: Distributed Architecture Search for Graph Neural Networks. IEEE Transactions on Knowledge and Data Engineering.
[75] Ge, L., Li, S., Wang, Y., Chang, F., & Wu, K. (2020). Global spatial-temporal graph convolutional network for urban traffic speed prediction. Applied Sciences, 10(4), 1509.
[76] Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O., & Dahl, G. E. (2017). Neural message passing for quantum chemistry. International Conference on Machine Learning, 1263-1272.
[77] Gomez, A. N., Ren, M., Urtasun, R., & Grosse, R. B. (2017). The reversible residual network: Backpropagation without storing activations. Advances in Neural Information Processing Systems, 30.
[78] Gong, C., Tao, D., Yang, J., & Fu, K. (2014). Signed laplacian embedding for supervised dimension reduction. Proceedings of the AAAI Conference on Artificial Intelligence, 28(1).
[79] Gou, J., Yang, Y., Yi, Z., Lv, J., Mao, Q., & Zhan, Y. (2020). Discriminative globality and locality preserving graph embedding for dimensionality reduction. Expert Systems with Applications, 144, 113079.
[80] Goyal, P., & Ferrara, E. (2018). Graph embedding techniques, applications, and performance: A survey. Knowledge-Based Systems, 151, 78-94.
[81] Grover, A., & Leskovec, J. (2016). node2vec: Scalable feature learning for networks. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 855-864.
[82] Guan, R., Liu, Y., Feng, X., & Li, X. (2021). VPALG: Paper-publication Prediction with Graph Neural Networks. Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 617-626.
[83] Guo, Z., Zhang, X., Mu, H., Heng, W., Liu, Z., Wei, Y., & Sun, J. (2020). Single path one-shot neural architecture search with uniform sampling. European Conference on Computer Vision, 544-560.
[84] Gurwitz, D. (2020). Repurposing current therapeutics for treating COVID-19: A vital role of prescription records data mining. Drug Development Research, 81(7), 777-781.
[85] Hamilton, W. L., Ying, R., & Leskovec, J. (2017). Representation learning on graphs: Methods and applications. ArXiv Preprint ArXiv:1709.05584.
[86] Hamilton, W., Ying, Z., & Leskovec, J. (2017). Inductive representation learning on large graphs. Advances in Neural Information Processing Systems, 30.
[87] Hasanzadeh, A., Hajiramezanali, E., Boluki, S., Zhou, M., Duffield, N., Narayanan, K., & Qian, X. (2020). Bayesian graph neural networks with adaptive connection sampling. International Conference on Machine Learning, 4094-4104.
[88] Hassani, K., & Khasahmadi, A. H. (2020). Contrastive multi-view representation learning on graphs. International Conference on Machine Learning, 4116-4126.
[89] He, D., Guo, R., Wang, X., Jin, D., Huang, Y., & Wang, W. (2022). Inflation Improves Graph Neural Networks. Proceedings of the ACM Web Conference 2022, 1466-1474.
[90] He, H., Ye, K., & Xu, C.-Z. (2021). Multi-feature Urban Traffic Prediction Based on Unconstrained Graph Attention Network. 2021 IEEE International Conference on Big Data (Big Data), 1409-1417.
[91] He, K., Fan, H., Wu, Y., Xie, S., & Girshick, R. (2020). Momentum contrast for unsupervised visual representation learning. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 9729-9738.
[92] Hjelm, R. D., Fedorov, A., Lavoie-Marchildon, S., Grewal, K., Bachman, P., Trischler, A., & Bengio, Y. (2018). Learning deep representations by mutual information estimation and maximization. ArXiv Preprint ArXiv:1808.06670.
[93] Hofman, J. M., Watts, D. J., Athey, S., Garip, F., Griffiths, T. L., Kleinberg, J., Margetts, H., Mullainathan, S., Salganik, M. J., Vazire, S., & others. (2021). Integrating explanation and prediction in computational social science. Nature, 595(7866), 181-188.
[94] Hong, D., Gao, L., Yao, J., Zhang, B., Plaza, A., & Chanussot, J. (2020). Graph convolutional networks for hyperspectral image classification. IEEE Transactions on Geoscience and Remote Sensing, 59(7), 5966-5978.
[95] Hu, C., Cheng, L., Sepulcre, J., Johnson, K. A., Fakhri, G. E., Lu, Y. M., & Li, Q. (2015). A spectral graph regression model for learning brain connectivity of Alzheimer’s disease. PloS One, 10(5), e0128136.
[96] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., & Leskovec, J. (2019). Strategies for pre-training graph neural networks. ArXiv Preprint ArXiv:1905.12265.
[97] Hu, Z., Dong, Y., Wang, K., Chang, K.-W., & Sun, Y. (2020). Gpt-gnn: Generative pre-training of graph neural networks. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 1857-1867.
[98] Hu, Z., Fan, C., Chen, T., Chang, K.-W., & Sun, Y. (2019). Pre-training graph neural networks for generic structural feature extraction. ArXiv Preprint ArXiv:1905.13728.
[99] Huang, C., Xu, H., Xu, Y., Dai, P., Xia, L., Lu, M., Bo, L., Xing, H., Lai, X., & Ye, Y. (2021). Knowledge-aware coupled graph neural network for social recommendation. Proceedings of the AAAI Conference on Artificial Intelligence, 35(5), 4115-4122.
[100] Huang, Q., Yamada, M., Tian, Y., Singh, D., & Chang, Y. (2022). Graphlime: Local interpretable model explanations for graph neural networks. IEEE Transactions on Knowledge and Data Engineering.
[101] Huang, Z., Wang, Y., Li, C., & He, H. (2022). Going Deeper into Permutation-Sensitive Graph Neural Networks. ArXiv Preprint ArXiv:2205.14368.
[102] Huang, Z., Zhang, S., Xi, C., Liu, T., & Zhou, M. (2021). Scaling up graph neural networks via graph coarsening. Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 675-684.
[103] Hurle, M. R., Yang, L., Xie, Q., Rajpal, D. K., Sanseau, P., & Agarwal, P. (2013). Computational drug repositioning: from data to therapeutics. Clinical Pharmacology & Therapeutics, 93(4), 335-341.
[104] Hussain, M. S., Zaki, M. J., & Subramanian, D. (2022). Global self-attention as a replacement for graph convolution. Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 655-665.
[105] Inuwa-Dutse, I., Liptrott, M., & Korkontzelos, I. (2021). A multilevel clustering technique for community detection. Neurocomputing, 441, 64-78.
[106] James, J. Q. (2021). Citywide Estimation of Travel Time Distributions with Bayesian Deep Graph Learning. IEEE Transactions on Knowledge and Data Engineering.
[107] Jiang, W., & Luo, J. (2022). Graph neural network for traffic forecasting: A survey. Expert Systems with Applications, 117921.
[108] Jiao, Y., Xiong, Y., Zhang, J., Zhang, Y., Zhang, T., & Zhu, Y. (2020). Sub-graph contrast for scalable self-supervised graph representation learning. 2020 IEEE International Conference on Data Mining (ICDM), 222-231.
[109] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., & Tang, J. (2020). Self-supervised learning on graphs: Deep insights and new direction. ArXiv Preprint ArXiv:2006.10141.
[110] Jin, W., Liu, X., Zhao, X., Ma, Y., Shah, N., & Tang, J. (2021). Automated self-supervised learning for graphs. ArXiv Preprint ArXiv:2106.05470.
[111] Kermarrec, A.-M., Leroy, V., & Trédan, G. (2011). Distributed social graph embedding. Proceedings of the 20th ACM International Conference on Information and Knowledge Management, 1209-1214.
[112] Khoshraftar, S., & An, A. (2022). A survey on graph representation learning methods. ArXiv Preprint ArXiv:2204.01855.
[113] Kim, D., & Oh, A. (2022). How to find your friendly neighborhood: Graph attention design with self-supervision. ArXiv Preprint ArXiv:2204.04879.
[114] Kipf, T. N., & Welling, M. (2016a). Semi-supervised classification with graph convolutional networks. ArXiv Preprint ArXiv:1609.02907.
[115] Kipf, T. N., & Welling, M. (2016b). Variational graph auto-encoders. ArXiv Preprint ArXiv:1611.07308.
[116] Klein, A., Falkner, S., Springenberg, J. T., & Hutter, F. (2016). Learning curve prediction with Bayesian neural networks.
[117] Klicpera, J., Bojchevski, A., & Günnemann, S. (2018). Predict then propagate: Combining neural networks with personalized pagerank for classification on graphs. International Conference on Learning Representations.
[118] Kondor, R., Shervashidze, N., & Borgwardt, K. M. (2009). The graphlet spectrum. Proceedings of the 26th Annual International Conference on Machine Learning, 529-536.
[119] Kriege, N. M., Johansson, F. D., & Morris, C. (2020). A survey on graph kernels. Applied Network Science, 5(1), 6.
[120] Lai, K.-H., Zha, D., Zhou, K., & Hu, X. (2020). Policy-gnn: Aggregation optimization for graph neural networks. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 461-471.
[121] Lee, S. Y., Bu, F., Yoo, J., & Shin, K. (2023). Towards Deep Attention in Graph Neural Networks: Problems and Remedies. ArXiv Preprint ArXiv:2306.02376.
[122] Leow, Y. Y., Laurent, T., & Bresson, X. (2019). GraphTSNE: a visualization technique for graph-structured data. ArXiv Preprint ArXiv:1904.06915.
[123] Levie, R., Monti, F., Bresson, X., & Bronstein, M. M. (2018). Cayleynets: Graph convolutional neural networks with complex rational spectral filters. IEEE Transactions on Signal Processing, 67(1), 97-109. · Zbl 1415.68169
[124] Li, B., & Pi, D. (2020). Network representation learning: a systematic literature review. Neural Computing and Applications, 32(21), 16647-16679.
[125] Li, C., Yan, Y., Fu, J., Zhao, Z., & Zeng, Q. (2023). HetReGAT-FC: Heterogeneous Residual Graph Attention Network via Feature Completion. Information Sciences, 632, 424-438.
[126] Li, F., Feng, J., Yan, H., Jin, G., Yang, F., Sun, F., Jin, D., & Li, Y. (2021). Dynamic graph convolutional recurrent network for traffic prediction: Benchmark and solution. ACM Transactions on Knowledge Discovery from Data (TKDD).
[127] Li, G., Müller, M., Ghanem, B., & Koltun, V. (2021). Training graph neural networks with 1000 layers. International Conference on Machine Learning, 6437-6449.
[128] Li, I., Yan, V., Li, T., Qu, R., & Radev, D. (2021). Unsupervised cross-domain prerequisite chain learning using variational graph autoencoders. ArXiv Preprint ArXiv:2105.03505.
[129] Li, J., Rong, Y., Cheng, H., Meng, H., Huang, W., & Huang, J. (2019). Semi-supervised graph classification: A hierarchical graph perspective. The World Wide Web Conference, 972-982.
[130] Li, L., Gan, Z., Cheng, Y., & Liu, J. (2019). Relation-aware graph attention network for visual question answering. Proceedings of the IEEE/CVF International Conference on Computer Vision, 10313-10322.
[131] Li, P., Wang, Y., Wang, H., & Leskovec, J. (2020). Distance encoding: Design provably more powerful neural networks for graph representation learning. Advances in Neural Information Processing Systems, 33, 4465-4478.
[132] Li, Q., Han, Z., & Wu, X.-M. (2018). Deeper insights into graph convolutional networks for semi-supervised learning. Thirty-Second AAAI Conference on Artificial Intelligence.
[133] Li, S., Xu, F., Wang, R., & Zhong, S. (2021). Self-supervised incremental deep graph learning for ethereum phishing scam detection. ArXiv Preprint ArXiv:2106.10176.
[134] Li, Y., & King, I. (2020). Autograph: Automated graph neural network. International Conference on Neural Information Processing, 189-201.
[135] Li, Y., Wen, Z., Wang, Y., & Xu, C. (2021). One-shot graph neural architecture search with dynamic search space. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 8510-8517.
[136] Li, Y., Yu, R., Shahabi, C., & Liu, Y. (2017). Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. ArXiv Preprint ArXiv:1707.01926.
[137] Lian, D., Zhu, Z., Zheng, K., Ge, Y., Xie, X., & Chen, E. (2022). Network Representation Lightening from Hashing to Quantization. IEEE Transactions on Knowledge and Data Engineering, 35(5), 5119-5131.
[138] Lin, J., Cai, Q., & Lin, M. (2021). Multi-label classification of fundus images with graph convolutional network and self-supervised learning. IEEE Signal Processing Letters, 28, 454-458.
[139] Lin, Q., Zhu, F.-Y., Shu, Y.-Q., Zhu, P.-W., Ye, L., Shi, W.-Q., Min, Y.-L., Li, B., Yuan, Q., & Shao, Y. (2021). Altered brain network centrality in middle-aged patients with retinitis pigmentosa: A resting-state functional magnetic resonance imaging study. Brain and Behavior, 11(2), e01983.
[140] Liu, H. X., Simonyan, K., & Yang, Y. M. (2019). DARTS: Differentiable architecture search. 7th International Conference on Learning Representations, New Orleans, LA, USA, 9055-9067.
[141] Liu, H., Yang, Y., & Wang, X. (2021). Overcoming catastrophic forgetting in graph neural networks. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 8653-8661.
[142] Liu, M., Gao, H., & Ji, S. (2020). Towards deeper graph neural networks. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 338-348.
[143] Liu, R., Hirn, M., & Krishnan, A. (2023). Accurately modeling biased random walks on weighted networks using node2vec+. Bioinformatics, 39(1), btad047.
[144] Liu, R., Nejati, H., & Cheung, N.-M. (2018). Joint estimation of low-rank components and connectivity graph in high-dimensional graph signals: application to brain imaging. ArXiv Preprint ArXiv:1801.02303.
[145] Liu, X., Luo, Z., & Huang, H. (2018). Jointly multiple events extraction via attention-based graph information aggregation. ArXiv Preprint ArXiv:1809.09078.
[146] Liu, X., Yan, M., Deng, L., Li, G., Ye, X., & Fan, D. (2021). Sampling methods for efficient training of graph convolutional networks: A survey. IEEE/CAA Journal of Automatica Sinica, 9(2), 205-234.
[147] Liu, X., Zhang, F., Hou, Z., Mian, L., Wang, Z., Zhang, J., & Tang, J. (2021). Self-supervised learning: Generative or contrastive. IEEE Transactions on Knowledge and Data Engineering.
[148] Liu, Y., Jin, M., Pan, S., Zhou, C., Zheng, Y., Xia, F., & Yu, P. (2022). Graph self-supervised learning: A survey. IEEE Transactions on Knowledge and Data Engineering.
[149] Liu, Z., Chen, C., Li, L., Zhou, J., Li, X., Song, L., & Qi, Y. (2019). Geniepath: Graph neural networks with adaptive receptive paths. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4424-4431.
[150] Liu, Z., Wang, Y., Bernard, J., & Munzner, T. (2022). Visualizing graph neural networks with corgie: Corresponding a graph to its embedding. IEEE Transactions on Visualization and Computer Graphics, 28(6), 2500-2516.
[151] Luo, R., Liao, W., Huang, X., Pi, Y., & Philips, W. (2016). Feature extraction of hyperspectral images with semisupervised graph learning. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 9(9), 4389-4399.
[152] Ma, L., Rabbany, R., & Romero-Soriano, A. (2021). Graph attention networks with positional embeddings. Pacific-Asia Conference on Knowledge Discovery and Data Mining, 514-527.
[153] Manessi, F., & Rozza, A. (2021). Graph-based neural network models with multiple self-supervised auxiliary tasks. Pattern Recognition Letters, 148, 15-21.
[154] Mao, K., Zhu, J., Xiao, X., Lu, B., Wang, Z., & He, X. (2021). UltraGCN: ultra simplification of graph convolutional networks for recommendation. Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 1253-1262.
[155] Marcheggiani, D., Bastings, J., & Titov, I. (2018). Exploiting semantics in neural machine translation with graph convolutional networks. ArXiv Preprint ArXiv:1804.08313.
[156] Marcheggiani, D., & Titov, I. (2017). Encoding sentences with graph convolutional networks for semantic role labeling. ArXiv Preprint ArXiv:1703.04826.
[157] Maron, H., Ben-Hamu, H., Serviansky, H., & Lipman, Y. (2019). Provably powerful graph networks. Advances in Neural Information Processing Systems, 32.
[158] McPherson, M., Smith-Lovin, L., & Cook, J. M. (2001). Birds of a feather: Homophily in social networks. Annual Review of Sociology, 415-444.
[159] Micheli, A. (2009). Neural network for graphs: A contextual constructive approach. IEEE Transactions on Neural Networks, 20(3), 498-511.
[160] Mikolov, T., Sutskever, I., Chen, K., Corrado, G. S., & Dean, J. (2013). Distributed representations of words and phrases and their compositionality. Advances in Neural Information Processing Systems, 26.
[161] Mitrovic, S., & De Weerdt, J. (2019). Dyn2Vec: Exploiting dynamic behaviour using difference networks-based node embeddings for classification. Proceedings of the International Conference on Data Science, 194-200.
[162] Mokou, M., Lygirou, V., Angelioudaki, I., Paschalidis, N., Stroggilos, R., Frantzi, M., Latosinska, A., Bamias, A., Hoffmann, M. J., Mischak, H., & others. (2020). A novel pipeline for drug repurposing for bladder cancer based on patients’ omics signatures. Cancers, 12(12), 3519.
[163] Monti, F., Boscaini, D., Masci, J., Rodola, E., Svoboda, J., & Bronstein, M. M. (2017). Geometric deep learning on graphs and manifolds using mixture model cnns. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 5115-5124.
[164] Morris, C., Ritzert, M., Fey, M., Hamilton, W. L., Lenssen, J. E., Rattan, G., & Grohe, M. (2019). Weisfeiler and leman go neural: Higher-order graph neural networks. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4602-4609.
[165] Murphy, R., Srinivasan, B., Rao, V., & Ribeiro, B. (2019). Relational pooling for graph representations. International Conference on Machine Learning, 4663-4673.
[166] Newman, M. E. J. (2005). A measure of betweenness centrality based on random walks. Social Networks, 27(1), 39-54.
[167] Nguyen, T., & Grishman, R. (2018). Graph convolutional networks with argument-aware pooling for event detection. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1).
[168] Nickel, M., Tresp, V., & Kriegel, H.-P. (2011). A three-way model for collective learning on multi-relational data. International Conference on Machine Learning (ICML).
[169] Niepert, M., Ahmed, M., & Kutzkov, K. (2016). Learning convolutional neural networks for graphs. International Conference on Machine Learning, 2014-2023.
[170] Nikolentzos, G., Siglidis, G., & Vazirgiannis, M. (2021). Graph kernels: A survey. Journal of Artificial Intelligence Research, 72, 943-1027. · Zbl 1522.68477
[171] Noy, A., Nayman, N., Ridnik, T., Zamir, N., Doveh, S., Friedman, I., Giryes, R., & Zelnik, L. (2020). Asap: Architecture search, anneal and prune. International Conference on Artificial Intelligence and Statistics, 493-503.
[172] Oellermann, O. R., & Schwenk, A. J. (1991). The Laplacian spectrum of graphs. Graph Theory, Combinatorics, and Applications, 2, 871-898. · Zbl 0840.05059
[173] Okuda, M., Satoh, S., Sato, Y., & Kidawara, Y. (2021). Unsupervised common particular object discovery and localization by analyzing a match graph. ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 1540-1544.
[174] Oloulade, B. M., Gao, J., Chen, J., Lyu, T., & Al-Sabri, R. (2021). Graph neural architecture search: A survey. Tsinghua Science and Technology, 27(4), 692-708.
[175] Oono, K., & Suzuki, T. (2019). Graph neural networks exponentially lose expressive power for node classification. ArXiv Preprint ArXiv:1905.10947.
[176] Oord, A. van den, Li, Y., & Vinyals, O. (2018). Representation learning with contrastive predictive coding. ArXiv Preprint ArXiv:1807.03748.
[177] Opolka, F. L., Solomon, A., Cangea, C., Veličković, P., Liò, P., & Hjelm, R. D. (2019). Spatio-temporal deep graph infomax. ArXiv Preprint ArXiv:1904.06316.
[178] Ou, M., Cui, P., Pei, J., Zhang, Z., & Zhu, W. (2016). Asymmetric transitivity preserving graph embedding. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 1105-1114.
[179] Ozaki, K., Shimbo, M., Komachi, M., & Matsumoto, Y. (2011). Using the mutual k-nearest neighbor graphs for semi-supervised classification on natural language data. Proceedings of the Fifteenth Conference on Computational Natural Language Learning, 154-162.
[180] Page, L., Brin, S., Motwani, R., & Winograd, T. (1999). The PageRank citation ranking: Bringing order to the web.
[181] Pan, S., Hu, R., Fung, S., Long, G., Jiang, J., & Zhang, C. (2019). Learning graph embedding with adversarial training methods. IEEE Transactions on Cybernetics, 50(6), 2475-2487.
[182] Pan, S., Hu, R., Long, G., Jiang, J., Yao, L., & Zhang, C. (2018). Adversarially regularized graph autoencoder for graph embedding. ArXiv Preprint ArXiv:1802.04407.
[183] Pang, Y., Wu, L., Shen, Q., Zhang, Y., Wei, Z., Xu, F., Chang, E., Long, B., & Pei, J. (2022). Heterogeneous global graph neural networks for personalized session-based recommendation. Proceedings of the Fifteenth ACM International Conference on Web Search and Data Mining, 775-783.
[184] Papp, P. A., Martinkus, K., Faber, L., & Wattenhofer, R. (2021). Dropgnn: random dropouts increase the expressiveness of graph neural networks. Advances in Neural Information Processing Systems, 34, 21997-22009.
[185] Park, C., Kim, D., Han, J., & Yu, H. (2020). Unsupervised attributed multiplex network embedding. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 5371-5378.
[186] Park, J., Lee, M., Chang, H. J., Lee, K., & Choi, J. Y. (2019). Symmetric graph convolutional autoencoder for unsupervised graph representation learning. Proceedings of the IEEE/CVF International Conference on Computer Vision, 6519-6528.
[187] Peng, H., Li, J., He, Y., Liu, Y., Bao, M., Wang, L., Song, Y., & Yang, Q. (2018). Large-scale hierarchical text classification with recursively regularized deep graph-cnn. Proceedings of the 2018 World Wide Web Conference, 1063-1072.
[188] Peng, H., Wang, H., Du, B., Bhuiyan, M. Z. A., Ma, H., Liu, J., Wang, L., Yang, Z., Du, L., Wang, S., & others. (2020). Spatial temporal incidence dynamic graph neural networks for traffic flow forecasting. Information Sciences, 521, 277-290.
[189] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., & Zheng, Q. (2020). Self-supervised graph representation learning via global context prediction. ArXiv Preprint ArXiv:2003.01604.
[190] Peng, Z., Huang, W., Luo, M., Zheng, Q., Rong, Y., Xu, T., & Huang, J. (2020). Graph representation learning via graphical mutual information maximization. Proceedings of The Web Conference 2020, 259-270.
[191] Perozzi, B., Al-Rfou, R., & Skiena, S. (2014). Deepwalk: Online learning of social representations. Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 701-710.
[192] Perozzi, B., Kulkarni, V., Chen, H., & Skiena, S. (2017). Don’t walk, skip! online learning of multi-scale network embeddings. Proceedings of the 2017 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining 2017, 258-265.
[193] Pham, H., Guan, M., Zoph, B., Le, Q., & Dean, J. (2018). Efficient neural architecture search via parameters sharing. International Conference on Machine Learning, 4095-4104.
[194] Prakash, V. J., & Nithya, D. L. M. (2014). A survey on semi-supervised learning techniques. ArXiv Preprint ArXiv:1402.4645.
[195] Pushpakom, S., Iorio, F., Eyers, P. A., Escott, K. J., Hopper, S., Wells, A., Doig, A., Guilliams, T., Latimer, J., McNamee, C., & others. (2019). Drug repurposing: progress, challenges and recommendations. Nature Reviews Drug Discovery, 18(1), 41-58.
[196] Qian, Y., Santus, E., Jin, Z., Guo, J., & Barzilay, R. (2018). GraphIE: A graph-based framework for information extraction. ArXiv Preprint ArXiv:1810.13083.
[197] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., & Tang, J. (2020). Gcc: Graph contrastive coding for graph neural network pre-training. Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 1150-1160.
[198] Rahman, M., Saha, T. K., Hasan, M. Al, Xu, K. S., & Reddy, C. K. (2018). Dylink2vec: Effective feature representation for link prediction in dynamic networks. ArXiv Preprint ArXiv:1804.05755.
[199] Ranjan, E., Sanyal, S., & Talukdar, P. (2020). Asap: Adaptive structure aware pooling for learning hierarchical graph representations. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 5470-5477.
[200] Ren, Y., Liu, B., Huang, C., Dai, P., Bo, L., & Zhang, J. (2020). HDGI: An Unsupervised Graph Neural Network for Representation Learning in Heterogeneous Graph. AAAI Workshop.
[201] Ribeiro, L. F. R., Saverese, P. H. P., & Figueiredo, D. R. (2017). struc2vec: Learning node representations from structural identity. Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 385-394.
[202] Ringsquandl, M., Sellami, H., Hildebrandt, M., Beyer, D., Henselmeyer, S., Weber, S., & Joblin, M. (2021). Power to the Relational Inductive Bias: Graph Neural Networks in Electrical Power Grids. Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 1538-1547.
[203] Rong, Y., Bian, Y., Xu, T., Xie, W., Wei, Y., Huang, W., & Huang, J. (2020). Self-supervised graph transformer on large-scale molecular data. Advances in Neural Information Processing Systems, 33, 12559-12571.
[204] Rong, Y., Huang, W., Xu, T., & Huang, J. (2019). Dropedge: Towards deep graph convolutional networks on node classification. ArXiv Preprint ArXiv:1907.10903.
[205] Roweis, S. T., & Saul, L. K. (2000). Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500), 2323-2326.
[206] Safavi, T., & Koutra, D. (2020). Codex: A comprehensive knowledge graph completion benchmark. ArXiv Preprint ArXiv:2009.07810.
[207] Sato, R., Yamada, M., & Kashima, H. (2021). Random features strengthen graph neural networks. Proceedings of the 2021 SIAM International Conference on Data Mining (SDM), 333-341.
[208] Satorras, V. G., & Estrach, J. B. (2018). Few-Shot Learning with Graph Neural Networks. International Conference on Learning Representations.
[209] Schnake, T., Eberle, O., Lederer, J., Nakajima, S., Schütt, K. T., Müller, K.-R., & Montavon, G. (2021). Higher-order explanations of graph neural networks via relevant walks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(11), 7581-7596.
[210] Selvaraju, R. R., Lee, S., Shen, Y., Jin, H., Ghosh, S., Heck, L., Batra, D., & Parikh, D. (2019). Taking a hint: Leveraging explanations to make vision and language models more grounded. Proceedings of the IEEE/CVF International Conference on Computer Vision, 2591-2600.
[211] Shen, X.-J., Liu, S.-X., Bao, B.-K., Pan, C.-H., Zha, Z.-J., & Fan, J. (2020). A generalized least-squares approach regularized with graph embedding for dimensionality reduction. Pattern Recognition, 98, 107023.
[212] Shervashidze, N., Vishwanathan, S. V. N., Petri, T., Mehlhorn, K., & Borgwardt, K. (2009). Efficient graphlet kernels for large graph comparison. Artificial Intelligence and Statistics, 488-495.
[213] Shi, M., Wilson, D. A., Zhu, X., Huang, Y., Zhuang, Y., Liu, J., & Tang, Y. (2020). Evolutionary architecture search for graph neural networks. ArXiv Preprint ArXiv:2009.10199.
[214] Shi, X., Lv, F., Seng, D., Zhang, J., Chen, J., & Xing, B. (2021). Visualizing and understanding graph convolutional network. Multimedia Tools and Applications, 80, 8355-8375.
[215] Shuman, D. I., Narang, S. K., Frossard, P., Ortega, A., & Vandergheynst, P. (2013). The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains. IEEE Signal Processing Magazine, 30(3), 83-98.
[216] Si, S., Wang, B., Liu, X., Yu, C., Ding, C., & Zhao, H. (2019). Brain network modeling based on mutual information and graph theory for predicting the connection mechanism in the progression of Alzheimer’s disease. Entropy, 21(3), 300.
[217] Subramonian, A. (2021). Motif-driven contrastive learning of graph representations. Proceedings of the AAAI Conference on Artificial Intelligence, 35(18), 15980-15981.
[218] Sun, F.-Y., Hoffmann, J., Verma, V., & Tang, J. (2019). Infograph: Unsupervised and semi-supervised graph-level representation learning via mutual information maximization. ArXiv Preprint ArXiv:1908.01000.
[219] Sun, K., Lin, Z., & Zhu, Z. (2020). Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 5892-5899.
[220] Sun, Q., Li, J., Peng, H., Wu, J., Ning, Y., Yu, P. S., & He, L. (2021). Sugar: Subgraph neural network with reinforcement pooling and self-supervised mutual information mechanism. Proceedings of the Web Conference 2021, 2081-2091.
[221] Sun, Z., Deng, Z.-H., Nie, J.-Y., & Tang, J. (2019). Rotate: Knowledge graph embedding by relational rotation in complex space. ArXiv Preprint ArXiv:1902.10197.
[222] Taheri, A., Gimpel, K., & Berger-Wolf, T. (2019). Learning to represent the evolution of dynamic graphs with recurrent models. Companion Proceedings of the 2019 World Wide Web Conference, 301-307.
[223] Tan, Q., Liu, N., & Hu, X. (2019). Deep representation learning for social network analysis. Frontiers in Big Data, 2, 2.
[224] Tang, J., Qu, M., & Mei, Q. (2015). Pte: Predictive text embedding through large-scale heterogeneous text networks. Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 1165-1174.
[225] Tang, J., Qu, M., Wang, M., Zhang, M., Yan, J., & Mei, Q. (2015). Line: Large-scale information network embedding. Proceedings of the 24th International Conference on World Wide Web, 1067-1077.
[226] Togninalli, M., Ghisu, E., Llinares-López, F., Rieck, B., & Borgwardt, K. (2019). Wasserstein weisfeiler-lehman graph kernels. Advances in Neural Information Processing Systems, 32.
[227] Topping, J., Di Giovanni, F., Chamberlain, B. P., Dong, X., & Bronstein, M. M. (2021). Understanding over-squashing and bottlenecks on graphs via curvature. ArXiv Preprint ArXiv:2111.14522.
[228] Trouillon, T., Welbl, J., Riedel, S., Gaussier, É., & Bouchard, G. (2016). Complex embeddings for simple link prediction. International Conference on Machine Learning, 2071-2080.
[229] Tu, K., Cui, P., Wang, X., Wang, F., & Zhu, W. (2018). Structural deep embedding for hyper-networks. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1).
[230] Tu, K., Cui, P., Wang, X., Yu, P. S., & Zhu, W. (2018). Deep recursive network embedding with regular equivalence. Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2357-2366.
[231] Tu, K., Ma, J., Cui, P., Pei, J., & Zhu, W. (2019). Autone: Hyperparameter optimization for massive network embedding. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 216-225.
[232] Urry, M. J., & Sollich, P. (2013). Random walk kernels and learning curves for gaussian process regression on random graphs. The Journal of Machine Learning Research, 14(1), 1801-1835. · Zbl 1318.62139
[233] Van Engelen, J. E., & Hoos, H. H. (2020). A survey on semi-supervised learning. Machine Learning, 109(2), 373-440. · Zbl 1441.68215
[234] Velickovic, P., Cucurull, G., Casanova, A., Romero, A., Lio, P., & Bengio, Y. (2017). Graph attention networks. Stat, 1050, 20.
[235] Velickovic, P., Fedus, W., Hamilton, W. L., Liò, P., Bengio, Y., & Hjelm, R. D. (2019). Deep Graph Infomax. ICLR (Poster), 2(3), 4.
[236] Wan, S., Pan, S., Yang, J., & Gong, C. (2021). Contrastive and generative graph convolutional networks for graph-based semi-supervised learning. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 10049-10057.
[237] Wang, C., Pan, S., Long, G., Zhu, X., & Jiang, J. (2017). Mgae: Marginalized graph autoencoder for graph clustering. Proceedings of the 2017 ACM on Conference on Information and Knowledge Management, 889-898.
[238] Wang, C., Wang, C., Wang, Z., Ye, X., & Yu, P. S. (2020). Edge2vec: Edge-based social network embedding. ACM Transactions on Knowledge Discovery from Data (TKDD), 14(4), 1-24.
[239] Wang, D., Cui, P., & Zhu, W. (2016). Structural deep network embedding. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 1225-1234.
[240] Wang, H., Yin, H., Zhang, M., & Li, P. (2022). Equivariant and stable positional encoding for more powerful graph neural networks. ArXiv Preprint ArXiv:2203.00199.
[241] Wang, H., Zhang, F., Zhang, M., Leskovec, J., Zhao, M., Li, W., & Wang, Z. (2019). Knowledge-aware graph neural networks with label smoothness regularization for recommender systems. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 968-977.
[242] Wang, P., Agarwal, K., Ham, C., Choudhury, S., & Reddy, C. K. (2021). Self-supervised learning of contextual embeddings for link prediction in heterogeneous networks. Proceedings of the Web Conference 2021, 2946-2957.
[243] Wang, P., Wu, Q., Cao, J., Shen, C., Gao, L., & Hengel, A. van den. (2019). Neighbourhood watch: Referring expression comprehension via language-guided graph attention networks. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 1960-1968.
[244] Wang, S., Wang, R., Yao, Z., Shan, S., & Chen, X. (2020). Cross-modal scene graph matching for relationship-aware image-text retrieval. Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, 1508-1517.
[245] Wang, Z., Lin, G., Tan, H., Chen, Q., & Liu, X. (2020). CKAN: collaborative knowledge-aware attentive network for recommender systems. Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, 219-228.
[246] Weber, M., Chen, J., Suzumura, T., Pareja, A., Ma, T., Kanezashi, H., Kaler, T., Leiserson, C. E., & Schardl, T. B. (2018). Scalable graph learning for anti-money laundering: A first look. ArXiv Preprint ArXiv:1812.00076.
[247] Weisfeiler, B., & Leman, A. (1968). The reduction of a graph to canonical form and the algebra which appears therein. NTI, Series 2, 9, 12-16.
[248] Wen, W., Liu, H., Chen, Y., Li, H., Bender, G., & Kindermans, P.-J. (2020). Neural predictor for neural architecture search. European Conference on Computer Vision, 660-676.
[249] Weston, J., Ratle, F., & Collobert, R. (2008). Deep learning via semi-supervised embedding. Proceedings of the 25th International Conference on Machine Learning, 1168-1175.
[250] Wijesinghe, A., & Wang, Q. (2021). A New Perspective on “How Graph Neural Networks Go Beyond Weisfeiler-Lehman?”. International Conference on Learning Representations.
[251] Wink, A. M., Tijms, B. M., Ten Kate, M., Raspor, E., de Munck, J. C., Altena, E., Ecay-Torres, M., Clerigue, M., Estanga, A., Garcia-Sebastian, M., & others. (2018). Functional brain network centrality is related to APOE genotype in cognitively normal elderly. Brain and Behavior, 8(9), e01080.
[252] Wu, F., Souza, A., Zhang, T., Fifty, C., Yu, T., & Weinberger, K. (2019). Simplifying graph convolutional networks. International Conference on Machine Learning, 6861-6871.
[253] Wu, J., Wang, X., Feng, F., He, X., Chen, L., Lian, J., & Xie, X. (2021). Self-supervised graph learning for recommendation. Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, 726-735.
[254] Wu, Y., Warner, J. L., Wang, L., Jiang, M., Xu, J., Chen, Q., Nian, H., Dai, Q., Du, X., Yang, P., & others. (2019). Discovery of noncancer drug effects on survival in electronic health records of patients with cancer: a new paradigm for drug repurposing. JCO Clinical Cancer Informatics, 3, 1-9.
[255] Wu, Z., Pan, S., Chen, F., Long, G., Zhang, C., & Philip, S. Y. (2020). A comprehensive survey on graph neural networks. IEEE Transactions on Neural Networks and Learning Systems, 32(1), 4-24.
[256] Xie, Y., Li, S., Yang, C., Wong, R. C.-W., & Han, J. (2020). When do gnns work: Understanding and improving neighborhood aggregation. IJCAI’20: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence (IJCAI 2020).
[257] Xu, K., Hu, W., Leskovec, J., & Jegelka, S. (2018). How powerful are graph neural networks? ArXiv Preprint ArXiv:1810.00826.
[258] Xu, K., Li, C., Tian, Y., Sonobe, T., Kawarabayashi, K., & Jegelka, S. (2018). Representation learning on graphs with jumping knowledge networks. International Conference on Machine Learning, 5453-5462.
[259] Xu, K., Zhang, M., Jegelka, S., & Kawaguchi, K. (2021). Optimization of graph neural networks: Implicit acceleration by skip connections and more depth. International Conference on Machine Learning, 11592-11602.
[260] Xu, Q.-H., Li, Q.-Y., Yu, K., Ge, Q.-M., Shi, W.-Q., Li, B., Liang, R.-B., Lin, Q., Zhang, Y.-Q., & Shao, Y. (2020). Altered brain network centrality in patients with diabetic optic neuropathy: a resting-state FMRI study. Endocrine Practice, 26(12), 1399-1405.
[261] Yang, B., Yih, W., He, X., Gao, J., & Deng, L. (2014). Embedding entities and relations for learning and inference in knowledge bases. ArXiv Preprint ArXiv:1412.6575.
[262] Yang, R., Shi, J., Xiao, X., Yang, Y., & Bhowmick, S. S. (2019). Homogeneous network embedding for massive graphs via reweighted personalized pagerank. ArXiv Preprint ArXiv:1906.06826.
[263] Yang, X., Tang, K., Zhang, H., & Cai, J. (2019). Auto-encoding scene graphs for image captioning. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 10685-10694.
[264] Yang, Z., Cohen, W., & Salakhudinov, R. (2016). Revisiting semi-supervised learning with graph embeddings. International Conference on Machine Learning, 40-48.
[265] Yehudai, G., Fetaya, E., Meirom, E., Chechik, G., & Maron, H. (2021). From local structures to size generalization in graph neural networks. International Conference on Machine Learning, 11975-11986.
[266] Yin, X., Wu, G., Wei, J., Shen, Y., Qi, H., & Yin, B. (2021). Deep learning on traffic prediction: Methods, analysis and future directions. IEEE Transactions on Intelligent Transportation Systems.
[267] Yoon, M., Gervet, T., Hooi, B., & Faloutsos, C. (2020). Autonomous graph mining algorithm search with best speed/accuracy trade-off. 2020 IEEE International Conference on Data Mining (ICDM), 751-760.
[268] Yoon, M., Gervet, T., Hooi, B., & Faloutsos, C. (2022). Autonomous graph mining algorithm search with best performance trade-off. Knowledge and Information Systems, 1-32.
[269] Yoon, M., Gervet, T., Shi, B., Niu, S., He, Q., & Yang, J. (2021). Performance-Adaptive Sampling Strategy Towards Fast and Accurate Graph Neural Networks. Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 2046-2056.
[270] You, J., Gomes-Selman, J. M., Ying, R., & Leskovec, J. (2021). Identity-aware graph neural networks. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 10737-10745.
[271] You, J., Ying, Z., & Leskovec, J. (2020). Design space for graph neural networks. Advances in Neural Information Processing Systems, 33, 17009-17021.
[272] You, Y., Chen, T., Wang, Z., & Shen, Y. (2020). When does self-supervision help graph convolutional networks? International Conference on Machine Learning, 10871-10880.
[273] Yu, B., Yin, H., & Zhu, Z. (2017). Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting. ArXiv Preprint ArXiv:1709.04875.
[274] Yu, J., Lin, Z., Yang, J., Shen, X., Lu, X., & Huang, T. S. (2018). Generative image inpainting with contextual attention. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 5505-5514.
[275] Yu, J., Lu, Y., Qin, Z., Zhang, W., Liu, Y., Tan, J., & Guo, L. (2018). Modeling text with graph convolutional network for cross-modal information retrieval. Pacific Rim Conference on Multimedia, 223-234.
[276] Yu, J., Xu, T., & He, R. (2021). Towards the explanation of graph neural networks in digital pathology with information flows. ArXiv Preprint ArXiv:2112.09895.
[277] Yun, S., Jeong, M., Kim, R., Kang, J., & Kim, H. J. (2019). Graph transformer networks. Advances in Neural Information Processing Systems, 32.
[278] Zeng, H., Zhang, M., Xia, Y., Srivastava, A., Malevich, A., Kannan, R., Prasanna, V., Jin, L., & Chen, R. (2021). Decoupling the depth and scope of graph neural networks. Advances in Neural Information Processing Systems, 34, 19665-19679.
[279] Zeng, H., Zhou, H., Srivastava, A., Kannan, R., & Prasanna, V. (2019). Graphsaint: Graph sampling based inductive learning method. ArXiv Preprint ArXiv:1907.04931.
[280] Zeng, J., & Xie, P. (2021). Contrastive self-supervised learning for graph classification. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 10824-10832.
[281] Zhang, C., Zhang, K., Yuan, Q., Peng, H., Zheng, Y., Hanratty, T., Wang, S., & Han, J. (2017). Regions, periods, activities: Uncovering urban dynamics via cross-modal representation learning. Proceedings of the 26th International Conference on World Wide Web, 361-370.
[282] Zhang, D., Huang, X., Liu, Z., Hu, Z., Song, X., Ge, Z., Zhang, Z., Wang, L., Zhou, J., Shuang, Y., & others. (2020). Agl: a scalable system for industrial-purpose graph machine learning. ArXiv Preprint ArXiv:2003.02454.
[283] Zhang, D., Yin, J., Zhu, X., & Zhang, C. (2018). Network representation learning: A survey. IEEE Transactions on Big Data, 6(1), 3-28.
[284] Zhang, H., Lin, S., Liu, W., Zhou, P., Tang, J., Liang, X., & Xing, E. P. (2020). Iterative graph self-distillation. ArXiv Preprint ArXiv:2010.12609.
[285] Zhang, J., Kuo, A.-T., Zhao, J., Wen, Q., Winstanley, E., Zhang, C., & Ye, Y. (2021). RxNet: Rx-refill Graph Neural Network for Overprescribing Detection. Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2537-2546.
[286] Zhang, J., Shi, X., Xie, J., Ma, H., King, I., & Yeung, D.-Y. (2018). Gaan: Gated attention networks for learning on large and spatiotemporal graphs. ArXiv Preprint ArXiv:1803.07294.
[287] Zhang, J., Wang, Y., Yuan, Z., & Jin, Q. (2019). Personalized real-time movie recommendation system: Practical prototype and evaluation. Tsinghua Science and Technology, 25(2), 180-191.
[288] Zhang, J., Zhang, H., Xia, C., & Sun, L. (2020). Graph-bert: Only attention is needed for learning graph representations. ArXiv Preprint ArXiv:2001.05140.
[289] Zhang, M., & Li, P. (2021). Nested graph neural networks. Advances in Neural Information Processing Systems, 34, 15734-15747.
[290] Zhang, Y., Li, Y., Zhou, X., Liu, Z., & Luo, J. (2021). C3-GAN: Complex-Condition-Controlled Urban Traffic Estimation through Generative Adversarial Networks. 2021 IEEE International Conference on Data Mining (ICDM), 1505-1510.
[291] Zhang, Y., Qi, P., & Manning, C. D. (2018). Graph convolution over pruned dependency trees improves relation extraction. ArXiv Preprint ArXiv:1809.10185.
[292] Zhang, Y., Tiňo, P., Leonardis, A., & Tang, K. (2021). A survey on neural network interpretability. IEEE Transactions on Emerging Topics in Computational Intelligence, 5(5), 726-742.
[293] Zhang, Y., Wu, B., Liu, Y., & Lv, J. (2019). Local community detection based on network motifs. Tsinghua Science and Technology, 24(6), 716-727.
[294] Zhang, Z., Cui, P., Wang, X., Pei, J., Yao, X., & Zhu, W. (2018). Arbitrary-order proximity preserved network embedding. Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2778-2786.
[295] Zhang, Z., Cui, P., & Zhu, W. (2020). Deep learning on graphs: A survey. IEEE Transactions on Knowledge and Data Engineering, 34(1), 249-270.
[296] Zhao, H., Wei, L., & Yao, Q. (2020). Simplifying architecture search for graph neural network. ArXiv Preprint ArXiv:2008.11652.
[297] Zhao, H., Yao, Q., & Tu, W. (2021). Search to aggregate neighborhood for graph neural network. ArXiv Preprint ArXiv:2104.06608.
[298] Zhao, L., & Akoglu, L. (2019). Pairnorm: Tackling oversmoothing in gnns. ArXiv Preprint ArXiv:1909.12223.
[299] Zhao, L., Chen, M., Du, Y., Yang, H., & Wang, C. (2022). Spatial-Temporal Graph Convolutional Gated Recurrent Network for Traffic Forecasting. ArXiv Preprint ArXiv:2210.02737.
[300] Zhao, Y., Wang, D., Gao, X., Mullins, R., Lio, P., & Jamnik, M. (2020). Probabilistic dual network architecture search on graphs. ArXiv Preprint ArXiv:2003.09676.
[301] Zhao, Z., Zhang, X., Zhou, H., Li, C., Gong, M., & Wang, Y. (2020). HetNERec: Heterogeneous network embedding based recommendation. Knowledge-Based Systems, 204, 106218.
[302] Zheng, X., Ji, R., Wang, Q., Ye, Q., Li, Z., Tian, Y., & Tian, Q. (2020). Rethinking performance estimation in neural architecture search. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 11356-11365.
[303] Zhou, F., & Cao, C. (2021). Overcoming catastrophic forgetting in graph neural networks with experience replay. Proceedings of the AAAI Conference on Artificial Intelligence, 35(5), 4714-4722.
[304] Zhou, G., & Xia, J. (2018). OmicsNet: a web-based tool for creation and visual analysis of biological networks in 3D space. Nucleic Acids Research, 46(W1), W514-W522.
[305] Zhou, H., Yang, M., Wang, J., & Pan, W. (2019). Bayesnas: A bayesian approach for neural architecture search. International Conference on Machine Learning, 7603-7613.
[306] Zhou, J., Cui, G., Hu, S., Zhang, Z., Yang, C., Liu, Z., Wang, L., Li, C., & Sun, M. (2020). Graph neural networks: A review of methods and applications. AI Open, 1, 57-81.
[307] Zhou, K., Huang, X., Li, Y., Zha, D., Chen, R., & Hu, X. (2020). Towards deeper graph neural networks with differentiable group normalization. Advances in Neural Information Processing Systems, 33, 4917-4928.
[308] Zhou, K., Song, Q., Huang, X., & Hu, X. (2019). Auto-gnn: Neural architecture search of graph neural networks. ArXiv Preprint ArXiv:1909.03184.
[309] Zhou, Y., Zheng, H., Huang, X., Hao, S., Li, D., & Zhao, J. (2022). Graph neural networks: Taxonomy, advances, and trends. ACM Transactions on Intelligent Systems and Technology (TIST), 13(1), 1-54.
[310] Zhu, X., Ghahramani, Z., & Lafferty, J. D. (2003). Semi-supervised learning using gaussian fields and harmonic functions. Proceedings of the 20th International Conference on Machine Learning (ICML-03), 912-919.
[311] Zhu, Y., Xu, Y., Yu, F., Liu, Q., Wu, S., & Wang, L. (2020). Deep graph contrastive representation learning. ArXiv Preprint ArXiv:2006.04131.
[312] Zhu, Y., Xu, Y., Yu, F., Liu, Q., Wu, S., & Wang, L. (2021). Graph contrastive learning with adaptive augmentation. Proceedings of the Web Conference 2021, 2069-2080.
[313] Zhuang, C., & Ma, Q. (2018). Dual graph convolutional networks for graph-based semi-supervised classification. Proceedings of the 2018 World Wide Web Conference, 499-508.
[314] Zhuang, L., Zhou, Z., Gao, S., Yin, J., Lin, Z., & Ma, Y. (2017). Label information guided graph construction for semi-supervised learning. IEEE Transactions on Image Processing, 26(9), 4182-4192. · Zbl 1409.94801
[315] Zitouni, M. S., Sluzek, A., & Bhaskar, H. (2019). Visual analysis of socio-cognitive crowd behaviors for surveillance: A survey and categorization of trends and methods. Engineering Applications of Artificial Intelligence, 82, 294-312.
[316] Zuo, X.-N., Ehmke, R., Mennes, M., Imperati, D., Castellanos, F. X., Sporns, O., & Milham, M. P. (2012). Network centrality in the human functional connectome. Cerebral Cortex, 22(8), 1862-1875.
This reference list is based on information provided by the publisher or drawn from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases, these data have been complemented or enhanced with data from zbMATH Open. The list attempts to reflect the references in the original paper as accurately as possible, without claiming completeness or perfect matching.