
LDSPool: latent Dirichlet structure pooling with hierarchical graph context representation. (English) Zbl 07916132

Summary: In graph data analysis, and in the graph classification task in particular, a discriminative graph-level representation is crucial for classification performance. To improve this performance, recent studies have equipped graph neural networks with pooling methods. However, the existing pooling approaches lose the initial graph structural information when merging nodes. Given a latent structure obtained from the pooling operation, the nodes in each latent structure carry a different significance than in the original graph. This discrepancy in structural information between the initial and latent structures leads to an inadequate graph representation when existing methods produce the graph-level output. Motivated by this, we propose latent Dirichlet structure pooling (LDSPool) with a hierarchical graph context representation. After learning both structures independently, the proposed LDSPool combines the original graph and the latent structure to reduce the structural discrepancy. We employ latent Dirichlet allocation (LDA) to extract the latent structure from the original graph. To validate the performance of LDSPool, we benchmarked it against seven existing pooling methods on five public graph datasets. The experimental results and ablation studies demonstrate that LDSPool achieves better performance than the previous methods.
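The summary's core step, using LDA to extract a latent structure from a graph, can be illustrated with a minimal sketch. The encoding below (each node as a "document" whose "words" are its neighbors, following Henderson and Eliassi-Rad [13]) and the DiffPool-style coarsening are assumptions for illustration; the paper's actual LDSPool pipeline, which learns and then combines both structures, is not reproduced here.

```python
# Hypothetical sketch: LDA-based latent structure for graph pooling.
# Assumption: each node is a "document" whose "words" are its neighbors,
# so the node-topic matrix acts as a soft cluster-assignment matrix.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Toy graph: two triangles (nodes 0-2 and 3-5) joined by a bridge edge (2, 3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = A[j, i] = 1

# "Bag of neighbors" counts; self-loops keep every document non-empty.
X = A + np.eye(n, dtype=int)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
S = lda.fit_transform(X)  # (n_nodes, n_topics); rows are topic distributions

# A DiffPool-style coarsened adjacency over the latent structure: S^T A S.
A_pooled = S.T @ A @ S
print(S.shape, A_pooled.shape)  # (6, 2) (2, 2)
```

Here `S` plays the role of a soft assignment of nodes to latent clusters; a pooling layer would additionally aggregate node features as `S.T @ H` and, per the summary, reconcile the latent structure with the original graph.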

MSC:

68-XX Computer science
05-XX Combinatorics
Full Text: DOI

References:

[1] Blei, D. M.; Ng, A. Y.; Jordan, M. I., Latent Dirichlet allocation, J. Mach. Learn. Res., 3, 993-1022, 2003 · Zbl 1112.68379
[2] Borgwardt, K. M.; Ong, C. S.; Schönauer, S.; Vishwanathan, S.; Smola, A. J.; Kriegel, H. P., Protein function prediction via graph kernels, Bioinformatics, 21, i47-i56, 2005
[3] Cangea, C.; Veličković, P.; Jovanović, N.; Kipf, T.; Liò, P., Towards sparse hierarchical graph classifiers, 2018, arXiv preprint
[4] Chen, K.; Liu, S.; Yu, N.; Yan, R.; Zhang, Q.; Song, J.; Feng, Z.; Song, M., Distribution-aware graph representation learning for transient stability assessment of power system, (2022 International Joint Conference on Neural Networks (IJCNN), IEEE, 2022), 1-8
[5] Chen, K.; Song, J.; Liu, S.; Yu, N.; Feng, Z.; Han, G.; Song, M., Distribution knowledge embedding for graph pooling, IEEE Trans. Knowl. Data Eng., 2022
[6] Dhillon, I. S.; Guan, Y.; Kulis, B., Weighted graph cuts without eigenvectors: a multilevel approach, IEEE Trans. Pattern Anal. Mach. Intell., 29, 1944-1957, 2007
[7] Dobson, P. D.; Doig, A. J., Distinguishing enzyme structures from non-enzymes without alignments, J. Mol. Biol., 330, 771-783, 2003
[8] Fey, M.; Lenssen, J. E., Fast graph representation learning with PyTorch Geometric, 2019, arXiv preprint
[9] Gao, H.; Ji, S., Graph U-Nets, (International Conference on Machine Learning, PMLR, 2019), 2083-2092
[10] Griffiths, T., Gibbs sampling in the generative model of Latent Dirichlet Allocation, 2002, Tech. Report
[11] He, K.; Zhang, X.; Ren, S.; Sun, J., Deep residual learning for image recognition, (Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016), 770-778
[12] He, K.; Zhang, X.; Ren, S.; Sun, J., Identity mappings in deep residual networks, (European Conference on Computer Vision, 2016, Springer), 630-645
[13] Henderson, K.; Eliassi-Rad, T., Applying latent Dirichlet allocation to group discovery in large graphs, (Proceedings of the 2009 ACM Symposium on Applied Computing, 2009), 1456-1461
[14] Hendrycks, D.; Gimpel, K., Gaussian error linear units (GELUs), 2016, arXiv preprint
[15] Hu, W.; Fey, M.; Zitnik, M.; Dong, Y.; Ren, H.; Liu, B.; Catasta, M.; Leskovec, J., Open graph benchmark: datasets for machine learning on graphs, Adv. Neural Inf. Process. Syst., 33, 22118-22133, 2020
[16] Itoh, T. D.; Kubo, T.; Ikeda, K., Multi-level attention pooling for graph neural networks: unifying graph representations with multiple localities, Neural Netw., 145, 356-373, 2022
[17] Jin, W.; Ma, Y.; Liu, X.; Tang, X.; Wang, S.; Tang, J., Graph structure learning for robust graph neural networks, (Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2020), 66-74
[18] Kalchbrenner, N.; Grefenstette, E.; Blunsom, P., A convolutional neural network for modelling sentences, (Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics, 2014)
[19] Kingma, D. P.; Ba, J., Adam: a method for stochastic optimization, 2014, arXiv preprint
[20] Kipf, T. N.; Welling, M., Semi-supervised classification with graph convolutional networks, 2016, arXiv preprint
[21] Lee, J.; Lee, I.; Kang, J., Self-attention graph pooling, (International Conference on Machine Learning, PMLR, 2019), 3734-3743
[22] Lee, J. B.; Rossi, R.; Kong, X., Graph classification using structural attention, (Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2018), 1666-1674
[23] Li, Q.; Han, Z.; Wu, X. M., Deeper insights into graph convolutional networks for semi-supervised learning, (Thirty-Second AAAI Conference on Artificial Intelligence, 2018)
[24] Liu, N.; Jian, S.; Li, D.; Xu, H., Unsupervised hierarchical graph pooling via substructure-sensitive mutual information maximization, (Proceedings of the 31st ACM International Conference on Information & Knowledge Management, 2022), 1299-1308
[25] Long, Q.; Jin, Y.; Song, G.; Li, Y.; Lin, W., Graph structural-topic neural network, (Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2020), 1065-1073
[26] Ma, Y.; Wang, S.; Aggarwal, C. C.; Tang, J., Graph convolutional networks with eigenpooling, (Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2019), 723-731
[27] Miller, K. T.; Eliassi-Rad, T., Continuous time group discovery in dynamic graphs, (Notes of the 2009 NIPS Workshop on Analyzing Networks and Learning with Graphs, Whistler, BC, Canada, 2009)
[28] Paszke, A.; Gross, S.; Chintala, S.; Chanan, G.; Yang, E.; DeVito, Z.; Lin, Z.; Desmaison, A.; Antiga, L.; Lerer, A., Automatic differentiation in PyTorch, 2017
[29] Ranjan, E.; Sanyal, S.; Talukdar, P., ASAP: adaptive structure aware pooling for learning hierarchical graph representations, (Proceedings of the AAAI Conference on Artificial Intelligence, 2020), 5470-5477
[30] Stanovic, S.; Gaüzère, B.; Brun, L., Maximal independent vertex set applied to graph pooling, (Joint IAPR International Workshops on Statistical Techniques in Pattern Recognition (SPR) and Structural and Syntactic Pattern Recognition (SSPR), 2022, Springer), 11-21
[31] Tang, J.; Zhang, J.; Yao, L.; Li, J.; Zhang, L.; Su, Z., ArnetMiner: extraction and mining of academic social networks, (Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2008), 990-998
[32] Veličković, P.; Cucurull, G.; Casanova, A.; Romero, A.; Lio, P.; Bengio, Y., Graph attention networks, 2017, arXiv preprint
[33] Vinyals, O.; Bengio, S.; Kudlur, M., Order matters: sequence to sequence for sets, 2015, arXiv preprint
[34] Wang, Z.; Ji, S., Second-order pooling for graph neural networks, IEEE Trans. Pattern Anal. Mach. Intell., 45, 6870-6880, 2020
[35] Wu, J.; Chen, X.; Xu, K.; Li, S., Structural entropy guided graph hierarchical pooling, (International Conference on Machine Learning, PMLR, 2022), 24017-24030
[36] Xu, K.; Hu, W.; Leskovec, J.; Jegelka, S., How powerful are graph neural networks?, 2018, arXiv preprint
[37] Xu, K.; Li, C.; Tian, Y.; Sonobe, T.; Kawarabayashi, K.-i.; Jegelka, S., Representation learning on graphs with jumping knowledge networks, (International Conference on Machine Learning, PMLR, 2018), 5453-5462
[38] Xuan, J.; Lu, J.; Zhang, G.; Luo, X., Topic model for graph mining, IEEE Trans. Cybern., 45, 2792-2803, 2015
[39] Yanardag, P.; Vishwanathan, S., Deep graph kernels, (Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2015), 1365-1374
[40] Yang, C.; Wang, R.; Yao, S.; Liu, S.; Abdelzaher, T., Revisiting "over-smoothing" in deep GCNs, 2020, arXiv preprint
[41] Ying, Z.; You, J.; Morris, C.; Ren, X.; Hamilton, W.; Leskovec, J., Hierarchical graph representation learning with differentiable pooling, (Advances in Neural Information Processing Systems, 2018), 4800-4810
[42] Yuan, H.; Ji, S., StructPool: structured graph pooling via conditional random fields, (Proceedings of the 8th International Conference on Learning Representations, 2020)
[43] Zhang, J.; Zhang, H.; Sun, L.; Xia, C., Graph-BERT: only attention is needed for learning graph representations, 2020, arXiv preprint
[44] Zhang, M.; Cui, Z.; Neumann, M.; Chen, Y., An end-to-end deep learning architecture for graph classification, (Thirty-Second AAAI Conference on Artificial Intelligence, 2018)
[45] Zhang, Q.; Chang, J.; Meng, G.; Xu, S.; Xiang, S.; Pan, C., Learning graph structure via graph convolutional networks, Pattern Recognit., 95, 308-318, 2019
[46] Zhang, Z.; Bu, J.; Ester, M.; Zhang, J.; Yao, C.; Yu, Z.; Wang, C., Hierarchical graph pooling with structure learning, 2019, arXiv preprint
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.