
Joint hyperbolic and Euclidean geometry contrastive graph neural networks. (English) Zbl 07825383

Summary: Graph Neural Networks (GNNs) have demonstrated state-of-the-art performance on a wide variety of analytical tasks. Current GNN approaches focus on learning representations in Euclidean space, which is effective at capturing non-tree-like structural relations but fails to model the complex relations found in many real-world graphs, such as tree-like hierarchical structures. This paper instead proposes to learn representations in both Euclidean and hyperbolic spaces in order to model these two types of graph geometry. To this end, we introduce a novel approach, Joint hyperbolic and Euclidean geometry contrastive graph neural networks (JointGMC). JointGMC learns multiple layer-wise optimal combinations of Euclidean and hyperbolic geometries to effectively encode diverse, complex graph structures. Further, the performance of most GNNs relies heavily on the availability of large-scale manually labeled data. To mitigate this issue, JointGMC exploits proximity-based self-supervised information in different geometric spaces (i.e., the Euclidean, hyperbolic, and Euclidean-hyperbolic interaction spaces) to regularize (semi-)supervised graph learning. Extensive experiments on eight real-world graph datasets show that JointGMC outperforms eight state-of-the-art GNN models on diverse graph mining tasks, including node classification, link prediction, and node clustering, demonstrating its superior graph representation ability. Code is available at https://github.com/chachatang/jointgmc.
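The summary hinges on moving node embeddings between Euclidean space and a hyperbolic (Poincaré-ball) space and combining the two geometries layer-wise. As an illustration only (this is a minimal sketch, not the authors' implementation; the function names and the convex-combination blending scheme are assumptions), the standard exponential and logarithmic maps at the origin of the Poincaré ball, plus a hypothetical layer-wise blend, can be written as:

```python
import numpy as np

def exp_map_origin(v, c=1.0, eps=1e-9):
    # Exponential map at the origin of the Poincare ball of curvature -c:
    # sends a Euclidean (tangent-space) vector into the open ball of radius 1/sqrt(c).
    sqrt_c = np.sqrt(c)
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), eps)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def log_map_origin(x, c=1.0, eps=1e-9):
    # Logarithmic map at the origin: inverse of exp_map_origin, pulling a
    # point of the ball back to the Euclidean tangent space.
    sqrt_c = np.sqrt(c)
    norm = np.maximum(np.linalg.norm(x, axis=-1, keepdims=True), eps)
    return np.arctanh(np.minimum(sqrt_c * norm, 1.0 - eps)) * x / (sqrt_c * norm)

def blend_layer(h_euc, h_hyp, alpha, c=1.0):
    # Hypothetical layer-wise combination: pull the hyperbolic embedding back
    # to the tangent space at the origin, then take a convex combination with
    # the Euclidean embedding (alpha = geometry mixing weight).
    return alpha * h_euc + (1.0 - alpha) * log_map_origin(h_hyp, c)

h_euc = np.array([[0.3, 0.4], [1.0, 0.0]])   # Euclidean node embeddings
h_hyp = exp_map_origin(h_euc)                # hyperbolic counterparts (inside unit ball)
h_mix = blend_layer(h_euc, h_hyp, alpha=0.5) # blended layer output
```

Blending in the tangent space keeps the combination a linear operation while the hyperbolic coordinates remain available for curvature-aware distances in a contrastive objective; since `log_map_origin` inverts `exp_map_origin`, the blend above reduces to the Euclidean embedding, which makes the round trip easy to sanity-check.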

MSC:

68T07 Artificial neural networks and deep learning

Software:

SimCLR
Full Text: DOI
