
Deep reinforcement learning guided graph neural networks for brain network analysis. (English) Zbl 1542.92013

Summary: Modern neuroimaging techniques allow human brains to be modeled as brain networks, or connectomes. Capturing the structural information and hierarchical patterns of brain networks is essential for understanding brain function and disease states. Recently, the promising network representation learning capability of graph neural networks (GNNs) has motivated GNN-based methods for brain network analysis. These methods apply feature aggregation and global pooling to convert each brain network instance into a vector representation that encodes its structure for downstream analysis tasks. However, existing GNN-based methods often neglect that brain networks of different subjects may require different numbers of aggregation iterations, and instead learn all brain networks with a GNN of fixed depth. Fully realizing the potential of GNNs for brain network analysis therefore remains non-trivial. In this work, we propose a novel brain network representation framework, BN-GNN, to address this difficulty by searching for the optimal GNN architecture for each brain network. Concretely, BN-GNN employs deep reinforcement learning (DRL) to automatically predict the optimal number of feature propagations (i.e., the number of GNN layers) required for a given brain network. BN-GNN improves on the performance upper bound of traditional GNNs on eight brain network disease analysis tasks.
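
The core mechanism described above can be illustrated with a minimal NumPy sketch: a DRL-style policy (here reduced to a linear Q-network) selects, per brain network, how many GCN-style propagation steps to apply before mean pooling into a graph-level embedding. This is only an illustration under stated assumptions, not the authors' implementation; all function names (e.g., `embed_brain_network`, `q_values`) and the linear Q-network are hypothetical.

```python
# Illustrative sketch only: a stand-in policy picks the per-network
# propagation depth, as described in the summary; not the authors' code.
import numpy as np

def normalize_adjacency(A):
    """Symmetrically normalize A with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def propagate(A_norm, X, W):
    """One GCN-style feature aggregation step: ReLU(A_norm X W)."""
    return np.maximum(A_norm @ X @ W, 0.0)

def q_values(state, Wq, bq):
    """Hypothetical linear Q-network scoring candidate layer counts 1..K."""
    return state @ Wq + bq

def embed_brain_network(A, X, weights, Wq, bq):
    """Pick #propagations via argmax Q on a graph-level state, then mean-pool."""
    A_norm = normalize_adjacency(A)
    state = X.mean(axis=0)                       # simple graph-level state
    k = int(np.argmax(q_values(state, Wq, bq))) + 1  # chosen depth in 1..K
    H = X
    for layer in range(k):
        H = propagate(A_norm, H, weights[layer])
    return H.mean(axis=0)                        # embedding for downstream tasks

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d, hidden, K = 90, 16, 16, 3              # e.g., 90 ROIs, K candidate depths
    A = (rng.random((n, n)) < 0.1).astype(float)
    A = np.triu(A, 1); A = A + A.T               # symmetric adjacency, no self-loops
    X = rng.standard_normal((n, d))
    weights = [rng.standard_normal((d if i == 0 else hidden, hidden)) * 0.1
               for i in range(K)]
    Wq, bq = rng.standard_normal((d, K)) * 0.1, np.zeros(K)
    print(embed_brain_network(A, X, weights, Wq, bq).shape)  # -> (hidden,)
```

In BN-GNN the depth-selection policy is trained with deep reinforcement learning (a deep Q-network) rather than being a fixed linear scorer, and the propagation layers are trained jointly for the downstream classification task; the sketch only shows where the per-network depth decision enters the pipeline.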

MSC:

92B20 Neural networks for/in biological studies, artificial life and related topics
68T07 Artificial neural networks and deep learning
