Semi-supervised Classification Based on Graph Convolution Encoder Representations from BERT

J Zhang, Z Jiang, C Li, Z Wang
International Conference on Advanced Data Mining and Applications, 2023, Springer
Abstract
Attention-based models have attracted considerable enthusiasm in both natural language processing and graph processing. We propose a novel model called Graph Encoder Representations from Transformers (GERT). Inspired by the similarity between the distribution of vertices in graphs and that of words in natural language, GERT treats vertex sequences obtained from truncated random walks as the equivalent of sentences and uses them to learn the local information of vertices. GERT then combines the strengths of the local information learned from random walks with the long-distance dependencies captured by transformer encoder models to represent latent features. Compared to other transformer models, the advantages of GERT include extracting both local and global information, being suitable for homogeneous and heterogeneous networks, and offering greater capacity for extracting latent features. On top of GERT, we integrate convolution to extract information from local neighbors, yielding a second novel model, Graph Convolution Encoder Representations from Transformers (GCERT). We demonstrate the effectiveness of the proposed models on six networks: DBLP, BlogCatalog, CiteSeerX, CoRE, Flickr, and PubMed. Evaluation results show that our models improve the scores of current state-of-the-art methods by up to .
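The abstract describes GERT as treating truncated random walks over the graph as "sentences" whose tokens are vertices. The abstract gives no code, so the following is a minimal sketch, assuming a DeepWalk-style walk generator over a plain adjacency list, of how such vertex sequences could be produced before being fed to a transformer encoder; the function and parameter names (truncated_random_walks, walk_length, walks_per_node) are illustrative assumptions, not the authors' API.

```python
import random

def truncated_random_walks(adj, walk_length=10, walks_per_node=5, seed=0):
    """Generate vertex sequences via truncated random walks (DeepWalk-style).

    Each walk plays the role of a 'sentence' of vertex ids, mirroring the
    abstract's analogy between vertices in graphs and words in language.
    `adj` maps each vertex to a list of its neighbours.
    """
    rng = random.Random(seed)
    walks = []
    for _ in range(walks_per_node):
        nodes = list(adj)
        rng.shuffle(nodes)          # start walks from vertices in random order
        for start in nodes:
            walk = [start]
            while len(walk) < walk_length:
                neighbours = adj[walk[-1]]
                if not neighbours:
                    break           # dead end: truncate the walk early
                walk.append(rng.choice(neighbours))
            walks.append(walk)
    return walks

if __name__ == "__main__":
    # Toy undirected graph given as an adjacency list.
    graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
    for walk in truncated_random_walks(graph, walk_length=5, walks_per_node=1):
        print(walk)
```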
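GCERT is described as adding convolution over local neighbors on top of GERT. As a rough illustration only, the sketch below implements a standard GCN-style propagation step (symmetrically normalized adjacency, features, and a weight matrix); the paper's actual propagation rule and how it is combined with the transformer encoder are not specified in this abstract, so the name gcn_layer, the normalization choice, and the ReLU activation are all assumptions.

```python
import numpy as np

def gcn_layer(adj, features, weight):
    """One graph-convolution step aggregating each vertex's local neighborhood.

    Computes ReLU(D^{-1/2} (A + I) D^{-1/2} X W), a common propagation rule;
    the paper's exact formulation may differ.
    """
    a_hat = adj + np.eye(adj.shape[0])           # add self-loops
    deg = a_hat.sum(axis=1)                      # degree of each vertex
    d_inv_sqrt = np.diag(deg ** -0.5)            # D^{-1/2}
    norm_adj = d_inv_sqrt @ a_hat @ d_inv_sqrt   # normalized adjacency
    return np.maximum(norm_adj @ features @ weight, 0.0)  # ReLU activation

if __name__ == "__main__":
    # Toy 4-vertex graph; features could stand in for GERT vertex embeddings.
    adj = np.array([[0, 1, 1, 0],
                    [1, 0, 1, 0],
                    [1, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
    x = np.random.rand(4, 3)
    w = np.random.rand(3, 2)
    print(gcn_layer(adj, x, w).shape)  # (4, 2)
```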