Wang, Boyuan, Cui, Lixin, Bai, Lu et al. (1 more author) (2021) Graph Transformer: Learning Better Representations for Graph Neural Networks. In: Torsello, Andrea, Rossi, Luca, Pelillo, Marcello, Biggio, Battista and Robles-Kelly, Antonio, (eds.) Structural, Syntactic, and Statistical Pattern Recognition. Springer, Cham, pp. 139-149.
Abstract
Graph classification is a significant task for many real-world applications. Recently, Graph Neural Networks (GNNs) have achieved excellent performance on many graph classification tasks. However, most state-of-the-art GNNs face the challenge of over-smoothing and cannot effectively learn latent relations between distant vertices. To overcome this problem, we develop a novel Graph Transformer (GT) unit that learns such latent relations directly. In addition, we propose a mixed network to combine different methods of graph learning. We show that the proposed GT unit can both learn distant latent connections well and form better representations for graphs. Moreover, the proposed Graph Transformer with Mixed Network (GTMN) can learn local and global information simultaneously. Experiments on standard graph classification benchmarks demonstrate that our proposed approach performs better than competing methods.
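The abstract does not give implementation details of the GT unit, but the core idea it describes — letting distant vertices exchange information directly rather than only through many message-passing hops — is commonly realised with global self-attention over node features. The sketch below is a minimal, illustrative version under that assumption (scaled dot-product attention; all names and shapes are hypothetical, not taken from the paper):

```python
import numpy as np

def graph_self_attention(X, Wq, Wk, Wv):
    """Global self-attention over node features.

    Every vertex attends to every other vertex, so even distant vertices
    exchange information in a single step, sidestepping the over-smoothing
    that deep message passing can cause.

    X:  (n_nodes, d) node feature matrix
    Wq, Wk, Wv: (d, d) projection matrices for queries, keys, values
    Returns the updated node features and the attention matrix.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])       # pairwise affinities
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)            # row-wise softmax
    return A @ V, A

# Toy usage: 5 nodes with 8-dimensional features.
rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
H, A = graph_self_attention(X, Wq, Wk, Wv)
```

A "mixed network" in the sense described above could then combine `H` with the output of a standard local GNN layer (e.g. by concatenation or summation), so that local neighbourhood structure and global attention are learned side by side.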
Metadata
| Item Type: | Proceedings Paper |
|---|---|
| Authors/Creators: | Wang, Boyuan; Cui, Lixin; Bai, Lu (1 more author) |
| Editors: | Torsello, Andrea; Rossi, Luca; Pelillo, Marcello; Biggio, Battista; Robles-Kelly, Antonio |
| Dates: | Published 2021 |
| Institution: | The University of York |
| Academic Units: | The University of York > Faculty of Sciences (York) > Computer Science (York) |
| Depositing User: | Pure (York) |
| Date Deposited: | 15 Apr 2021 10:00 |
| Last Modified: | 17 Mar 2025 00:13 |
| Published Version: | https://doi.org/10.1007/978-3-030-73973-7_14 |
| Status: | Published |
| Publisher: | Springer |
| Identification Number: | 10.1007/978-3-030-73973-7_14 |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:173119 |