Abstract:
Contrastive learning has been widely used in graph representation learning, where node or graph representations are extracted by contrasting positive and negative node pairs. It requires node representations (embeddings) to reflect their topological correlations, increasing the similarities between an anchor node and its positive nodes and reducing the similarities with its negative nodes in the embedding space. However, most existing contrastive models measure similarity with a fixed metric that scores all sample pairs equally in a specific feature space, ignoring the variety of node attributes and network topologies. Moreover, these fixed metrics are defined explicitly and manually, which makes them unsuitable for all graphs and networks. To address these problems, we propose a novel graph representation learning model with an adaptive metric, called GRAM, which produces appropriate similarity scores for node pairs according to the significance of each dimension of their embedding vectors and adapts the metric to the data distribution. With these scores, a graph encoder can be trained more effectively to obtain representative embeddings. Experimental results show that GRAM is highly competitive on multiple tasks.
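To make the abstract's core idea concrete, the sketch below contrasts a fixed metric (plain cosine similarity, which weights every embedding dimension equally) with a learnable per-dimension weighting used inside an InfoNCE-style contrastive loss. This is only a minimal illustration of the general idea of an adaptive similarity metric; the class and function names are assumptions for this example and are not taken from the GRAM paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveSimilarity(nn.Module):
    """Scores node-pair similarity with learnable per-dimension weights,
    instead of a fixed metric that treats all dimensions equally."""
    def __init__(self, dim):
        super().__init__()
        # one learnable significance weight per embedding dimension (hypothetical parameterization)
        self.log_w = nn.Parameter(torch.zeros(dim))

    def forward(self, z1, z2):
        w = torch.exp(self.log_w)          # keep weights positive
        z1 = F.normalize(z1 * w, dim=-1)   # re-scale dimensions, then normalize
        z2 = F.normalize(z2 * w, dim=-1)
        return z1 @ z2.t()                 # pairwise similarity matrix

def contrastive_loss(sim, tau=0.5):
    """InfoNCE-style loss: node i in view 1 is positive with node i in view 2;
    all other nodes act as negatives."""
    labels = torch.arange(sim.size(0))
    return F.cross_entropy(sim / tau, labels)

if __name__ == "__main__":
    # toy embeddings of 8 nodes from two augmented graph views
    z1, z2 = torch.randn(8, 16), torch.randn(8, 16)
    metric = AdaptiveSimilarity(16)
    loss = contrastive_loss(metric(z1, z2))
    loss.backward()   # gradients flow into the metric weights as well as the encoder outputs
    print(loss.item())
```

In this toy setup the per-dimension weights are trained jointly with the encoder, so dimensions that matter more for distinguishing positives from negatives receive larger weights; how GRAM actually parameterizes and adapts its metric is described in the paper itself.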
Source: IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING
ISSN: 2327-4697
Year: 2023
Issue: 4
Volume: 10
Page: 2074-2085
Impact Factor: 6.700 (JCR@2023)
ESI Discipline: ENGINEERING
ESI HC Threshold: 35
JCR Journal Grade: 1
CAS Journal Grade: 1
Cited Count:
WoS CC Cited Count: 4
SCOPUS Cited Count: 5
ESI Highly Cited Papers on the List: 0