Abstract:
Contrastive learning is widely used in deep image clustering due to its ability to learn discriminative representations. However, some studies simply combine contrastive learning with clustering; this line of work often ignores semantically meaningful representations and leads to suboptimal performance. In this paper, we propose a new deep image clustering framework called Nearest Neighbor Contrastive Clustering (NNCC), which fuses contrastive learning with neighbor relation mining. During training, contrastive learning and neighbor relation mining are updated alternately: the former is conducted in the backward pass, while the latter is employed in the forward pass. Specifically, we empirically find that data augmentation is an effective technique for generating nearest neighbors manually. Stronger data augmentation brings more nearest neighbors into contrastive learning, which yields more powerful discriminative representations. Owing to effective neighbor relation mining, the proposed framework learns more semantically meaningful representations with contrastive learning and obtains more accurate image clusters. Experimental results on six image datasets show that the proposed framework outperforms state-of-the-art clustering methods. (c) 2021 Elsevier B.V. All rights reserved.
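The abstract describes mining nearest neighbors in the forward pass and using them, alongside augmented views, as positives for contrastive learning in the backward pass. Below is a minimal, self-contained sketch of that general idea, not the authors' NNCC implementation; the helper names (mine_neighbors, nn_contrastive_loss), the choice of an InfoNCE-style loss, and all hyperparameters are illustrative assumptions.

# Minimal sketch (an assumption, not the paper's code): neighbors mined in the
# forward pass become extra positives for an InfoNCE-style contrastive loss.
import torch
import torch.nn.functional as F


def mine_neighbors(embeddings: torch.Tensor, k: int) -> torch.Tensor:
    """Indices of the k nearest neighbors (cosine similarity) for each sample."""
    z = F.normalize(embeddings, dim=1)
    sim = z @ z.t()                           # pairwise cosine similarities
    sim.fill_diagonal_(float("-inf"))         # exclude self-matches
    return sim.topk(k, dim=1).indices         # shape (N, k)


def nn_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor,
                        neighbors: torch.Tensor,
                        temperature: float = 0.5) -> torch.Tensor:
    """InfoNCE-style loss in which the matching augmented view and the mined
    neighbors of each sample are all treated as positives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    n = z1.size(0)
    logits = (z1 @ z2.t()) / temperature      # (N, N) similarity logits
    pos = torch.eye(n, dtype=torch.bool, device=z1.device)        # matching view is positive
    pos[torch.arange(n, device=z1.device).unsqueeze(1), neighbors] = True  # neighbors too
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    return -log_prob[pos].mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    feats = torch.randn(16, 32)                        # stand-in backbone features
    nbrs = mine_neighbors(feats, k=3)                  # forward pass: neighbor mining
    view1 = feats + 0.1 * torch.randn_like(feats)      # stand-in augmented views
    view2 = feats + 0.1 * torch.randn_like(feats)
    loss = nn_contrastive_loss(view1, view2, nbrs)     # backward pass would call loss.backward()
    print(f"loss = {loss.item():.4f}")

In this sketch, a stronger augmentation (here a larger noise scale) or a larger k enlarges the positive set per sample, mirroring the abstract's claim that stronger augmentation involves more nearest neighbors in contrastive learning.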
Source:
KNOWLEDGE-BASED SYSTEMS
ISSN: 0950-7051
Year: 2022
Volume: 238
Impact Factor: 8.8 (JCR@2022); 7.200 (JCR@2023)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold: 61
JCR Journal Grade:1
CAS Journal Grade:2
Cited Count:
WoS CC Cited Count: 19
SCOPUS Cited Count: 18
ESI Highly Cited Papers on the List: 0