Abstract:
Deep clustering, an important research topic in machine learning and data mining, has been widely applied in many real-world scenarios. However, existing deep clustering methods rely primarily on implicit optimization objectives such as contrastive learning or reconstruction, which do not explicitly enforce cluster-level discrimination and thus restrict their ability to achieve compact intra-cluster structures and distinct inter-cluster separation. To overcome this limitation, we propose a novel unsupervised discriminative deep clustering (discDC) method that explicitly integrates cluster-level discrimination into the learning process. The discDC framework projects data into a nonlinear latent space with compact and well-separated cluster representations, directly optimizing clustering objectives by minimizing intra-cluster discrepancy and maximizing inter-cluster discrepancy. Additionally, to address the lack of label information in unsupervised scenarios, we introduce a confidence-driven self-labeling mechanism that iteratively derives reliable pseudo-labels to strengthen the discriminative analysis. Extensive experiments on five benchmark datasets demonstrate the superiority of discDC over state-of-the-art deep clustering approaches. © 2025 Elsevier Ltd
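The abstract describes the objective only at a high level. The snippet below is a minimal PyTorch sketch of a discrimination-style clustering loss of the kind described (minimizing intra-cluster discrepancy, maximizing inter-cluster discrepancy over confidence-filtered pseudo-labels); the function names, the margin-based inter-cluster term, and the confidence threshold are illustrative assumptions, not the paper's actual formulation.

```python
# Illustrative sketch only: the paper's exact loss and self-labeling rule are
# not given in this record; names, the margin-based inter-cluster term, and
# the confidence threshold are assumptions.
import torch
import torch.nn.functional as F


def discriminative_loss(z, pseudo_labels, num_clusters, margin=1.0):
    """Pull embeddings toward their cluster centroid, push centroids apart."""
    # Assumes every cluster id in [0, num_clusters) appears in pseudo_labels.
    centroids = torch.stack(
        [z[pseudo_labels == k].mean(dim=0) for k in range(num_clusters)]
    )
    # Intra-cluster discrepancy: squared distance to the assigned centroid.
    intra = ((z - centroids[pseudo_labels]) ** 2).sum(dim=1).mean()
    # Inter-cluster discrepancy: hinge on pairwise centroid distances.
    dists = torch.cdist(centroids, centroids)
    off_diag = dists[~torch.eye(num_clusters, dtype=torch.bool)]
    inter = F.relu(margin - off_diag).mean()
    return intra + inter


def confident_pseudo_labels(soft_assignments, threshold=0.9):
    """Keep only samples whose max soft assignment exceeds the threshold."""
    conf, labels = soft_assignments.max(dim=1)
    mask = conf > threshold
    return labels[mask], mask


# Toy usage on random embeddings and peaked random soft assignments.
z = torch.randn(256, 32)
soft = torch.softmax(5 * torch.randn(256, 5), dim=1)
labels, mask = confident_pseudo_labels(soft, threshold=0.5)
loss = discriminative_loss(z[mask], labels, num_clusters=5)
```

The margin hinge is just one simple way to cap the inter-cluster push, and the hard confidence threshold is one possible self-labeling rule; the paper may use different discrepancy measures or a schedule for the threshold.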
Source: Pattern Recognition
ISSN: 0031-3203
Year: 2026
Volume: 172
Impact Factor: 7.500 (JCR@2023)