Abstract
In this paper, we propose a one-stage online clustering method called
Contrastive Clustering (CC), which explicitly performs instance- and
cluster-level contrastive learning. Specifically, for a given dataset,
positive and negative instance pairs are constructed through data augmentations
and then projected into a feature space. Therein, instance- and cluster-level
contrastive learning are conducted in the row and column spaces, respectively,
by maximizing the similarities of positive pairs while minimizing
those of negative ones. Our key observation is that the rows of the feature
matrix could be regarded as soft labels of instances, and accordingly the
columns could be further regarded as cluster representations. By simultaneously
optimizing the instance- and cluster-level contrastive loss, the model jointly
learns representations and cluster assignments in an end-to-end manner.
Extensive experimental results show that CC remarkably outperforms 17
competitive clustering methods on six challenging image benchmarks. In
particular, CC achieves an NMI of 0.705 (0.431) on the CIFAR-10 (CIFAR-100)
dataset, an improvement of up to 19\% (39\%) over the best baseline.
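
To make the dual objective concrete, below is a minimal PyTorch sketch of the row/column contrast described above: a symmetric NT-Xent-style contrastive loss applied to the rows of two augmented feature matrices (instances) and to the columns of the corresponding soft cluster-assignment matrices (clusters). This is an illustration, not the authors' implementation; the function names (`contrastive_loss`, `cc_loss`) and the temperatures `tau_i`, `tau_c` are hypothetical, and the paper's full objective may include terms not stated in the abstract.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(a, b, temperature):
    # NT-Xent-style loss: a[i] and b[i] form the positive pair;
    # every other row in the concatenated batch acts as a negative.
    n = a.size(0)
    z = F.normalize(torch.cat([a, b], dim=0), dim=1)   # (2n, d), unit-norm rows
    sim = z @ z.t() / temperature                      # pairwise cosine similarities
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim = sim.masked_fill(mask, float('-inf'))         # exclude self-similarity
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])  # i <-> i+n
    return F.cross_entropy(sim, targets)

def cc_loss(h1, h2, p1, p2, tau_i=0.5, tau_c=1.0):
    # Instance-level loss over the rows of the feature matrices,
    # cluster-level loss over the columns of the assignment matrices.
    loss_ins = contrastive_loss(h1, h2, tau_i)          # rows = instance features
    loss_clu = contrastive_loss(p1.t(), p2.t(), tau_c)  # columns = cluster representations
    return loss_ins + loss_clu

# Toy usage: batch of 8 samples, 128-d features, 10 clusters.
h1, h2 = torch.randn(8, 128), torch.randn(8, 128)  # two augmented views
p1 = torch.softmax(torch.randn(8, 10), dim=1)      # soft labels, view 1
p2 = torch.softmax(torch.randn(8, 10), dim=1)      # soft labels, view 2
print(cc_loss(h1, h2, p1, p2))
```

Transposing the assignment matrices is the key move: after the transpose, each "sample" fed to the contrastive loss is one cluster's assignment profile across the batch, so pulling matching columns together aligns cluster structure across the two views.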