An Empirical Study into Clustering of Unseen Datasets with Self-Supervised Encoders

Vector Institute · Aalborg University · Pioneer Centre for AI
Dalhousie University · University of Guelph


*Equal Contribution · Equal Supervision

Abstract

Can pretrained models generalize to new datasets without any retraining? We deploy pretrained image models on datasets they were not trained for, and investigate whether their embeddings form meaningful clusters. Our suite of benchmarking experiments uses encoders pretrained solely on ImageNet-1k with either supervised or self-supervised training techniques, deployed on image datasets that were not seen during training, and clustered with conventional clustering algorithms. This evaluation provides new insights into the embeddings of self-supervised models, which prioritize different features than supervised models do. Supervised encoders typically offer more utility than SSL encoders within the training domain, and the reverse holds far outside of it; however, fine-tuned encoders demonstrate the opposite trend. Clustering provides a self-supervised learning evaluation method orthogonal to existing methods such as kNN. Additionally, we find that the silhouette score, when measured in a UMAP-reduced space, is highly correlated with clustering performance, and can therefore be used as a proxy for clustering performance on data with no ground-truth labels.
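The sketch below illustrates the kind of pipeline the abstract describes: embed an unseen dataset with a frozen ImageNet-1k encoder, reduce the embeddings with UMAP, cluster them with a conventional algorithm, and score the result. The specific choices here (a torchvision ResNet-50, CIFAR-10 as the unseen dataset, agglomerative clustering, a 50-dimensional UMAP reduction) are illustrative assumptions, not necessarily the paper's exact configuration.

```python
# Minimal sketch, assuming torchvision, scikit-learn, and umap-learn are installed.
import numpy as np
import torch
from torchvision import datasets, models, transforms
from sklearn.cluster import AgglomerativeClustering
from sklearn.metrics import adjusted_mutual_info_score, silhouette_score
import umap

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained ImageNet-1k encoder with the classification head removed (kept frozen).
encoder = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
encoder.fc = torch.nn.Identity()
encoder.eval().to(device)

preprocess = transforms.Compose([
    transforms.Resize(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# An "unseen" dataset, i.e. one the encoder was never trained on (illustrative choice).
dataset = datasets.CIFAR10(root="data", train=False, download=True, transform=preprocess)
loader = torch.utils.data.DataLoader(dataset, batch_size=256, num_workers=4)

# Embed every image with the frozen encoder.
features, labels = [], []
with torch.no_grad():
    for x, y in loader:
        features.append(encoder(x.to(device)).cpu().numpy())
        labels.append(y.numpy())
features = np.concatenate(features)
labels = np.concatenate(labels)

# Reduce the embeddings with UMAP, then cluster with a conventional algorithm.
reduced = umap.UMAP(n_components=50).fit_transform(features)
pred = AgglomerativeClustering(n_clusters=len(set(labels))).fit_predict(reduced)

# AMI measures agreement with ground-truth labels; the silhouette score in the
# UMAP-reduced space needs no labels, so it can serve as a label-free proxy.
print("AMI:", adjusted_mutual_info_score(labels, pred))
print("Silhouette (UMAP space):", silhouette_score(reduced, pred))
```

Swapping in a self-supervised encoder (e.g. a DINO or MoCo checkpoint) in place of the supervised ResNet-50 reuses the rest of the pipeline unchanged, which is what makes this setup convenient for comparing pretraining strategies.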

BibTeX

@article{zero-shot-clustering,
    title={An Empirical Study into Clustering of Unseen Datasets with Self-Supervised Encoders},
    author={Scott C. Lowe and Joakim Bruslund Haurum and Sageev Oore and Thomas B. Moeslund and Graham W. Taylor},
    year={2024},
    eprint={2406.02465},
    archivePrefix={arXiv},
    primaryClass={cs.LG},
    journal={arXiv preprint},
    doi={10.48550/arxiv.2406.02465},
}