Abstract:
Contrastive dimensionality reduction (CDR) has demonstrated significant value in visual cluster analysis owing to its strong ability to separate clusters and preserve neighborhood structure. In federated learning (FL), however, the training data held by each client is often non-independent and identically distributed (non-IID), which introduces local bias when clients update the CDR model. To alleviate this problem, we propose a federated contrastive dimensionality reduction algorithm (FedCDR). First, collaborative training across multiple client models improves the model's generalization in visual clustering tasks. Second, a model-contrastive loss is introduced into the local objective function to mitigate the visual clustering bias caused by non-IID data. Finally, an adaptive temperature-regulation agent based on proximal policy optimization is designed to further enhance the model's adaptability to different data distributions. Quantitative experiments on three public datasets demonstrate that FedCDR outperforms the baselines by 16% in neighbor hit and 9% in k-NN classifier accuracy.
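To make the model-contrastive idea concrete, the following is a minimal sketch (not the paper's implementation) of a MOON-style model-contrastive loss on embedding vectors: the local model's representation is pulled toward the global model's representation and pushed away from the previous local model's representation, scaled by a temperature `tau`. The function names and the NumPy formulation are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def model_contrastive_loss(z_local, z_global, z_prev, tau=0.5):
    """Illustrative MOON-style model-contrastive loss (assumed form).

    z_local:  representation from the current local model
    z_global: representation from the global (aggregated) model (positive)
    z_prev:   representation from the previous-round local model (negative)
    tau:      temperature; in FedCDR this would be tuned adaptively
    """
    pos = np.exp(cosine_sim(z_local, z_global) / tau)
    neg = np.exp(cosine_sim(z_local, z_prev) / tau)
    # Cross-entropy over {positive, negative}: small when the local
    # representation stays close to the global one.
    return -np.log(pos / (pos + neg))

# A local representation aligned with the global model incurs a lower
# loss than one that has drifted toward its previous local state.
z_g, z_p = np.array([1.0, 0.0]), np.array([0.0, 1.0])
loss_aligned = model_contrastive_loss(np.array([1.0, 0.1]), z_g, z_p)
loss_drifted = model_contrastive_loss(np.array([0.1, 1.0]), z_g, z_p)
```

Lowering `tau` sharpens the contrast between the positive and negative terms, which is why an adaptive temperature (as the abstract proposes via PPO) can matter across heterogeneous client distributions.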