Abstract
In this article, we propose a novel model for constrained clustering, namely, dissimilarity propagation-guided graph-Laplacian principal component analysis (DP-GLPCA). By fully utilizing a limited amount of weak supervisory information in the form of pairwise constraints, the proposed DP-GLPCA is capable of capturing both the local and global structures of the input samples and exploiting them for accurate clustering. More specifically, we first formulate a convex semisupervised low-dimensional embedding model by incorporating a new dissimilarity regularizer into GLPCA (i.e., an unsupervised dimensionality reduction model), in which both the similarity and dissimilarity between low-dimensional representations are enforced with the constraints to improve their discriminability. An efficient iterative algorithm based on the inexact augmented Lagrange multiplier (ALM) method is designed to solve it with guaranteed global convergence. Furthermore, we innovatively propose to propagate the cannot-link constraints (i.e., dissimilarity) to make the dissimilarity regularizer more informative. The resulting dissimilarity propagation (DP) model is solved iteratively, and we prove that it converges to a Karush-Kuhn-Tucker (KKT) point. Extensive experimental results on nine commonly used benchmark data sets show that the proposed DP-GLPCA achieves much higher clustering accuracy than state-of-the-art constrained clustering methods. Moreover, the effectiveness and advantages of the proposed DP model are experimentally verified. To the best of our knowledge, this is the first work to investigate DP, in contrast to existing pairwise constraint propagation methods, which propagate similarity. The code is publicly available at https://github.com/jyh-learning/DP-GLPCA.
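As a rough guide to the kind of model the abstract describes, the sketch below combines the commonly used (unsupervised) GLPCA objective with a generic cannot-link penalty. The symbols $\alpha$, $\lambda$, and $S^{d}$ are illustrative assumptions rather than notation from the paper, and the exact convex formulation and dissimilarity regularizer used in DP-GLPCA may differ.

```latex
% Minimal sketch: GLPCA with an added cannot-link (dissimilarity) penalty.
% X in R^{p x n}: data matrix (columns are samples)
% U in R^{p x k}: principal directions;  Q in R^{n x k}: low-dimensional embedding
% L: graph Laplacian of a similarity graph over the samples
% S^d in {0,1}^{n x n}: cannot-link indicator matrix (hypothetical notation)
% q_i: i-th row of Q;  alpha, lambda >= 0: trade-off parameters
\min_{U,\,Q}\;
  \underbrace{\|X - U Q^{\top}\|_F^{2}
              + \alpha\,\operatorname{tr}\!\bigl(Q^{\top} L Q\bigr)}_{\text{GLPCA}}
  \;+\;
  \underbrace{\lambda \sum_{i,j} S^{d}_{ij}\, q_i^{\top} q_j}_{\text{dissimilarity penalty}}
\quad \text{s.t. } Q^{\top} Q = I .
```

Minimizing the last term drives the embeddings of cannot-link pairs toward small or negative inner products, which is the role the abstract assigns to the dissimilarity regularizer; the DP step described in the abstract would then enrich $S^{d}$ by propagating the given cannot-link constraints before re-solving the embedding.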
| Original language | English |
| --- | --- |
| Article number | 9178787 |
| Pages (from-to) | 3985-3997 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 32 |
| Issue number | 9 |
| Early online date | 27 Aug 2020 |
| DOIs | |
| Publication status | Published - Sept 2021 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2012 IEEE.
Funding
This work was supported in part by the Natural Science Foundation of China under Grant 61871342, Grant 61772344, and Grant 61672443; in part by the Hong Kong RGC General Research Funds under Grant 9042820 (CityU 11219019), Grant 9042955 (CityU 11202320), Grant 9042816 (CityU 11209819), and Grant 9048123 (CityU 21211518); and in part by the Key Project of Science and Technology Innovation 2030 supported by the Ministry of Science and Technology of China under Grant 2018AAA0101301. This article was presented in part at the IEEE International Conference on Multimedia and Expo (ICME) 2018.
Keywords
- Constrained clustering
- convergence
- convex relaxation
- dissimilarity propagation (DP)
- Karush-Kuhn-Tucker (KKT)