An off-center technique: Learning a feature transformation to improve the performance of clustering and classification

Dasen YAN, Xinlei ZHOU, Xizhao WANG, Ran WANG*

*Corresponding author for this work

Research output: Journal Publications › Journal Article (refereed) › peer-review

8 Citations (Scopus)

Abstract

This paper proposes a feature transformation method, named weight-matrix learning (WML), to improve the performance of clustering and classification. A feed-forward neural network is specifically designed for WML; it learns the optimal weights by minimizing an objective function similar to cross-entropy, and training is performed via batch or stochastic gradient descent. The proposed feature transformation is linear and is a non-trivial extension of an earlier technique named feature-weight learning (FWL). Essentially, WML can be viewed as a technique for moving pairwise similarities away from 0.5: it pulls together samples whose similarity is larger than 0.5 and pushes apart samples whose similarity is lower than 0.5. From this perspective, WML is identified as an off-center technique with 0.5-similarity as the center. It is validated, both theoretically and experimentally, that WML can significantly improve the performance of clustering algorithms such as k-means and enhance the performance of classification algorithms such as the random weight neural network.
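
The paper's exact network design and objective are not reproduced on this page; the sketch below is only a minimal illustration of the off-center idea described in the abstract. It assumes a Gaussian-kernel pairwise similarity and uses thresholded initial similarities (above or below 0.5) as targets for a binary cross-entropy loss on the similarities of the linearly transformed samples. The function names (`wml_sketch`, `pairwise_sq_dists`), the kernel choice, and all hyperparameters are illustrative assumptions, not taken from the paper.

```python
import torch

def pairwise_sq_dists(Z):
    """All-pairs squared Euclidean distances, shape (n, n)."""
    return (Z.unsqueeze(1) - Z.unsqueeze(0)).pow(2).sum(-1)

def wml_sketch(X, epochs=300, lr=0.05, sigma=1.0):
    """Hypothetical stand-in for weight-matrix learning: fit a linear
    transform W so that pairs whose original similarity exceeds 0.5 are
    pulled together and the remaining pairs are pushed apart."""
    X = torch.as_tensor(X, dtype=torch.float32)
    d = X.shape[1]
    with torch.no_grad():
        # Gaussian-kernel similarities of the raw samples (an assumption).
        S0 = torch.exp(-pairwise_sq_dists(X) / (2 * sigma ** 2))
        target = (S0 > 0.5).float()       # 0.5-similarity acts as the "center"
    W = torch.eye(d, requires_grad=True)  # start from the identity transform
    opt = torch.optim.SGD([W], lr=lr)
    for _ in range(epochs):
        Z = X @ W                         # linear feature transformation
        S = torch.exp(-pairwise_sq_dists(Z) / (2 * sigma ** 2))
        S = S.clamp(1e-6, 1 - 1e-6)       # keep the loss numerically stable
        # Cross-entropy-style objective on pairwise similarities.
        loss = torch.nn.functional.binary_cross_entropy(S, target)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return W.detach()
```

Under these assumptions, the learned matrix would be applied as `Z = X @ wml_sketch(X)` before running a downstream algorithm such as k-means on `Z`.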

Original language: English
Pages (from-to): 635-651
Number of pages: 17
Journal: Information Sciences
Volume: 503
Early online date: 11 Jul 2019
Publication status: Published - Nov 2019
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2019 Elsevier Inc.

Keywords

  • Feature transformation
  • Feed-forward neural network
  • Similarity matrix
