TY - GEN
T1 - A comparative experimental study of feature-weight learning approaches
AU - XING, Hong-Jie
AU - WANG, Xi-Zhao
AU - HA, Ming-Hu
N1 - This work is partly supported by the National Natural Science Foundation of China (No. 60903089; 61073121), the China Postdoctoral Science Foundation (No. 20080440820), the Natural Science Foundation of Hebei Province (No. F2009000231), the Postdoctoral Science Foundation of Hebei University, and the Foundation of Hebei University (No. 2008123).
PY - 2011
Y1 - 2011
N2 - Feature-weight learning (FWL) methods can be used to determine the importance degree of each feature for constructing clusters or classifiers. In this paper, four FWL methods for unsupervised learning and two for supervised learning are surveyed. The FWL-based models, i.e. feature-weighted fuzzy c-means (FWFCM) and feature-weighted support vector machine (FWSVM), are also reviewed. Through carefully selected experiments, we find that FWFCM and FWSVM may improve the performance of the corresponding traditional fuzzy c-means (FCM) and support vector machine (SVM), respectively. Moreover, the computational cost of FWL-Hung is the lowest for unsupervised learning, even though it may produce unsuitable feature weights in some extreme cases, while FWL-MI is the most effective for supervised learning.
KW - Feature weighting
KW - Feature-weight learning
KW - Feature-weighted fuzzy c-means
KW - Feature-weighted support vector machine
UR - http://www.scopus.com/inward/record.url?scp=83755194773&partnerID=8YFLogxK
U2 - 10.1109/ICSMC.2011.6084211
DO - 10.1109/ICSMC.2011.6084211
M3 - Conference paper (refereed)
AN - SCOPUS:83755194773
SN - 9781457706530
T3 - IEEE International Conference on Systems, Man and Cybernetics
SP - 3500
EP - 3505
BT - Proceedings : 2011 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2011 - Conference Digest
PB - IEEE
T2 - 2011 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2011
Y2 - 9 October 2011 through 12 October 2011
ER -