Feature selection using localized generalization error for supervised classification problems using RBFNN

Wing W. Y. Ng, Daniel S. Yeung, Michael Firth, Eric C. C. Tsang, Xi Zhao Wang

Research output: Journal Article (refereed), peer-reviewed

82 Citations (Scopus)


A pattern classification problem usually involves high-dimensional features that make the classifier complex and difficult to train. Without feature reduction, both training accuracy and generalization capability suffer. This paper proposes a novel hybrid filter-wrapper-type feature subset selection methodology using a localized generalization error model. The localized generalization error model for a radial basis function neural network bounds from above the generalization error for unseen samples located within a neighborhood of the training samples. Iteratively, the feature making the smallest contribution to the generalization error bound is removed. Moreover, the novel feature selection method is independent of the sample size and is computationally fast. The experimental results show that the proposed method consistently removes large percentages of features with statistically insignificant loss of testing accuracy for unseen samples. In the experiments on two of the datasets, classifiers built from feature subsets with 90% of the features removed by the proposed approach yield higher average testing accuracies than those trained on the full feature set. Finally, we corroborate the efficacy of the model by using it to predict corporate bankruptcies in the US.
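The iterative procedure the abstract describes (score each remaining feature's contribution to an error bound, then drop the least useful one) amounts to backward elimination. The paper's actual L-GEM bound for an RBFNN is not reproduced here; as a minimal sketch, `lsq_score` below is a hypothetical stand-in that uses a least-squares residual as the subset score, purely for illustration.

```python
import numpy as np

def select_features(X, y, score, n_keep):
    """Backward elimination: repeatedly drop the feature whose removal
    yields the lowest score. Here `score` plays the role of the paper's
    localized generalization error bound (lower is better)."""
    feats = list(range(X.shape[1]))
    while len(feats) > n_keep:
        # Evaluate every subset obtained by removing one feature,
        # and keep the subset with the smallest score.
        candidates = [[f for f in feats if f != r] for r in feats]
        scores = [score(X, y, c) for c in candidates]
        feats = candidates[int(np.argmin(scores))]
    return feats

def lsq_score(X, y, feats):
    # Stand-in score: residual of a least-squares fit on the selected
    # columns, a crude proxy for a generalization error bound.
    Xs = X[:, feats]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    return float(np.sum((y - Xs @ coef) ** 2))

# Toy data: the target depends only on features 0 and 2, so backward
# elimination should discard the three irrelevant features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2]
print(sorted(select_features(X, y, lsq_score, 2)))  # -> [0, 2]
```

The key property claimed in the abstract, that each iteration removes the feature contributing least to the bound, is what the `argmin` over one-feature-removed candidates captures; the bound itself is the component this sketch replaces with an assumed proxy.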
Original language: English
Pages (from-to): 3706-3719
Number of pages: 14
Journal: Pattern Recognition
Issue number: 12
Publication status: Published - 1 Dec 2008


  • Feature selection
  • Generalization error
  • Neural network


