Negative correlation learning for classification ensembles

Shuo WANG, Huanhuan CHEN, Xin YAO

Research output: Conference paper (refereed), peer-reviewed

81 Citations (Scopus)


This paper proposes a new negative correlation learning (NCL) algorithm, called AdaBoost.NC, which introduces diversity explicitly into classification ensembles via a theoretically derived ambiguity term. All existing NCL algorithms, such as CELS [1] and NCCD [2], and their theoretical backgrounds were studied in the regression context; this paper focuses on classification. First, we study the ambiguity decomposition with the 0-1 error function, which differs from the decomposition proposed by Krogh et al. [3] and is applicable to both binary-class and multi-class problems. Then, to overcome the identified drawbacks of the existing algorithms, AdaBoost.NC is proposed, exploiting the ambiguity term in the decomposition to improve diversity. Comprehensive experiments are performed on a collection of benchmark data sets. The results show that AdaBoost.NC is a promising algorithm for classification, achieving better performance than standard AdaBoost and NCCD while consuming much less computation time than CELS. © 2010 IEEE.
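To make the idea concrete, the following is a minimal sketch of ambiguity-penalized boosting in the spirit described by the abstract: a standard AdaBoost loop whose sample-weight update is multiplied by a penalty factor derived from how much the ensemble members currently disagree on each training example. The decision-stump learner, the penalty form `(1 - amb)**lam`, and the strength parameter `lam` are illustrative assumptions, not the paper's exact update rules.

```python
import numpy as np

def stump_train(X, y, w):
    """Exhaustively pick the decision stump minimizing weighted 0-1 error."""
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = np.where(X[:, f] <= thr, pol, -pol)
                err = w[pred != y].sum()
                if err < best_err:
                    best, best_err = (f, thr, pol), err
    return best, best_err

def stump_predict(stump, X):
    f, thr, pol = stump
    return np.where(X[:, f] <= thr, pol, -pol)

def adaboost_nc(X, y, T=10, lam=2.0):
    """AdaBoost loop with an ambiguity-based weight penalty (sketch only)."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []          # (alpha, stump) pairs
    votes = np.zeros(n)    # running sum of member predictions on train set
    for t in range(1, T + 1):
        stump, err = stump_train(X, y, w)
        err = np.clip(err, 1e-10, None)
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(stump, X)
        ensemble.append((alpha, stump))
        votes += pred
        # Ambiguity: 0 when all t members agree on x_i, near 1 when they
        # split evenly. The penalty (1 - amb)**lam shrinks the boosting
        # weight of heavily disputed examples, redistributing emphasis --
        # an illustrative stand-in for the paper's penalty term.
        amb = 1.0 - np.abs(votes) / t
        w = w * np.exp(-alpha * y * pred) * (1.0 - amb) ** lam
        w = w / w.sum()
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.sign(score)
```

With `lam = 0` the penalty factor is always 1 and the loop reduces to plain AdaBoost with stumps, so `lam` directly controls how strongly per-example disagreement feeds back into the weight update.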
Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Print): 9781424469178
Publication status: Published - Jul 2010
Externally published: Yes

