Implementing negative correlation learning in evolutionary ensembles with suitable speciation techniques

Peter DUELL, Xin YAO

Research output: Book Chapter (peer-reviewed)


Negative correlation learning (NCL) is a technique that attempts to create an ensemble of neural networks whose outputs are accurate but negatively correlated. The motivation for such a technique can be found in the bias-variance-covariance decomposition of an ensemble's generalization error. NCL is also increasingly used in conjunction with an evolutionary process, which gives rise to the possibility of adapting the structures of the networks at the same time as learning the weights. This chapter examines the motivation and characteristics of the NCL algorithm. Some recent work relating to the implementation of NCL in a single-objective evolutionary framework for classification tasks is presented, and we examine the impact of two speciation techniques: implicit fitness sharing and an island model population structure. An unsuitable choice of speciation technique can have a detrimental effect on the ability of NCL to produce accurate and diverse ensembles, so the technique should be chosen carefully. This chapter also provides an overview of other researchers' work with NCL and gives some promising future research directions. © 2008, IGI Global.
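As a rough illustration of the idea behind NCL (not the chapter's specific evolutionary implementation), the standard formulation trains each network i on a cost that combines its own squared error with a correlation penalty p_i = (F_i - F̄) Σ_{j≠i}(F_j - F̄) = -(F_i - F̄)², weighted by a parameter λ. A minimal sketch, assuming the widely used Liu-and-Yao-style per-sample error:

```python
import numpy as np

def ncl_errors(outputs, target, lam=0.5):
    """Per-network NCL training errors for a single sample.

    outputs : iterable of the M ensemble members' outputs F_i
    target  : the desired output d
    lam     : penalty strength lambda (lam = 0 recovers independent MSE)
    """
    outputs = np.asarray(outputs, dtype=float)
    f_bar = outputs.mean()  # ensemble (simple-average) output
    # Penalty p_i = (F_i - f_bar) * sum_{j != i} (F_j - f_bar)
    # which simplifies to -(F_i - f_bar)^2 since the deviations sum to zero.
    penalties = -(outputs - f_bar) ** 2
    mse = 0.5 * (outputs - target) ** 2
    return mse + lam * penalties
```

With λ > 0 a network is rewarded for deviating from the ensemble mean, which is what encourages negatively correlated (diverse) members; in the evolutionary setting discussed in the chapter, this error would be minimized while the networks' structures evolve.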
Original language: English
Title of host publication: Pattern Recognition Technologies and Applications: Recent Advances
Editors: Brijesh VERMA, Michael BLUMENSTEIN
Publisher: IGI Global
Number of pages: 26
ISBN (Electronic): 9781599048093
ISBN (Print): 9781599048079
Publication status: Published - 2008
Externally published: Yes

