Abstract
Negative correlation learning (NCL) is a technique that attempts to create an ensemble of neural networks whose outputs are accurate but negatively correlated. The motivation for the technique can be found in the bias-variance-covariance decomposition of an ensemble of learners' generalization error. NCL is also increasingly used in conjunction with an evolutionary process, which gives rise to the possibility of adapting the structures of the networks at the same time as learning the weights. This chapter examines the motivation and characteristics of the NCL algorithm. Some recent work relating to the implementation of NCL in a single-objective evolutionary framework for classification tasks is presented, and we examine the impact of two speciation techniques: implicit fitness sharing and an island model population structure. Such speciation techniques can have a detrimental effect on the ability of NCL to produce accurate and diverse ensembles and should therefore be chosen carefully. This chapter also provides an overview of other researchers' work with NCL and gives some promising future research directions. © 2008, IGI Global.
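As a rough illustration of the idea behind NCL, the commonly used formulation trains each network on its own squared error plus a correlation penalty measured against the simple-average ensemble output. The sketch below is illustrative only (the function name, λ default, and example values are assumptions, not taken from the chapter):

```python
import numpy as np

def ncl_errors(outputs, target, lam=0.5):
    """Per-network NCL error for a single training example.

    outputs: array of the M individual network outputs f_i
    target:  desired output d
    lam:     penalty strength lambda (lam = 0 recovers independent training)
    """
    f_ens = outputs.mean()  # simple-average ensemble output
    # Penalty p_i = (f_i - f_ens) * sum_{j != i} (f_j - f_ens),
    # which simplifies to -(f_i - f_ens)^2 because the deviations
    # from the ensemble mean sum to zero.
    penalty = -(outputs - f_ens) ** 2
    return 0.5 * (outputs - target) ** 2 + lam * penalty

# Example: three networks, one target value.
errs = ncl_errors(np.array([0.2, 0.4, 0.9]), target=1.0, lam=0.5)
```

A network that deviates from the ensemble mean has its error reduced by the penalty term, which is how the algorithm encourages negatively correlated (diverse) individual outputs rather than identical ones.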
Original language | English |
---|---|
Title of host publication | Pattern Recognition Technologies and Applications: Recent Advances |
Editors | Brijesh VERMA, Michael BLUMENSTEIN |
Publisher | IGI Global |
Chapter | 16 |
Pages | 344-369 |
Number of pages | 26 |
ISBN (Electronic) | 9781599048093 |
ISBN (Print) | 9781599048079 |
DOIs | |
Publication status | Published - 2008 |
Externally published | Yes |