Abstract
The EENCL algorithm [1] has been proposed as a method for designing neural network ensembles for classification tasks, combining global evolution with a local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. To better understand the success of EENCL, this work replaces speciation by fitness sharing with an island model population structure. We find that by providing a population structure that allows diversity to emerge, rather than enforcing diversity through a similarity penalty in the fitness evaluation, we are able to produce more accurate ensembles; a more diverse population does not necessarily lead to a more accurate ensemble. © 2006 IEEE.
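For readers unfamiliar with the NCL term mentioned above, the sketch below is a minimal NumPy illustration of the per-member error that Negative Correlation Learning minimises (squared error plus a penalty that rewards disagreement with the ensemble mean). Function and parameter names such as `ncl_errors` and `lam` are illustrative, not taken from the paper, and the simple-average combiner is an assumption.

```python
import numpy as np

def ncl_errors(member_outputs, targets, lam=0.5):
    """Per-member NCL error: squared error plus a negative-correlation penalty.

    member_outputs: array of shape (M, N), outputs of M members on N samples.
    targets:        array of shape (N,), desired outputs.
    lam:            penalty strength; lam = 0 recovers independent training.
    """
    ensemble = member_outputs.mean(axis=0)      # simple-average ensemble output
    mse = (member_outputs - targets) ** 2       # per-member squared error
    # With a simple average, the NCL penalty p_i = (F_i - F) * sum_{j != i}(F_j - F)
    # reduces to -(F_i - F)^2, so members are pushed away from the ensemble mean.
    penalty = -(member_outputs - ensemble) ** 2
    return (mse + lam * penalty).mean(axis=1)   # mean combined error per member

# Tiny usage example with random data (purely illustrative).
rng = np.random.default_rng(0)
outputs = rng.random((4, 10))   # 4 hypothetical ensemble members, 10 samples
targets = rng.random(10)
print(ncl_errors(outputs, targets))
```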
| Original language | English |
| --- | --- |
| Title of host publication | 2006 IEEE Congress on Evolutionary Computation, CEC 2006 |
| Pages | 3317-3321 |
| Number of pages | 5 |
| Publication status | Published - 2006 |
| Externally published | Yes |