Abstract
The EENCL algorithm [1] automatically designs neural network ensembles for classification, combining global evolution with a local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. This paper analyses EENCL and finds that NCL is not an essential component of the algorithm, whereas implicit fitness sharing is. Furthermore, we find that a local search based on independent training is equally effective in terms of both accuracy and diversity. We propose that NCL is unnecessary in EENCL for the tested datasets, and that complementary diversity in local search and global evolution may lead to better ensembles.
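For context, a minimal sketch of the Negative Correlation Learning penalty as it is commonly formulated in the literature (this is not code from the paper; the function name, the simplified single-output regression setting, and the penalty strength `lam` are illustrative assumptions):

```python
import numpy as np

def ncl_gradients(outputs, target, lam=0.5):
    """Per-network output-error gradients under Negative Correlation Learning.

    outputs : array of shape (M,), the outputs F_i of the M ensemble members
              for one training pattern (simplified single-output setting).
    target  : scalar desired output d for that pattern.
    lam     : penalty strength lambda, typically in [0, 1].

    Each member i minimises
        E_i = 0.5 * (F_i - d)**2 + lam * (F_i - F_bar) * sum_{j != i} (F_j - F_bar),
    where F_bar is the ensemble mean. Since sum_{j != i}(F_j - F_bar) = -(F_i - F_bar),
    the gradient with respect to F_i is (F_i - d) - lam * (F_i - F_bar): the penalty
    pushes each member away from the ensemble mean, encouraging diversity.
    """
    f_bar = outputs.mean()
    return (outputs - target) - lam * (outputs - f_bar)

# Example: three networks, one pattern with target 1.0.
outs = np.array([0.8, 1.2, 0.5])
print(ncl_gradients(outs, 1.0, lam=0.5))
```

With `lam=0` this reduces to independent gradient-descent training of each member, which is the alternative local search the paper compares against.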
| Original language | English |
| --- | --- |
| Title of host publication | ESANN 2006 Proceedings - European Symposium on Artificial Neural Networks |
| Pages | 431-436 |
| Number of pages | 6 |
| Publication status | Published - 2006 |
| Externally published | Yes |
Funding
This work is partially funded by EPSRC and Thales Research & Technology (UK) Ltd.