Selective negative correlation learning algorithm for incremental learning

Abstract
Negative correlation learning (NCL) is a successful scheme for constructing neural network ensembles, and in batch learning mode it outperforms many other ensemble learning approaches. NCL has recently also been shown to be a potentially powerful approach to incremental learning, although its advantages have not yet been fully exploited. In this paper, we propose a selective NCL approach for incremental learning. In the proposed approach, when a new data set arrives, the previously trained ensemble is cloned and the cloned ensemble is trained on the new data set. The new ensemble is then combined with the previous ensemble, and a selection process prunes the combined ensemble back to a fixed size. Simulation results on several benchmark data sets show that the proposed algorithm outperforms two recent incremental learning algorithms based on NCL. © 2008 IEEE.
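The clone-train-merge-prune loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the linear ensemble members, the NCL penalty weight `lam`, the gradient form (treating the ensemble mean as a constant per step, a common simplification in the NCL literature), and the greedy backward-selection criterion are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_ncl(ensemble, X, y, lam=0.5, lr=0.05, epochs=300):
    """Gradient-train linear members with an NCL-style diversity penalty.

    Each member i minimises (f_i - y)^2 - lam * (f_i - f_bar)^2, where
    f_bar is the ensemble mean (held constant within each update step).
    """
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    for _ in range(epochs):
        preds = np.array([Xb @ w for w in ensemble])  # shape (M, n)
        f_bar = preds.mean(axis=0)
        for i in range(len(ensemble)):
            err = preds[i] - y      # accuracy term
            div = preds[i] - f_bar  # negative-correlation term
            grad = ((err - lam * div)[:, None] * Xb).mean(axis=0)
            ensemble[i] = ensemble[i] - lr * grad
    return ensemble

def ensemble_mse(ensemble, X, y):
    """Mean squared error of the simple-average ensemble prediction."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    pred = np.mean([Xb @ w for w in ensemble], axis=0)
    return float(np.mean((pred - y) ** 2))

def selective_ncl_update(ensemble, X_new, y_new, size, **kw):
    """Clone the ensemble, train the clone on the new data with NCL,
    merge old and new members, then greedily prune back to `size`."""
    clone = [w.copy() for w in ensemble]
    clone = train_ncl(clone, X_new, y_new, **kw)
    pool = ensemble + clone
    while len(pool) > size:
        # drop the member whose removal leaves the lowest ensemble error
        errs = [ensemble_mse(pool[:i] + pool[i + 1:], X_new, y_new)
                for i in range(len(pool))]
        pool.pop(int(np.argmin(errs)))
    return pool
```

The greedy leave-one-out pruning here is one plausible instantiation of the paper's "selection process"; the paper's own criterion may differ.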
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the International Joint Conference on Neural Networks |
| Pages | 2525-2530 |
| Number of pages | 6 |
| DOIs | |
| Publication status | Published - Jun 2008 |
| Externally published | Yes |