Abstract
This paper presents a new algorithm for designing neural network ensembles for noisy classification problems. The idea behind this new algorithm is to encourage different individual networks in an ensemble to learn different parts or aspects of the training data, so that the ensemble as a whole learns the training data better. Negatively correlated neural networks are trained with a novel correlation penalty term in the error function to encourage such specialization. In our algorithm, individual networks are trained simultaneously rather than independently or sequentially. This provides an opportunity for different networks to interact with each other and to specialize. Experiments on two real-world problems demonstrate that the new algorithm can produce neural network ensembles with good generalization ability.
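The penalty-based training described in the abstract is commonly written per pattern as E_i = (1/2)(F_i - d)^2 + lambda * p_i with p_i = -(F_i - F_ens)^2, so each network is penalized for agreeing too closely with the ensemble mean while all networks are updated together. The following is a minimal sketch of that idea in Python/NumPy, assuming the widely cited form of the negative-correlation gradient (F_i - d) - lambda * (F_i - F_ens); the toy data, network sizes, and the values of lambda, the learning rate, and the epoch count are illustrative assumptions, not the paper's experimental setup.

```python
# Minimal sketch of negative-correlation training for a small MLP ensemble.
# Assumptions: penalty p_i = -(F_i - F_ens)^2 and gradient F_i - d - lam*(F_i - F_ens);
# toy data and hyperparameters are illustrative, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

def make_toy_data(n=200, noise=0.3):
    """Two noisy Gaussian blobs in 2-D, labels in {0, 1}."""
    x0 = rng.normal(loc=-1.0, scale=1.0 + noise, size=(n // 2, 2))
    x1 = rng.normal(loc=+1.0, scale=1.0 + noise, size=(n // 2, 2))
    X = np.vstack([x0, x1])
    d = np.concatenate([np.zeros(n // 2), np.ones(n // 2)])
    return X, d

class MLP:
    """One-hidden-layer network with a sigmoid output in [0, 1]."""
    def __init__(self, n_in, n_hidden):
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.5, size=n_hidden)
        self.b2 = 0.0

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)            # hidden activations
        return 1.0 / (1.0 + np.exp(-(self.h @ self.W2 + self.b2)))

    def backward(self, X, delta, lr):
        """delta = dE/dF at the output; backpropagate and take one gradient step."""
        F = self.forward(X)
        dz2 = delta * F * (1.0 - F)                        # through output sigmoid
        gW2 = self.h.T @ dz2 / len(X)
        gb2 = dz2.mean()
        dh = np.outer(dz2, self.W2) * (1.0 - self.h ** 2)  # through tanh
        gW1 = X.T @ dh / len(X)
        gb1 = dh.mean(axis=0)
        self.W2 -= lr * gW2; self.b2 -= lr * gb2
        self.W1 -= lr * gW1; self.b1 -= lr * gb1

def train_ncl(X, d, n_nets=4, n_hidden=8, lam=0.5, lr=0.5, epochs=500):
    """Train all networks together; each sees the current ensemble mean."""
    nets = [MLP(X.shape[1], n_hidden) for _ in range(n_nets)]
    for _ in range(epochs):
        outputs = np.stack([net.forward(X) for net in nets])  # (n_nets, n)
        F_ens = outputs.mean(axis=0)                          # ensemble output
        for i, net in enumerate(nets):
            # Negative-correlation gradient: error term minus the penalty term,
            # which pushes each network away from the ensemble mean.
            delta = (outputs[i] - d) - lam * (outputs[i] - F_ens)
            net.backward(X, delta, lr)
    return nets

if __name__ == "__main__":
    X, d = make_toy_data()
    nets = train_ncl(X, d)
    F_ens = np.mean([net.forward(X) for net in nets], axis=0)
    print(f"ensemble training accuracy: {np.mean((F_ens > 0.5) == d):.3f}")
```

With lam = 0 the networks reduce to an independently trained ensemble; increasing lam trades individual accuracy for diversity among the members, which is the interaction-and-specialization effect the abstract refers to.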
| Original language | English |
| --- | --- |
| Pages (from-to) | 255-259 |
| Number of pages | 5 |
| Journal | Artificial Life and Robotics |
| Volume | 3 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - Dec 1999 |
| Externally published | Yes |
Keywords
- Neural network ensembles
- Learning
- Generalization
- Bias
- Variance