On-line bagging negative correlation learning

Fernanda L. MINKU, Xin YAO

Research output: Conference paper (refereed), peer-reviewed



Negative Correlation Learning (NCL) has been shown to outperform other ensemble learning approaches in off-line mode. A key point in the success of NCL is that the learning of each ensemble member is influenced by the learning of the others, directly encouraging diversity. However, when applied to on-line learning, NCL presents the problem that part of the diversity has to be built in a priori, because the same sequence of training data is sent to all the ensemble members. This limits the choice of base models, and the use of neural network models better suited to the problem at hand may not be possible. This paper proposes a new method to perform on-line learning based on NCL and On-line Bagging. Like NCL, the method directly encourages diversity, but it sends a different sequence of training data to each base model, in the manner of on-line bagging. It therefore allows the use of deterministic base models such as Evolving Fuzzy Neural Networks (EFuNNs), which are specifically designed for on-line learning. Experiments show that on-line bagging NCL using EFuNNs attains better accuracy than NCL applied to on-line learning using on-line Multi-Layer Perceptrons (MLPs) on 4 out of 5 classification databases. Moreover, on-line bagging NCL using EFuNNs manages to attain accuracy similar to that of NCL using off-line MLPs. © 2008 IEEE.
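The core mechanism the abstract describes, sending a different effective training sequence to each base model, follows on-line bagging (Oza and Russell), in which each incoming example is presented to each base model k ~ Poisson(1) times. The sketch below illustrates only that sampling scheme under stated assumptions; the class names and the toy majority-class base learner are hypothetical and do not reproduce the paper's EFuNN base models or the NCL penalty term.

```python
import random


class MajorityClass:
    """Toy base learner (illustrative only): predicts the most
    frequently observed label, ignoring the input features."""

    def __init__(self):
        self.counts = {}

    def learn(self, x, y):
        self.counts[y] = self.counts.get(y, 0) + 1

    def predict(self, x):
        if not self.counts:
            return None
        return max(self.counts, key=self.counts.get)


class OnlineBaggingEnsemble:
    """Minimal sketch of on-line bagging: each new example is
    shown to base model i a random number of times drawn from
    Poisson(lambda=1), so every model sees a different effective
    training sequence even though the stream itself is shared."""

    def __init__(self, base_models):
        self.models = base_models

    @staticmethod
    def _poisson1(rng):
        # Knuth's multiplicative method for Poisson(1).
        threshold = 2.718281828459045 ** -1  # e^{-1}
        k, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    def train(self, x, y, rng=random):
        for model in self.models:
            # Present the example k ~ Poisson(1) times to this model.
            for _ in range(self._poisson1(rng)):
                model.learn(x, y)

    def predict(self, x):
        # Combine the base models by simple majority vote.
        votes = [m.predict(x) for m in self.models]
        return max(set(votes), key=votes.count)
```

In the paper's method, the base learners would additionally be trained with an NCL-style diversity penalty; here the diversity comes only from the Poisson resampling, which is what makes deterministic base models such as EFuNNs usable in an on-line ensemble.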
Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Number of pages: 8
Publication status: Published - Jun 2008
Externally published: Yes

