Diversity exploration and negative correlation learning on imbalanced data sets


Research output: Conference paper in proceedings (refereed, peer-reviewed)

23 Citations (Scopus)


Class imbalance learning is an important research area in machine learning, concerning data sets in which instances of some classes heavily outnumber those of other classes. This skewed class distribution causes performance degradation. Several ensemble solutions have been proposed for the class imbalance problem. Diversity, which describes the degree to which classifiers make different decisions, has been shown to be an influential factor in ensemble learning. However, none of those proposed solutions explores the impact of diversity on imbalanced data sets. In addition, most of them rely on re-sampling techniques to rebalance the class distribution, and over-sampling usually causes overfitting (high generalisation error). This paper investigates whether diversity can relieve this problem by using the negative correlation learning (NCL) model, which encourages diversity explicitly by adding a penalty term to the error function of the neural networks. A variant of NCL, NCLCost, is also proposed. Our study shows that diversity has a direct impact on the measure of recall and is also a factor in the reduction of F-measure. In addition, although NCL-based models with extreme settings do not produce better recall values on the minority class than SMOTEBoost [1], they achieve slightly better F-measure and G-mean than both independent ANNs and SMOTEBoost, and better recall than independent ANNs. © 2009 IEEE.
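The penalty term the abstract refers to can be sketched as follows. This is a minimal illustration assuming the standard NCL formulation for a regression-style ensemble; the function name and the default penalty coefficient are illustrative choices, not taken from the paper:

```python
# Sketch of the negative correlation learning (NCL) penalty. For ensemble
# member i with output f_i, target y, and ensemble mean f_bar, NCL trains on
#   e_i = 1/2 * (f_i - y)**2 + lam * p_i,
# where p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar) = -(f_i - f_bar)**2.
# The negative penalty rewards members whose outputs deviate from the
# ensemble mean, which is how NCL encourages diversity explicitly.

def ncl_errors(outputs, target, lam=0.5):
    """Per-member NCL error for one training example.

    outputs: list of member outputs f_1..f_M
    target:  scalar target y
    lam:     penalty coefficient; lam = 0 recovers independent training
    """
    f_bar = sum(outputs) / len(outputs)
    return [0.5 * (f - target) ** 2 - lam * (f - f_bar) ** 2 for f in outputs]

# Larger lam trades individual accuracy for ensemble diversity.
errors = ncl_errors([0.2, 0.5, 0.9], target=1.0)
```

With lam set to 0 each member minimises its own squared error independently; increasing lam lowers the error of members that disagree with the ensemble mean, pushing the networks toward diverse decisions.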
Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Number of pages: 8
Publication status: Published - Jun 2009
Externally published: Yes


