Abstract
Class imbalance learning is an important research area in machine learning, concerning problems where instances in some classes heavily outnumber the instances in other classes. This imbalanced class distribution causes performance degradation. Several ensemble solutions have been proposed for the class imbalance problem. Diversity, which describes the degree to which the classifiers in an ensemble make different decisions, has proven to be an influential factor in ensemble learning. However, none of the proposed solutions explore the impact of diversity on imbalanced data sets. In addition, most of them rely on re-sampling techniques to rebalance the class distribution, and over-sampling usually causes overfitting (high generalisation error). This paper investigates whether diversity can alleviate these problems by using the negative correlation learning (NCL) model, which encourages diversity explicitly by adding a penalty term to the error function of the neural networks. A variant of the NCL model, NCLCost, is also proposed. Our study shows that diversity has a direct impact on the recall measure and is also a factor in the reduction of F-measure. In addition, although NCL-based models with extreme settings do not produce better recall values on the minority class than SMOTEBoost [1], they achieve slightly better F-measure and G-mean than both independent ANNs and SMOTEBoost, and better recall than independent ANNs. © 2009 IEEE.
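For readers unfamiliar with the penalty term mentioned above, the sketch below illustrates the standard NCL error function (commonly attributed to Liu and Yao) in Python. The function name, the NumPy usage, and the regression-style squared-error setting are illustrative assumptions and are not taken from this paper.

```python
import numpy as np

def ncl_penalised_errors(preds, target, lam=0.5):
    """Per-member NCL error on a single training example.

    preds  : array of shape (M,), the outputs of the M ensemble members
    target : scalar ground-truth value
    lam    : penalty strength (lambda); lam = 0 recovers independent training

    Illustrative sketch of the standard NCL error function; all names
    here are assumptions, not taken from this paper.
    """
    f_bar = preds.mean()                  # simple-average ensemble output
    mse = 0.5 * (preds - target) ** 2     # individual squared errors
    # Penalty p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar)
    #             = -(f_i - f_bar)^2, since the deviations sum to zero.
    penalty = -((preds - f_bar) ** 2)
    return mse + lam * penalty
```

Setting `lam = 0` trains each network independently, while larger values reward members for deviating from the ensemble mean, which is the explicit diversity mechanism the abstract refers to.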
Original language | English
---|---
Title of host publication | Proceedings of the International Joint Conference on Neural Networks
Pages | 3259-3266
Number of pages | 8
DOIs |
Publication status | Published - Jun 2009
Externally published | Yes