Negative correlation ensemble learning for ordinal regression

Francisco FERNÁNDEZ-NAVARRO, Pedro Antonio GUTIÉRREZ, César HERVÁS-MARTÍNEZ, Xin YAO

Research output: Journal Publications › Journal Article (refereed) › peer-review

33 Citations (Scopus)

Abstract

In this paper, two neural network threshold ensemble models are proposed for ordinal regression problems. In the first ensemble method, the thresholds are fixed a priori and are not modified during training. The second method treats the thresholds of each ensemble member as free parameters, allowing their modification during the training process. This is achieved through a reformulation of these tunable thresholds that avoids the ordering constraints they must fulfill in the ordinal regression problem. During training, the diversity existing in the different projections generated by each member is taken into account when updating the parameters. This diversity is promoted explicitly by a diversity-encouraging error function, extending the well-known negative correlation learning framework to the area of ordinal regression and inheriting many of its good properties. Experimental results demonstrate that the proposed algorithms achieve competitive generalization performance across four ordinal regression metrics. © 2012 IEEE.
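The core mechanism described in the abstract, a diversity-encouraging error function in the style of negative correlation learning (NCL), can be illustrated with a minimal sketch. The example below applies the standard NCL penalty to a toy ensemble of random-feature regressors, not the paper's ordinal threshold model; the data, features, learning rate, and penalty strength λ are all illustrative assumptions. Each member i minimizes its squared error minus λ times its squared deviation from the ensemble mean, which pushes members apart while the ensemble average still fits the target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical, for illustration only)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)

M = 5      # ensemble size
lam = 0.5  # NCL penalty strength lambda (lam = 0 recovers independent training)
lr = 0.1   # gradient-descent learning rate

# Each member is a linear model on shared fixed RBF features: f_i(x) = w_i . phi(x)
centers = rng.uniform(-1, 1, size=(10, 1))
W = rng.normal(scale=0.1, size=(M, 10))

def phi(X):
    # Fixed radial-basis features shared by all ensemble members
    d = X[:, None, :] - centers[None, :, :]
    return np.exp(-(d ** 2).sum(-1) / 0.5)

P = phi(X)  # feature matrix, shape (N, 10)
for step in range(500):
    F = P @ W.T                          # member outputs, shape (N, M)
    f_bar = F.mean(axis=1, keepdims=True)
    # Gradient of the NCL objective per member:
    #   d/df_i [ (f_i - y)^2 / 2 - lam * (f_i - f_bar)^2 / 2 ]
    #   = (f_i - y) - lam * (f_i - f_bar)
    grad_out = (F - y[:, None]) - lam * (F - f_bar)
    W -= lr * (grad_out.T @ P) / len(X)

ensemble_pred = (P @ W.T).mean(axis=1)
mse = ((ensemble_pred - y) ** 2).mean()
```

With λ > 0 the penalty term rewards members whose errors are negatively correlated, so the ensemble average cancels individual errors; the paper's contribution is extending this idea to the projections and thresholds of ordinal regression models rather than plain regression outputs.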
Original language: English
Article number: 6548028
Pages (from-to): 1836-1849
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 24
Issue number: 11
Early online date: 27 Jun 2013
Publication status: Published - Nov 2013
Externally published: Yes

Keywords

  • Negative correlation learning (NCL)
  • neural network ensembles
  • ordinal regression
  • threshold methods
