Abstract
An ensemble is a group of learners that work together as a committee to solve a problem. However, existing ensemble training algorithms sometimes generate unnecessarily large ensembles, which consume extra computational resources and may degrade performance. Ensemble pruning algorithms aim to find a good subset of ensemble members that constitutes a small ensemble, saving computational resources while performing as well as, or better than, the non-pruned ensemble. This paper introduces a probabilistic ensemble pruning algorithm that prunes a large ensemble by choosing a set of "sparse" combination weights, most of which are zero. To obtain sparse combination weights while satisfying the non-negativity constraint on the weights, a left-truncated, non-negative Gaussian prior is adopted over every combination weight. The expectation-maximization (EM) algorithm is employed to obtain the maximum a posteriori (MAP) estimate of the weight vector. Four benchmark regression problems and four benchmark classification problems are employed to demonstrate the effectiveness of the method. © 2006 IEEE.
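The core idea of the abstract can be illustrated with a minimal sketch. The code below is not the paper's EM/MAP algorithm under a left-truncated Gaussian prior; it is a simplified stand-in (non-negative least squares fitted by projected gradient descent, followed by thresholding) that shows the effect the paper describes: redundant or poor ensemble members receive zero combination weight and can be dropped. The toy task, member count, and learning rate are all illustrative assumptions.

```python
# Toy sketch of ensemble pruning via sparse, non-negative combination
# weights. NOT the paper's EM/MAP estimation; a simplified illustration.

# 1-D regression task: target y = 2x + 1 on a grid of points.
xs = [k / 10.0 for k in range(20)]
ys = [2.0 * x + 1.0 for x in xs]

# A deliberately redundant "ensemble": six accurate members and four
# useless constant predictors.
members = [lambda x: 2.0 * x + 1.0] * 6 + [lambda x: 10.0] * 4
preds = [[m(x) for x in xs] for m in members]

n, m = len(xs), len(members)
w = [1.0 / m] * m          # start from uniform combination weights
lr = 0.002                 # step size, small enough for stability here

for _ in range(2000):
    # current ensemble output at every point
    out = [sum(w[k] * preds[k][i] for k in range(m)) for i in range(n)]
    # gradient of mean-squared error with respect to each weight
    grad = [sum((out[i] - ys[i]) * preds[k][i] for i in range(n)) / n
            for k in range(m)]
    # gradient step, then projection onto the non-negative orthant
    w = [max(0.0, w[k] - lr * grad[k]) for k in range(m)]

# Prune: members whose weight is (numerically) zero are removed.
kept = [k for k in range(m) if w[k] > 1e-3]
print(len(members), "->", len(kept))  # the constant predictors are dropped
```

With the redundant members driven to zero weight, the pruned ensemble keeps only the six accurate members while reproducing the full ensemble's fit, which is the behavior the sparsity-inducing prior is designed to achieve in the paper's probabilistic setting.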
| Original language | English |
|---|---|
| Title of host publication | Proceedings - IEEE International Conference on Data Mining, ICDM |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 878-882 |
| Number of pages | 5 |
| ISBN (Print) | 9780769527024 |
| DOIs | |
| Publication status | Published - 2006 |
| Externally published | Yes |