Abstract
This paper proposes a general boosting framework for combining multiple kernel models in both classification and regression settings. Our approach builds on gradient boosting, adds a new regularization scheme, and aims to reduce the cubic complexity of training kernel models. We focus mainly on using the proposed boosting framework to combine kernel ridge regression (KRR) models for regression tasks. Numerical experiments on four large-scale data sets show that boosting multiple small KRR models is superior to training a single large KRR model, both in generalization performance and in computational requirements.
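To make the boosting-of-small-KRR-models idea concrete, the sketch below fits an additive ensemble by gradient boosting under squared loss, training each stage on a small random subsample so that the per-stage cost is cubic in the subsample size rather than in the full data size. This is a minimal illustrative sketch, not the paper's exact algorithm: the `boost_krr` helper, the subsample size, shrinkage rate, and RBF kernel settings are all assumptions, and the paper's new regularization scheme is not reproduced here.

```python
# Minimal sketch: gradient-boosted kernel ridge regression for squared loss.
# Each stage fits a small KRR model to the current residuals; the subsample
# size, shrinkage, and kernel parameters are illustrative assumptions.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def boost_krr(X, y, n_stages=20, subsample=500, shrinkage=0.5, seed=None):
    """Fit an additive ensemble of small KRR models by gradient boosting."""
    rng = np.random.default_rng(seed)
    models = []
    residual = y.astype(float).copy()
    for _ in range(n_stages):
        # Train each stage on a small random subset: cost O(subsample^3)
        # per stage instead of O(n^3) for one KRR model on all n points.
        idx = rng.choice(len(X), size=min(subsample, len(X)), replace=False)
        m = KernelRidge(alpha=1.0, kernel="rbf", gamma=0.1)
        m.fit(X[idx], residual[idx])
        # For squared loss, the negative gradient is the residual, so each
        # shrunken stage output is subtracted from the running residual.
        residual -= shrinkage * m.predict(X)
        models.append(m)
    return models

def predict(models, X, shrinkage=0.5):
    """Sum the shrunken stage predictions, matching the training update."""
    return shrinkage * sum(m.predict(X) for m in models)
```

Shrinkage plays the usual boosting role of regularizing the ensemble; with m stages of subsample size s, training costs on the order of m·s³ operations, versus n³ for a single KRR fit on all n points.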
Original language | English
---|---
Title of host publication | Proceedings - IEEE International Conference on Data Mining, ICDM
Pages | 583-591
Number of pages | 9
DOIs |
Publication status | Published - Dec 2006
Externally published | Yes