A gradient-based forward greedy algorithm for sparse Gaussian process regression

Ping SUN, Xin YAO

Research output: Book chapter · Research · peer-reviewed

1 Citation (Scopus)


In this chapter, we present a gradient-based forward greedy method for sparse approximation of the Bayesian Gaussian Process Regression (GPR) model. Different from previous work, which is mostly based on various basis vector selection strategies, we propose to construct, rather than select, a new basis vector at each iterative step. This idea is motivated by the well-known gradient boosting approach. The resulting algorithm, built on gradient-based optimisation packages, incurs computational cost and memory requirements similar to other leading sparse GPR algorithms. Moreover, the proposed method is a general framework that can be extended to other popular kernel machines, including Kernel Logistic Regression (KLR) and Support Vector Machines (SVMs). Numerical experiments on a wide range of datasets demonstrate the superiority of our algorithm in terms of generalisation performance.
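A minimal sketch of the construct-rather-than-select idea described above: at each forward greedy step, a new basis (inducing) vector is obtained by gradient-based optimisation in input space instead of being picked from the training set. The RBF kernel, the subset-of-regressors-style residual objective, and the use of L-BFGS-B are all illustrative assumptions, not the chapter's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def rbf(A, B, ls=1.0):
    """Squared-exponential kernel between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def fit_sparse_gpr(X, y, n_basis=5, noise=1e-2):
    """Forward greedy sparse GPR: construct each basis vector by
    gradient-based optimisation (an assumed surrogate objective)."""
    Z = []                                  # constructed basis vectors
    init = np.random.default_rng(1)         # deterministic initialisation
    for _ in range(n_basis):
        def neg_fit(z):
            Zc = np.array(Z + [z])
            Kzz = rbf(Zc, Zc) + 1e-8 * np.eye(len(Zc))
            Kxz = rbf(X, Zc)
            # Residual of regularised projection of y onto the span of the
            # basis kernels -- an assumed stand-in for the chapter's
            # marginal-likelihood-based criterion.
            w = np.linalg.solve(Kxz.T @ Kxz + noise * Kzz, Kxz.T @ y)
            r = y - Kxz @ w
            return r @ r
        # Initialise the candidate at a training input, then *construct*
        # the basis vector by refining it with a gradient-based optimiser.
        z0 = X[init.integers(len(X))]
        Z.append(minimize(neg_fit, z0, method="L-BFGS-B").x)
    Zc = np.array(Z)
    Kzz = rbf(Zc, Zc) + 1e-8 * np.eye(len(Zc))
    Kxz = rbf(X, Zc)
    w = np.linalg.solve(Kxz.T @ Kxz + noise * Kzz, Kxz.T @ y)
    return Zc, w

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (60, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(60)
Z, w = fit_sparse_gpr(X, y)
pred = rbf(X, Z) @ w
print(round(float(np.mean((pred - y) ** 2)), 4))
```

Because each basis vector is free to move anywhere in input space, a small constructed basis can match the fit of a larger selected one; the per-step cost is one low-dimensional gradient-based optimisation, in line with the "similar computational cost" claim in the abstract.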
Original language: English
Title of host publication: Trends in Neural Computation
Editors: Ke CHEN, Lipo WANG
Number of pages: 23
ISBN (Electronic): 9783540361220
ISBN (Print): 9783540361213
Publication status: Published - 2007
Externally published: Yes

Publication series

Name: Studies in Computational Intelligence
ISSN (Print): 1860-949X
ISSN (Electronic): 1860-9503


Keywords

  • Gaussian process regression
  • sparse approximation
  • sequential forward greedy algorithm
  • basis vector selection
  • basis vector construction
  • gradient-based optimisation
  • gradient boosting


