This paper considers a generalized version of the relevance vector machine (RVM), a sparse Bayesian kernel machine for classification and ordinary regression. The generalized RVM (GRVM) follows the work on generalized linear models (GLMs), which naturally generalize the ordinary linear regression model while sharing a common approach to parameter estimation. GRVM inherits the advantages of the GLM, i.e., a unified model structure, a single training algorithm, and convenient task-specific model design. It also inherits the advantages of the RVM, i.e., probabilistic output, an extremely sparse solution, and automatic hyperparameter estimation. Moreover, GRVM extends the RVM to a wider range of learning tasks beyond classification and ordinary regression by assuming that the conditional output follows an exponential family distribution (EFD). Since the EFD makes exact Bayesian inference intractable, this paper adopts the Laplace approximation, a common approach in Bayesian inference, to solve this problem. Several task-specific models are then designed based on GRVM, including models for ordinary regression, count data regression, classification, ordinal regression, etc., and the relationship between GRVM and traditional RVM models is discussed. Finally, experimental results show the efficiency of the proposed GRVM model.
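To illustrate the inference step the abstract refers to, the following is a minimal sketch of the Laplace approximation applied to an RVM-style model with a non-Gaussian (here Bernoulli) likelihood and an independent Gaussian prior `w_i ~ N(0, 1/alpha_i)`. The function name and the Newton-iteration details are illustrative assumptions, not code from the paper; the GRVM itself additionally re-estimates the `alpha` hyperparameters, which is omitted here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_posterior(Phi, t, alpha, n_iter=50):
    """Laplace approximation to the weight posterior of a sparse Bayesian
    classifier: Bernoulli likelihood, prior w_i ~ N(0, 1/alpha_i).

    Finds the posterior mode w_map by Newton (IRLS) iterations on the
    log-posterior, then approximates the posterior as a Gaussian centered
    at w_map with covariance equal to the inverse negative Hessian there.
    """
    n, d = Phi.shape
    w = np.zeros(d)
    A = np.diag(alpha)                       # prior precision matrix
    for _ in range(n_iter):
        y = sigmoid(Phi @ w)                 # predicted probabilities
        g = Phi.T @ (t - y) - A @ w          # gradient of log-posterior
        R = y * (1.0 - y)                    # Bernoulli variances
        H = Phi.T @ (Phi * R[:, None]) + A   # negative Hessian
        step = np.linalg.solve(H, g)         # Newton step
        w = w + step
        if np.linalg.norm(step) < 1e-8:
            break
    Sigma = np.linalg.inv(H)                 # Gaussian posterior covariance
    return w, Sigma
```

For other EFD outputs (e.g. Poisson for count data), only the likelihood-dependent quantities `y`, `g`, and `R` change, which is the unified structure the GLM view provides.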
Title of host publication: 2017 Intelligent Systems Conference, IntelliSys 2017
Publication status: Published - Sept 2017
Bibliographical note: This work was supported in part by the National Natural Science Foundation of China under Grants 61672443 and 61402460 and in part by Hong Kong RGC General Research Fund 9042322 (CityU 11200116).
- Bayesian analysis
- exponential family distribution
- generalized linear models
- Laplace approximation
- relevance vector machine