Abstract
We present novel, efficient, model-based kernels for time series data rooted in the reservoir computing framework. The kernels are implemented by fitting reservoir models, all sharing the same fixed, deterministically constructed state-transition part, to individual time series. The proposed kernels naturally handle time series of different lengths without requiring a parametric model class to be specified for the time series. Compared with most time series kernels, our kernels are computationally efficient. We show how the model distances used in the kernel can be calculated analytically or estimated efficiently. Experimental results on synthetic and benchmark time series classification tasks confirm the effectiveness of the proposed kernels in terms of both generalization accuracy and computational speed. This paper also investigates on-line reservoir kernel construction for extremely long time series. Copyright © 2013 ACM.
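The idea in the abstract can be illustrated with a minimal sketch: a single fixed reservoir drives every time series, a linear readout (the per-series "model") is fitted by ridge regression for one-step-ahead prediction, and a kernel is defined from the distance between fitted readouts. All sizes, scalings, and function names below are illustrative assumptions, and the RBF-on-weight-distance kernel is a simple stand-in for the paper's analytic model distance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared, fixed, deterministically scaled reservoir (sizes are assumptions).
N = 30  # reservoir size
W = rng.uniform(-1.0, 1.0, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius below 1 for stability
w_in = rng.uniform(-0.5, 0.5, N)           # input weights

def readout_weights(series, ridge=1e-4):
    """Drive the shared reservoir with a scalar series and fit a linear
    readout predicting the next value; the readout represents the series."""
    x = np.zeros(N)
    states = []
    for u in series[:-1]:
        x = np.tanh(W @ x + w_in * u)
        states.append(x.copy())
    S = np.array(states)
    y = np.asarray(series[1:], dtype=float)
    # Ridge regression: w = (S^T S + ridge * I)^{-1} S^T y
    return np.linalg.solve(S.T @ S + ridge * np.eye(N), S.T @ y)

def reservoir_kernel(s1, s2, gamma=1.0):
    """RBF kernel on the distance between the two fitted readouts
    (an illustrative stand-in for the analytic model distance)."""
    d = readout_weights(s1) - readout_weights(s2)
    return float(np.exp(-gamma * (d @ d)))
```

Because only the readouts are compared, the two series may have different lengths, which mirrors the abstract's claim that the kernels handle variable-length series naturally.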
Original language | English |
---|---|
Title of host publication | Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining |
Publisher | Association for Computing Machinery |
Pages | 392-400 |
Number of pages | 9 |
Volume | Part F128815 |
ISBN (Print) | 9781450321747 |
DOIs | |
Publication status | Published - 11 Aug 2013 |
Externally published | Yes |
Funding
This work was supported by the European Union Seventh Framework Programme under Grant Agreement INSFO-ICT-270428 on “Making Sense of Nonsense (iSense)”. The work of H. Chen was supported in part by the National Natural Science Foundation of China under Grants 61203292 and 61311130140, and the One Thousand Young Talents Program. The work of P. Tino was supported by the Biotechnology and Biological Sciences Research Council under Grant H012508/1. The work of X. Yao was supported by a Royal Society Wolfson Research Merit Award.
Keywords
- Kernel methods
- Reservoir computing
- Time series