Abstract
We present a novel model-metric co-learning (MMCL) methodology for sequence classification that learns in the model space: each data item (sequence) is represented by a predictive model from a carefully designed model class. MMCL learning encourages sequences from the same class to be represented by 'close' model representations that are well separated from those of other classes. Existing approaches either fit a single model to all the data or fit a (predominantly linear) model to each sequence individually. We introduce a hybrid approach spanning these two extremes. The model class we use is a special form of adaptive high-dimensional non-linear state space model with a highly constrained and simple dynamic part. The dynamic part is identical for all data items and acts as a temporal filter providing a rich pool of dynamic features that can be selectively extracted by individual (static) linear readout mappings representing the sequences. Alongside the dynamic part, we also learn a global metric in the model readout space. Experiments on synthetic and benchmark data sets confirm the effectiveness of the algorithm compared with a variety of alternative methods.
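The following is a minimal sketch of the "learning in the model space" idea described in the abstract, not the MMCL algorithm itself: here the shared dynamic part is a fixed random state-space filter (rather than being learned), each sequence is represented by the ridge-regression readout fitted to it, and classification uses plain Euclidean nearest neighbours in readout space instead of a learned metric. All function names and parameter values are illustrative assumptions.

```python
# Sketch: represent each sequence by the readout of a shared state-space filter,
# then classify sequences by nearest neighbours in that "model space".
# This is NOT the authors' MMCL method (no co-learning of dynamics or metric).
import numpy as np

rng = np.random.default_rng(0)

def random_dynamics(n_in, n_state, spectral_radius=0.9):
    """Fixed, shared dynamic part: x_t = tanh(W x_{t-1} + V u_t)."""
    W = rng.normal(size=(n_state, n_state))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    V = rng.normal(size=(n_state, n_in))
    return W, V

def run_states(seq, W, V):
    """Drive the shared filter with one sequence and collect its states."""
    x = np.zeros(W.shape[0])
    states = []
    for u in seq:                      # seq has shape (T, n_in)
        x = np.tanh(W @ x + V @ u)
        states.append(x.copy())
    return np.array(states)

def readout_representation(seq, W, V, ridge=1e-2):
    """Represent a sequence by the linear readout predicting u_{t+1} from x_t."""
    X, Y = run_states(seq, W, V)[:-1], seq[1:]
    R = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)
    return R.ravel()                   # flattened readout = point in model space

# Toy usage: two classes of noisy sine waves with different frequencies.
def make_seq(freq, T=100):
    t = np.arange(T)
    return (np.sin(freq * t) + 0.1 * rng.normal(size=T)).reshape(-1, 1)

W, V = random_dynamics(n_in=1, n_state=30)
train = [(make_seq(0.2 + 0.4 * c), c) for c in (0, 1) for _ in range(20)]
reps = np.array([readout_representation(s, W, V) for s, _ in train])
labels = np.array([c for _, c in train])

test = make_seq(0.6)                   # generated as class 1
r = readout_representation(test, W, V)
pred = labels[np.argmin(np.linalg.norm(reps - r, axis=1))]  # 1-NN in model space
print("predicted class:", pred)
```

In the paper's setting, both the shared dynamic part and the metric used to compare readout vectors are learned jointly so that same-class sequences cluster and different-class sequences separate; the sketch above only fixes the dynamics and uses the default Euclidean metric to show where those learned components would plug in.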
| Original language | English |
| --- | --- |
| Title of host publication | IJCAI International Joint Conference on Artificial Intelligence |
| Publisher | International Joint Conferences on Artificial Intelligence |
| Pages | 3387-3394 |
| Number of pages | 8 |
| Volume | 2015-January |
| ISBN (Print) | 9781577357384 |
| Publication status | Published - 2015 |
| Externally published | Yes |