On Learning and Cross-Validation with Decomposed Nyström Approximation of Kernel Matrix

Antti Airola, Tapio Pahikkala, Tapio Salakoski, On Learning and Cross-Validation with Decomposed Nyström Approximation of Kernel Matrix. Neural Processing Letters 33(1), 17–30, 2011.

Abstract:

The high computational costs of training kernel methods to solve nonlinear tasks limit
their applicability. However, several fast training methods have recently been introduced
for solving linear learning tasks. These can be used to solve nonlinear tasks by mapping
the input data nonlinearly to a low-dimensional feature space. In this work, we consider the
mapping induced by decomposing the Nyström approximation of the kernel matrix. We
collect together prior results and derive new ones to show how to efficiently train, make
predictions with, and perform cross-validation for reduced set approximations of learning
algorithms, given an efficient linear solver. Specifically, we present an efficient method for
removing basis vectors from the mapping, which we show to be important when performing
cross-validation.

http://dx.doi.org/10.1007/s11063-010-9159-4
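The feature map the abstract refers to can be sketched as follows. This is a minimal illustration of the standard Nyström decomposition, not the authors' implementation; the function names, the RBF kernel choice, and the eigenvalue clamping threshold are assumptions. Given a reduced set of m basis vectors, the map Φ = K_nm K_mm^(-1/2) produces low-dimensional features whose inner products reproduce the Nyström approximation K ≈ K_nm K_mm^(-1) K_mn, so any fast linear solver can then be applied to Φ.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel between the rows of X and the rows of Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def nystroem_features(X, basis, gamma=1.0):
    # Map inputs into an m-dimensional feature space so that
    # Phi @ Phi.T equals the Nystroem approximation of the kernel matrix:
    # K approx= K_nm K_mm^{-1} K_mn, with Phi = K_nm K_mm^{-1/2}.
    K_mm = rbf_kernel(basis, basis, gamma)
    K_nm = rbf_kernel(X, basis, gamma)
    # Symmetric inverse square root of K_mm via eigendecomposition;
    # small eigenvalues are clamped for numerical stability (assumed threshold).
    w, V = np.linalg.eigh(K_mm)
    w = np.maximum(w, 1e-12)
    K_mm_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    return K_nm @ K_mm_inv_sqrt

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
basis = X[:10]                # reduced set of m = 10 basis vectors
Phi = nystroem_features(X, basis)
K_approx = Phi @ Phi.T        # approximates the full 100x100 kernel matrix
```

On the rows that belong to the reduced set itself, the approximation is exact (up to floating-point error), since there K_nm coincides with K_mm.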

BibTeX entry:

@ARTICLE{jAiPaSa11a,
  title = {On Learning and Cross-Validation with Decomposed Nyström Approximation of Kernel Matrix},
  author = {Airola, Antti and Pahikkala, Tapio and Salakoski, Tapio},
  journal = {Neural Processing Letters},
  volume = {33},
  number = {1},
  pages = {17--30},
  year = {2011},
  keywords = {Cross-validation, Empirical kernel map, Kernel methods, Nyström approximation, Reduced set method},
}

Belongs to TUCS Research Unit(s): Turku BioNLP Group

Publication Forum rating of this publication: level 1
