Efficient Cross-Validation for Kernelized Least-Squares Regression with Sparse Basis Expansions
Tapio Pahikkala, Hanna Suominen, Jorma Boberg. Efficient Cross-Validation for Kernelized Least-Squares Regression with Sparse Basis Expansions. Machine Learning 87(3), 381–407, 2012.
http://dx.doi.org/10.1007/s10994-012-5287-6
Abstract:
We propose an efficient algorithm for calculating hold-out and cross-validation (CV) estimates for sparse regularized least-squares predictors. Holding out H data points with our method requires O(min(H^2 n, H n^2)) time, provided that a predictor with n basis vectors is already trained. In addition to holding out training examples, the method can also remove from the basis vector set used in the hold-out computation some of the basis vectors that were used to train the sparse regularized least-squares predictor on the whole training set. In our experiments, we demonstrate the speed improvements provided by our algorithm in practice, and we empirically show the benefits of removing some of the basis vectors during the CV rounds.
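As a rough illustration of the idea (not the authors' exact algorithm), the NumPy sketch below shows how holding out H examples can reuse the inverse of the n-by-n system matrix cached at training time, via the Woodbury identity, instead of re-inverting it per fold; this yields the O(H n^2) branch of the stated bound. All function and variable names are hypothetical, and removing basis vectors during CV is omitted for brevity.

import numpy as np

def train_sparse_rls(K_mb, K_bb, y, lam):
    # K_mb: (m, n) kernel values between the m training and n basis examples
    # K_bb: (n, n) kernel values among the basis examples
    G = K_mb.T @ K_mb + lam * K_bb        # (n, n) system matrix
    C = np.linalg.inv(G)                  # cached once at training time, O(n^3)
    a = C @ (K_mb.T @ y)                  # expansion coefficients
    return a, C

def holdout_predictions(K_mb, y, C, I):
    # Predict labels of the held-out index set I from a predictor retrained
    # without those examples, reusing C = G^{-1} rather than re-inverting.
    U = K_mb[I]                           # (H, n) kernel rows being removed
    b = K_mb.T @ y - U.T @ y[I]           # right-hand side without fold I
    CU = C @ U.T                          # (n, H)
    S = np.eye(len(I)) - U @ CU           # (H, H) capacitance matrix
    Cb = C @ b
    # Woodbury identity: (G - U^T U)^{-1} b = C b + C U^T S^{-1} U C b
    a_I = Cb + CU @ np.linalg.solve(S, U @ Cb)
    return U @ a_I                        # predictions for the held-out points

The per-fold cost is dominated by the (n, H) product C @ U.T, i.e. O(H n^2), plus an O(H^3) solve of the small H-by-H system, so no O(n^3) work is repeated across CV rounds.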
BibTeX entry:
@ARTICLE{jPaSuBo12a,
title = {Efficient Cross-Validation for Kernelized Least-Squares Regression with Sparse Basis Expansions},
author = {Pahikkala, Tapio and Suominen, Hanna and Boberg, Jorma},
journal = {Machine Learning},
volume = {87},
number = {3},
publisher = {Springer},
pages = {381--407},
year = {2012},
doi = {10.1007/s10994-012-5287-6},
keywords = {Hold-out, Cross-validation, Regularized least-squares, Least-squares support vector machine, Kernel methods, Sparse basis expansions},
}
Belongs to TUCS Research Unit(s): Algorithmics and Computational Intelligence Group (ACI), Turku BioNLP Group
Publication Forum rating of this publication: level 3