Step Size-Adapted Online Support Vector Learning
A. Karatzoglou, S. Vishwanathan, N. N. Schraudolph, and A. J. Smola. Step Size-Adapted Online Support Vector Learning. In Proc. 8th Intl. Symp. Signal Processing & Applications, IEEE, 2005.
Abstract
We present an online Support Vector Machine (SVM) that uses Stochastic Meta-Descent (SMD) to adapt its step size automatically. We formulate the online learning problem as a stochastic gradient descent in Reproducing Kernel Hilbert Space (RKHS) and translate SMD to the nonparametric setting, where its gradient trace parameter is no longer a coefficient vector but an element of the RKHS. We derive efficient updates that allow us to perform the step size adaptation in linear time. We apply the online SVM framework to a variety of loss functions and in particular show how to achieve efficient online multiclass classification. Experimental evidence suggests that our algorithm outperforms existing methods.
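To make the construction concrete, here is a minimal Python sketch of the idea described above (not the authors' code): stochastic gradient descent on the regularized hinge loss in an RKHS, where both the hypothesis f and the SMD gradient trace v are kernel expansions over the observed points, and the step size is adapted multiplicatively from the inner product of the trace with the current gradient. The Gaussian kernel, the hyperparameter values, and the naive Gram-matrix inner products (the paper derives linear-time updates) are all illustrative assumptions.

    import numpy as np

    def gauss_kernel(X, x, gamma=0.5):
        # RBF kernel between the rows of X and a single point x.
        return np.exp(-gamma * np.sum((X - x) ** 2, axis=1))

    def smd_online_svm(stream, lam=0.01, eta0=0.5, mu=0.1, lam_smd=0.95, gamma=0.5):
        # Toy online kernel SVM with SMD step-size adaptation.
        # Hypothesis and gradient trace are kernel expansions over the
        # support points seen so far:
        #     f = sum_i alpha[i] k(x_i, .),   v = sum_i beta[i] k(x_i, .).
        X, alpha, beta = [], [], []
        K = np.zeros((0, 0))   # growing Gram matrix of the support points
        eta = eta0
        for x, y in stream:
            k_t = gauss_kernel(np.array(X), x, gamma) if X else np.zeros(0)
            a, b = np.array(alpha), np.array(beta)
            f_x, v_x = k_t @ a, k_t @ b
            sigma = 1.0 if y * f_x < 1.0 else 0.0   # hinge subgradient switch
            # RKHS gradient of the regularized risk: g = lam*f - sigma*y*k(x, .)
            # hence <g, v> = lam * a'Kb - sigma * y * v(x).
            g_dot_v = lam * (a @ K @ b) - sigma * y * v_x
            # Multiplicative SMD step-size update, floored at 1/2.
            eta *= max(0.5, 1.0 - mu * g_dot_v)
            # Grow the Gram matrix; k(x, x) = 1 for the RBF kernel.
            K = np.block([[K, k_t[:, None]], [k_t[None, :], np.ones((1, 1))]])
            X.append(np.asarray(x, dtype=float))
            # f <- f - eta*g: shrink old coefficients, append one for x.
            alpha = list((1.0 - eta * lam) * a) + [eta * sigma * y]
            # Trace update v <- lam_smd*v - eta*(g + lam_smd*Hv); the hinge loss
            # has no curvature, so Hv = lam*v comes from the regularizer alone.
            beta = list(lam_smd * b - eta * lam * (a + lam_smd * b)) + [eta * sigma * y]
        return np.array(X), np.array(alpha)

With labels y in {-1, +1}, a new point x would be classified by the sign of gauss_kernel(X, x, gamma) @ alpha. A practical implementation would also truncate or merge old support vectors to keep the expansion, and hence the per-step cost, bounded.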
BibTeX Entry
@inproceedings{KarVisSchSmo05,
  author    = {Alexandros Karatzoglou and S.~V.~N. Vishwanathan and
               Nicol N. Schraudolph and Alex J. Smola},
  title     = {\href{http://nic.schraudolph.org/pubs/KarVisSchSmo05.pdf}{
               Step Size-Adapted Online Support Vector Learning}},
  booktitle = {Proc.\ 8$^{th}$ Intl.\ Symp.\ Signal Processing \& Applications},
  publisher = {IEEE},
  year      = 2005,
  b2h_type  = {Other},
  b2h_topic = {>Stochastic Meta-Descent, Kernel Methods},
  abstract  = {We present an online Support Vector Machine (SVM) that uses
    Stochastic Meta-Descent (SMD) to adapt its step size automatically. We
    formulate the online learning problem as a stochastic gradient descent in
    Reproducing Kernel Hilbert Space (RKHS) and translate SMD to the
    nonparametric setting, where its gradient trace parameter is no longer a
    coefficient vector but an element of the RKHS. We derive efficient updates
    that allow us to perform the step size adaptation in linear time. We apply
    the online SVM framework to a variety of loss functions and in particular
    show how to achieve efficient online multiclass classification.
    Experimental evidence suggests that our algorithm outperforms existing
    methods.}
}