Accelerated Training of Conditional Random Fields with Stochastic Gradient Methods
S. V. N. Vishwanathan, N. N. Schraudolph, M. W. Schmidt, and K. Murphy. Accelerated Training of Conditional Random Fields with Stochastic Gradient Methods. In Proc. 23rd Intl. Conf. Machine Learning (ICML), pp. 969–976, ACM Press, 2006.
Abstract
We apply Stochastic Meta-Descent (SMD), a stochastic gradient optimization method with gain vector adaptation, to the training of Conditional Random Fields (CRFs). On several large data sets, the resulting optimizer converges to the same quality of solution over an order of magnitude faster than limited-memory BFGS, the leading method reported to date. We report results for both exact and inexact inference techniques.
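To make the "gain vector adaptation" concrete, here is a minimal NumPy sketch of the SMD update, not the paper's implementation: each parameter carries its own gain eta, adapted multiplicatively from the agreement between the current gradient and a sensitivity trace v. The paper computes exact Hessian-vector products (via Pearlmutter's R-operator); the finite-difference product below is a stand-in, and all function names and hyperparameter values are illustrative assumptions.

import numpy as np

def smd_minimize(grad, w, eta0=0.1, mu=0.1, lam=0.99, steps=1000, eps=1e-6):
    """Sketch of Stochastic Meta-Descent (SMD) for minimizing a loss
    whose (possibly stochastic) gradient is given by grad(w)."""
    eta = np.full_like(w, eta0)   # per-parameter gain vector
    v = np.zeros_like(w)          # trace of parameter sensitivity to the gains
    for _ in range(steps):
        g = grad(w)
        # Hessian-vector product H v, approximated here by a finite
        # difference of gradients (the paper uses exact R-operator products).
        Hv = (grad(w + eps * v) - g) / eps
        # Multiplicative gain adaptation: grow a gain when the current
        # gradient agrees with the accumulated update direction v,
        # shrink it (by at most half) when they conflict.
        eta *= np.maximum(0.5, 1.0 - mu * g * v)
        w = w - eta * g                      # gained gradient step
        v = lam * v - eta * (g + lam * Hv)   # update sensitivity trace
    return w

# Toy usage: an ill-conditioned quadratic with small gradient noise.
A = np.diag([1.0, 10.0, 100.0])
grad = lambda w: A @ w + 0.01 * np.random.randn(3)
w_min = smd_minimize(grad, w=np.ones(3))

The per-parameter gains are what let SMD cope with the very different curvatures along different coordinates, which is the regime where it outpaces a single global learning rate.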
BibTeX Entry
@inproceedings{VisSchSchMur06,
  author    = {S.~V.~N. Vishwanathan and Nicol N. Schraudolph and Mark W. Schmidt and Kevin Murphy},
  title     = {\href{http://nic.schraudolph.org/pubs/VisSchSchMur06.pdf}{Accelerated Training of Conditional Random Fields with Stochastic Gradient Methods}},
  pages     = {969--976},
  editor    = {William Cohen and Andrew W. Moore},
  booktitle = {Proc.\ 23$^{rd}$ Intl.\ Conf.\ Machine Learning (ICML)},
  publisher = {ACM Press},
  year      = 2006,
  b2h_type  = {Top Conferences},
  b2h_topic = {>Stochastic Meta-Descent},
  abstract  = {We apply Stochastic Meta-Descent (SMD), a stochastic gradient
    optimization method with gain vector adaptation, to the training of
    Conditional Random Fields (CRFs). On several large data sets, the
    resulting optimizer converges to the same quality of solution over an
    order of magnitude faster than limited-memory BFGS, the leading method
    reported to date. We report results for both exact and inexact
    inference techniques.}
}