Competitive Anti-Hebbian Learning of Invariants
N. N. Schraudolph and T. J. Sejnowski. Competitive Anti-Hebbian Learning of Invariants. In Advances in Neural Information Processing Systems (NIPS), volume 4, pp. 1017–1024, Morgan Kaufmann, San Mateo, CA, 1992.
Also included in the author's Ph.D. thesis.
Abstract
Although the detection of invariant input structure is vital to many recognition tasks, connectionist learning rules tend to focus on directions of high variance (principal components). The prediction paradigm can be used to reconcile this dichotomy; here we offer a more direct, unsupervised approach to invariant learning based on an anti-Hebbian learning rule, and demonstrate its success in extracting coherent depth information from random stereograms.
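The anti-Hebbian idea mentioned in the abstract can be illustrated with a small numeric sketch. Whereas a Hebbian update (Δw = +η·y·x) grows weights along high-variance directions, flipping the sign (Δw = −η·y·x) drives a linear unit's output variance down, so with weight normalization the unit settles on a low-variance (invariant) direction of its input. The Python sketch below uses hypothetical toy data and a generic normalized anti-Hebbian rule; it is not the paper's competitive formulation or its stereogram experiment.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D inputs with one high-variance and one low-variance
# (the "invariant") direction.
theta = np.pi / 6
basis = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
stds = np.array([3.0, 0.3])          # major / minor standard deviations

w = rng.normal(size=2)               # weights of a single linear unit
w /= np.linalg.norm(w)
lr = 0.01

for _ in range(20000):
    x = basis @ (stds * rng.normal(size=2))   # sample a correlated input
    y = w @ x                                 # unit output
    w -= lr * y * x                           # anti-Hebbian update (note the minus sign)
    w /= np.linalg.norm(w)                    # keep |w| = 1

# The learned weight vector should align (up to sign) with the minor axis.
print("learned direction:", w)
print("minor axis:       ", basis[:, 1])

With a plain Hebbian sign the same loop would instead converge to the major (principal-component) axis, which is the dichotomy the abstract refers to.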
BibTeX Entry
@inproceedings{SchSej92,
  author    = {Nicol N. Schraudolph and Terrence J. Sejnowski},
  title     = {\href{http://nic.schraudolph.org/pubs/SchSej92.pdf}{Competitive Anti-{H}ebbian Learning of Invariants}},
  pages     = {1017--1024},
  editor    = {John E. Moody and Steven J. Hanson and Richard P. Lippmann},
  booktitle = nips,
  publisher = {Morgan Kaufmann, San Mateo, CA},
  volume    = 4,
  year      = 1992,
  b2h_type  = {Top Conferences},
  b2h_topic = {>Competitive Learning},
  b2h_note  = {In <a href="b2hd-Schraudolph95">Ph.D. thesis</a>},
  abstract  = {Although the detection of {\em invariant}\/ input structure is vital to many recognition tasks, connectionist learning rules tend to focus on directions of high variance (principal components). The prediction paradigm can be used to reconcile this dichotomy; here we offer a more direct, unsupervised approach to invariant learning based on an {\em anti-Hebbian}\/ learning rule, and demonstrate its success in extracting coherent depth information from random stereograms.}
}