Empirical Entropy Manipulation for Real-World Problems

P. A. Viola, N. N. Schraudolph, and T. J. Sejnowski. Empirical Entropy Manipulation for Real-World Problems. In Advances in Neural Information Processing Systems (NIPS), pp. 851–857, The MIT Press, Cambridge, MA, 1996.
Also included in the Ph.D. thesis; a later version is available.

Download

pdf (224.9 kB) · djvu (101.6 kB) · ps.gz (88.5 kB)

Abstract

No finite sample is sufficient to determine the density, and therefore the entropy, of a signal directly. Some assumption about either the functional form of the density or about its smoothness is necessary. Both amount to a prior over the space of possible density functions. By far the most common approach is to assume that the density has a parametric form. By contrast we derive a differential learning rule called EMMA that optimizes entropy by way of kernel density estimation. Entropy and its derivative can then be calculated by sampling from this density estimate. The resulting parameter update rule is surprisingly simple and efficient. We will show how EMMA can be used to detect and correct corruption in magnetic resonance images (MRI). This application is beyond the scope of existing parametric entropy models.
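To give a concrete flavour of the estimator described above, here is a minimal NumPy sketch written for this page (not the authors' code): it builds a Gaussian Parzen-window (kernel) density estimate from one half of a sample and evaluates the empirical entropy as the mean negative log-density over the other half. The kernel width sigma and the equal split of the sample are illustrative choices; EMMA's learning rule follows from differentiating this kind of quantity with respect to the parameters of the mapping that produced the samples.

    import numpy as np

    def emma_entropy(samples_a, samples_b, sigma=0.3):
        # Pairwise differences between the evaluation sample (A) and the
        # sample used to build the Parzen-window density estimate (B).
        diffs = samples_a[:, None] - samples_b[None, :]
        # Gaussian kernel responses; the density at each point of A is
        # the mean kernel response over B.
        kernel = np.exp(-0.5 * (diffs / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
        density = kernel.mean(axis=1)
        # Empirical entropy: average negative log-density over sample A.
        return -np.mean(np.log(density))

    # Illustrative use: estimate the entropy of a standard normal signal.
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    print(emma_entropy(x[:100], x[100:]))
    # With many samples and a small kernel width this approaches the true
    # Gaussian entropy, 0.5 * log(2 * pi * e) ≈ 1.42 nats.

Because the estimate is a smooth function of the samples, its gradient with respect to model parameters can be obtained by ordinary differentiation, which is what makes the resulting update rule simple and efficient.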

BibTeX Entry

@inproceedings{VioSchSej96,
     author = {Paul A. Viola and Nicol N. Schraudolph
               and Terrence J. Sejnowski},
      title = {\href{http://nic.schraudolph.org/pubs/VioSchSej96.pdf}{
               Empirical Entropy Manipulation for Real-World Problems}},
      pages = {851--857},
     editor = {David S. Touretzky and Michael C. Mozer and Michael E. Hasselmo},
  booktitle = {Advances in Neural Information Processing Systems},
  publisher = {The {MIT} Press, Cambridge, MA},
     volume =  8,
       year =  1996,
   b2h_type = {Top Conferences},
  b2h_topic = {Computer Vision, >Entropy Optimization},
   b2h_note = {In <a href="b2hd-Schraudolph95">Ph.D. thesis</a> &nbsp;&nbsp;&nbsp; <a href="b2hd-Schraudolph04">Latest version</a>},
   abstract = {
    No finite sample is sufficient to determine the density, and therefore
    the entropy, of a signal directly.  Some assumption about either the
    functional form of the density or about its smoothness is necessary.
    Both amount to a prior over the space of possible density functions.
    By far the most common approach is to assume that the density has a
    parametric form.
    By contrast we derive a differential learning rule called EMMA that
    optimizes entropy by way of kernel density estimation.  Entropy and
    its derivative can then be calculated by sampling from this density
    estimate.  The resulting parameter update rule is surprisingly simple
    and efficient.
    We will show how EMMA can be used to detect and correct corruption in
    magnetic resonance images (MRI). This application is beyond the scope
    of existing parametric entropy models.
}}
