Unsupervised Coding with LOCOCODE

Traditional approaches to sensory coding use code component-oriented objective functions (COCOFs) to evaluate code quality. Previous COCOFs do not take into account the information-theoretic complexity of the code-generating mapping itself. We do: “low-complexity coding and decoding” (LOCOCODE) generates so-called lococodes that (1) convey information about the input data, (2) can be computed from the data by a low-complexity mapping (LCM), and (3) can be decoded by an LCM. We implement LOCOCODE by training autoassociators with Flat Minimum Search (FMS), a general method for finding low-complexity neural nets. LOCOCODE extracts optimal codes for difficult versions of the “bars” benchmark problem. As a preprocessor for a vowel recognition benchmark problem, it sets the stage for excellent classification performance.

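A minimal sketch of the idea in Python, not the authors' implementation: a single-hidden-layer autoassociator trained on toy “bars” inputs, with a plain weight-decay term standing in as a crude surrogate for the FMS flatness penalty of Hochreiter and Schmidhuber (1997). The 5x5 input size, the 10-unit code layer, and all hyperparameters are assumptions made only for illustration.

    # Sketch: tied-depth autoassociator on toy "bars" data.
    # A weight-decay term is used here as a stand-in for the FMS
    # complexity penalty; real FMS penalizes required weight precision
    # (a flatness-based term), not plain L2 norm.
    import numpy as np

    rng = np.random.default_rng(0)

    def bars(n, size=5, p=0.2):
        """Toy 'bars' inputs: each horizontal/vertical bar is on with prob p."""
        x = np.zeros((n, size, size))
        for i in range(n):
            for r in range(size):
                if rng.random() < p:
                    x[i, r, :] = 1.0      # horizontal bar
            for c in range(size):
                if rng.random() < p:
                    x[i, :, c] = 1.0      # vertical bar
        return x.reshape(n, size * size)

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    X = bars(500)
    n_in, n_hid = X.shape[1], 10          # code layer size: assumed
    W_enc = 0.1 * rng.standard_normal((n_in, n_hid))
    W_dec = 0.1 * rng.standard_normal((n_hid, n_in))
    lr, lam = 0.1, 1e-3                   # learning rate, complexity weight

    for epoch in range(200):
        code = sigmoid(X @ W_enc)          # low-complexity encoding (one layer)
        recon = sigmoid(code @ W_dec)      # low-complexity decoding (one layer)
        err = recon - X                    # reconstruction error
        delta_out = err * recon * (1 - recon)
        delta_hid = (delta_out @ W_dec.T) * code * (1 - code)
        d_dec = code.T @ delta_out / len(X)
        d_enc = X.T @ delta_hid / len(X)
        # Gradient step with the weight-decay surrogate for the FMS penalty.
        W_dec -= lr * (d_dec + lam * W_dec)
        W_enc -= lr * (d_enc + lam * W_enc)

    final_recon = sigmoid(sigmoid(X @ W_enc) @ W_dec)
    print("final reconstruction MSE:", float(np.mean((final_recon - X) ** 2)))

Under LOCOCODE proper, the FMS regularizer would prune or saturate superfluous code units so that the surviving units come to represent the independent bar generators; the sketch above only shows the autoassociator-plus-complexity-penalty structure, not that emergent factorial code.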