To determine whether a particular sensory event is a reliable predictor of reward or punishment, it is necessary to know the prior probability of that event. If the variables of a sensory representation normally occur independently of each other, then the prior probability of any logical function of those variables can be derived from the prior probabilities of the individual variables, without any additional knowledge; such a representation therefore enormously enlarges the scope of definable events that can be searched for reliable predictors. Finding a Minimum Entropy Code is one possible method of forming such a representation, and methods for doing this are explored in this paper. The main results are: (1) to show how to find such a code when the probabilities of the input states form a geometric progression, as is shown to be nearly true for keyboard characters in normal text; (2) to show how a Minimum Entropy Code can be approximated by repeatedly recoding pairs, triples, etc. of an original 7-bit code for keyboard characters; and (3) to prove that in some cases enlarging the capacity of the output channel can lower the entropy.
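A small numerical illustration of result (1), not code from the paper: if the input-state probabilities form a geometric progression, $p_i \propto r^i$, and state $i$ is written in ordinary binary counting order, $i = \sum_k b_k 2^k$, then $p_i \propto \prod_k r^{b_k 2^k}$, so the joint distribution over the output bits factorizes. The bits are independent, the sum of the individual bit entropies equals the joint entropy, and the counting code is therefore a minimum entropy code. The sketch below checks this and compares it against an arbitrary alternative codeword assignment; the values $r = 0.6$, the 4-bit code size, and the permutation $7i \bmod 16$ are illustrative assumptions, not taken from the paper.

```python
import math

def entropy(ps):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

def bit_entropy_sum(probs, code, n_bits):
    """Sum of the marginal entropies of each output bit,
    where state i is assigned the codeword code[i]."""
    total = 0.0
    for k in range(n_bits):
        q = sum(p for i, p in enumerate(probs) if (code[i] >> k) & 1)
        total += entropy([q, 1.0 - q])
    return total

n_bits = 4                      # assumed code size (illustrative)
r = 0.6                         # assumed common ratio (illustrative)
n_states = 2 ** n_bits

# geometric progression of state probabilities, normalized
probs = [r ** i for i in range(n_states)]
z = sum(probs)
probs = [p / z for p in probs]

H_joint = entropy(probs)        # entropy of the input states

# binary counting code: state i -> codeword i
H_bits = bit_entropy_sum(probs, list(range(n_states)), n_bits)

# an arbitrary alternative assignment: state i -> codeword (7*i) mod 16
perm = [(7 * i) % n_states for i in range(n_states)]
H_perm = bit_entropy_sum(probs, perm, n_bits)

# For the counting code the bits are independent, so H_bits == H_joint;
# any other assignment can only do as well or worse (H_perm >= H_joint).
```

The sum of bit entropies can never fall below the joint entropy; the counting code achieves that lower bound exactly for geometric probabilities, which is what makes it a minimum entropy code in this case.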