A central problem in connectionist modelling is the control of network and architectural resources during learning. In the present approach, weights reflect a coarse prediction history as coded by a distribution of values, parameterized by the mean and standard deviation of each weight distribution. Weight updates are a function of both the mean and standard deviation of each connection in the network and vary with the error signal ("stochastic delta rule"; Hanson, 1990). Consequently, the weights maintain information on both their central tendency and their "uncertainty" in prediction. Such information is useful in establishing a policy concerning the nodal complexity of the network and the growth of new nodes. For example, during problem solving the present network can undergo "meiosis", producing two nodes where there was one "overtaxed" node, as measured by its coefficient of variation. It is shown on a number of benchmark problems that meiosis networks can find minimal architectures, reduce computational complexity, and increase the overall efficiency of the representation-learning interaction.
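The mechanism described above can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: the specific update rule for the standard deviation, the decay constant, the coefficient-of-variation threshold, and the details of the split (halved standard deviations, jittered means) are assumptions chosen to match the abstract's description.

```python
import numpy as np

rng = np.random.default_rng(0)

class StochasticWeights:
    """Weights held as distributions: one mean and one standard deviation
    per connection, in the spirit of the stochastic delta rule (Hanson, 1990)."""

    def __init__(self, n_in, n_out, init_sigma=0.1):
        self.mu = rng.normal(0.0, 0.3, size=(n_in, n_out))
        self.sigma = np.full((n_in, n_out), init_sigma)

    def sample(self):
        # Each forward pass draws a weight realization from N(mu, sigma).
        return rng.normal(self.mu, self.sigma)

    def update(self, grad, lr=0.1, lr_sigma=0.05, decay=0.9):
        # The mean follows an ordinary delta-rule step; the standard
        # deviation grows with the magnitude of the error signal and
        # otherwise decays, so it tracks each connection's prediction
        # "uncertainty". (Assumed form of the variance update.)
        self.mu -= lr * grad
        self.sigma = decay * self.sigma + lr_sigma * np.abs(grad)

    def cv(self):
        # Coefficient of variation per connection: sigma / |mu|.
        return self.sigma / (np.abs(self.mu) + 1e-12)

def node_overtaxed(w_in, w_out, j, threshold=1.0):
    """Meiosis trigger (one plausible reading): hidden node j is
    'overtaxed' when the mean CV of its incoming and outgoing weights
    exceeds a threshold, i.e., its weights stay uncertain despite training."""
    cv_j = np.concatenate([w_in.cv()[:, j], w_out.cv()[j, :]])
    return cv_j.mean() > threshold

def meiosis_split(w_in, w_out, j):
    """Split node j into two children: each inherits the parent's means
    (incoming means slightly perturbed) and half its standard deviations."""
    jitter = rng.normal(0.0, 0.5 * w_in.sigma[:, j])
    w_in.mu = np.column_stack([w_in.mu, w_in.mu[:, j] + jitter])
    w_in.sigma = np.column_stack([w_in.sigma, 0.5 * w_in.sigma[:, j]])
    w_in.sigma[:, j] *= 0.5
    w_out.mu = np.vstack([w_out.mu, w_out.mu[j, :]])
    w_out.sigma = np.vstack([w_out.sigma, 0.5 * w_out.sigma[j, :]])
    w_out.sigma[j, :] *= 0.5
```

On a well-learned connection the error signal is near zero, so the decay term drives sigma down and the node's coefficient of variation falls; a persistently noisy error signal keeps sigma inflated, eventually triggering the split.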
References

[1] Thomas M. Cover et al., "Geometrical and Statistical Properties of Systems of Linear Inequalities with Applications in Pattern Recognition," IEEE Trans. Electron. Comput., 1965.
[2] Benedict Delisle Burns et al., The Uncertain Nervous System, 1968.
[3] G. J. Tomko et al., "Neuronal variability: non-stationary responses to identical visual stimuli," Brain Research, 1974.
[4] C. D. Gelatt et al., "Optimization by Simulated Annealing," Science, 1983.
[5] Stephen José Hanson et al., "Minkowski-r Back-Propagation: Learning in Connectionist Models with Non-Euclidian Error Signals," NIPS, 1987.
[6] Lorien Y. Pratt et al., "Comparing Biases for Minimal Network Construction with Back-Propagation," NIPS, 1988.
[7] Stephen José Hanson, "A stochastic version of the delta rule," 1990.