We present a general formulation for a network of stochastic directional units. This formulation extends the Boltzmann machine: the units are not binary, but take on values in a cyclic range between 0 and 2π radians. The state of each unit in a Directional-Unit Boltzmann Machine (DUBM) is described by a complex variable whose phase component specifies a direction; the weights are also complex variables. We associate a quadratic energy function, and a corresponding probability, with each DUBM configuration. The conditional distribution of a unit's stochastic state is a circular analogue of the Gaussian distribution, known as the von Mises distribution. In a mean-field approximation to a stochastic DUBM, the phase component of a unit's state represents its mean direction, and the magnitude component specifies the degree of certainty associated with that direction. This combination of a value and a certainty gives each unit additional representational power. We describe a learning algorithm and simulations that demonstrate a mean-field DUBM's ability to learn interesting mappings.
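The abstract's central representational idea, a complex-valued unit whose phase encodes a mean direction and whose magnitude encodes certainty, can be illustrated with the von Mises distribution. The sketch below is not the DUBM itself; it only shows, using NumPy's von Mises sampler, how averaging unit-modulus complex numbers over samples yields exactly such a phase/magnitude summary. The choices of `mu`, `kappa`, and the sample count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw angles from a von Mises distribution (the circular analogue of a
# Gaussian) with mean direction mu and concentration kappa.
mu, kappa = np.pi / 4, 4.0          # illustrative values, not from the paper
samples = rng.vonmises(mu, kappa, size=10_000)

# Summarise the samples as one complex number: its phase is the mean
# direction, its magnitude (the resultant length, in [0, 1]) is the
# degree of certainty in that direction.
z = np.mean(np.exp(1j * samples))
mean_direction = np.angle(z)        # close to mu
certainty = np.abs(z)               # near 1 for large kappa, near 0 for kappa -> 0
```

A larger `kappa` concentrates the samples around `mu`, pushing `certainty` toward 1; as `kappa` approaches 0 the distribution becomes uniform on the circle and the resultant length collapses toward 0, which is the sense in which magnitude expresses confidence.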