Synchronous Boltzmann Machines and Gibbs Fields: Learning Algorithms

Boltzmann machines are stochastic networks of formal neurons linked by a quadratic energy function. Hinton, Sejnowski, and Ackley, who introduced them as pattern classifiers that learn, proposed a learning algorithm for the asynchronous machine. Here we study the synchronous machine, in which all neurons are updated simultaneously; we compute its equilibrium energy and propose a synchronous learning algorithm based on the delayed average coactivity of pairs of connected neurons. We generalize the Boltzmann machine paradigm to a much wider class of interactions and energies, allowing multiple interactions of arbitrary order, and propose a learning algorithm for these generalized machines based on the theory of Gibbs fields and on parameter estimation for such fields. We give quasi-convergence results for all these algorithms within the framework of the theory of stochastic algorithms. The links sketched here between generalized Boltzmann machines and Markov field models provide the groundwork for designing generalized Boltzmann machines capable of efficiently performing low-level vision tasks. These Boltzmann vision modules are described in a forthcoming paper.
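
To fix ideas, the following is a minimal sketch, not the paper's algorithm, of a Boltzmann machine with the standard quadratic energy and a synchronous (parallel) stochastic update, in which every neuron is refreshed simultaneously from the current configuration. The weight matrix W, bias vector theta, temperature T, and the {0, 1} neuron states are conventional assumptions, not notation taken from this paper.

    # Sketch: quadratic-energy Boltzmann machine with synchronous updates.
    # All symbols (W, theta, T) are illustrative conventions.
    import numpy as np

    def energy(s, W, theta):
        """Quadratic energy E(s) = -1/2 s^T W s - theta^T s (W symmetric, zero diagonal)."""
        return -0.5 * s @ W @ s - theta @ s

    def synchronous_update(s, W, theta, T, rng):
        """Refresh every neuron simultaneously from the current state s."""
        h = W @ s + theta                      # local fields computed from the old state
        p_on = 1.0 / (1.0 + np.exp(-h / T))    # firing probabilities at temperature T
        return (rng.random(s.shape) < p_on).astype(float)

    # Example: a small random network relaxing under synchronous dynamics.
    rng = np.random.default_rng(0)
    n = 8
    W = rng.normal(size=(n, n))
    W = 0.5 * (W + W.T)                        # symmetric couplings
    np.fill_diagonal(W, 0.0)                   # no self-interaction
    theta = rng.normal(size=n)
    s = rng.integers(0, 2, size=n).astype(float)
    for _ in range(100):
        s = synchronous_update(s, W, theta, 1.0, rng)
    print("final state:", s, "energy:", energy(s, W, theta))

Under such parallel dynamics the stationary distribution generally differs from the Gibbs distribution associated with E itself, which is why the synchronous machine calls for its own equilibrium energy, as computed in the paper.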