Efficient Stochastic Source Coding and an Application to a Bayesian Network Source Model

In this paper, we introduce a new algorithm called "bits-back coding" that makes stochastic source codes efficient. For a given one-to-many source code, we show that this algorithm can actually be more efficient than the algorithm that always picks the shortest codeword. Optimal efficiency is achieved when codewords are chosen according to the Boltzmann distribution based on the codeword lengths. It turns out that a commonly used technique for determining parameters, maximum-likelihood estimation, actually minimizes the bits-back coding cost when codewords are chosen according to the Boltzmann distribution. A tractable approximation to maximum-likelihood estimation, the generalized expectation-maximization algorithm, minimizes the bits-back coding cost. After presenting a binary Bayesian network model that assigns exponentially many codewords to each symbol, we show how a tractable approximation to the Boltzmann distribution can be used for bits-back coding. We illustrate the performance of bits-back coding using non-synthetic data with a binary Bayesian network source model that produces 2^60 possible codewords for each input symbol. The rate for bits-back coding is nearly one half of that obtained by picking the shortest codeword.
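The key quantity in the abstract, the bits-back coding cost under a Boltzmann choice of codewords, can be illustrated with a minimal sketch. Assuming codeword lengths are given in bits, the Boltzmann distribution weights codeword i proportionally to 2^(-l_i), and the net cost is the expected codeword length minus the entropy of the choice distribution (the bits "gotten back"). The function name below is hypothetical, chosen for illustration only:

```python
import math

def bits_back_cost(lengths):
    """Net bits-back coding cost for one symbol, given its codeword
    lengths in bits, when codewords are chosen from the Boltzmann
    distribution q_i proportional to 2^(-l_i)."""
    weights = [2.0 ** -l for l in lengths]
    Z = sum(weights)
    q = [w / Z for w in weights]
    # Expected codeword length under q.
    expected_length = sum(qi * li for qi, li in zip(q, lengths))
    # Entropy of q: the auxiliary bits recovered by the decoder.
    recovered = -sum(qi * math.log2(qi) for qi in q)
    # The difference simplifies analytically to -log2(Z).
    return expected_length - recovered
```

Because the net cost equals -log2(sum_i 2^(-l_i)), it is never larger than the shortest codeword length, which is consistent with the abstract's claim that bits-back coding can beat always picking the shortest codeword; for example, two codewords of length 2 bits each yield a net cost of only 1 bit.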
