Parameter estimation with multiterminal data compression

Multiterminal estimation theory deals with an entirely new problem lying at the boundary between information theory and statistics: how much Fisher information can be attained under a constraint on the amount of Shannon information. The key idea is the inseparable combination of the information-theoretic universal coding problem with the statistical problem of maximum-likelihood parameter estimation. The main result explicitly constructs maximum-likelihood estimators attainable under a rate-constrained universal coding scheme, and shows that their asymptotic variance equals the inverse of the corresponding Fisher information. This may be regarded as a multiterminal generalization of the classical Cramér-Rao bound. Relevant properties and examples of these maximum-likelihood estimators are also given.
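For reference, the classical single-terminal bound that the above result generalizes can be stated as follows. The rate-constrained analogue in the final line is only a sketch of the relationship described in the abstract: the symbol $I_R(\theta)$ is notation introduced here for illustration, not the paper's own.

```latex
% Classical Cramér-Rao bound: for an unbiased estimator \hat{\theta}_n
% built from n i.i.d. samples X_1, ..., X_n drawn from p(x; \theta),
\[
\operatorname{Var}_\theta\!\bigl(\hat{\theta}_n\bigr) \;\ge\; \frac{1}{n\, I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}_\theta\!\left[\left(
\frac{\partial}{\partial\theta}\,\log p(X;\theta)\right)^{\!2}\right].
\]
% In the multiterminal setting, the data are compressed before estimation,
% so the attainable precision is governed by a rate-constrained quantity
% I_R(\theta) \le I(\theta)  (hypothetical notation), giving asymptotically
\[
\operatorname{Var}_\theta\!\bigl(\hat{\theta}_n\bigr) \;\sim\; \frac{1}{n\, I_R(\theta)}.
\]
```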