Detection error exponent for spatially dependent samples in random networks

The problem of binary hypothesis testing is considered when the measurements are drawn from a Markov random field (MRF) under each hypothesis. Spatial dependence of the measurements is incorporated by explicitly modeling the influence of the sensor node locations on the clique potential functions of each MRF hypothesis. The nodes are placed i.i.d. in areas that expand with the sample size. The asymptotic performance of hypothesis testing is analyzed through the Neyman-Pearson type-II error exponent. For acyclic dependency graphs, the error exponent is expressed as the limit of a functional over the dependency edges of the MRF hypotheses, and this limit is then evaluated using the law of large numbers for graph functionals.
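
As a minimal sketch of the quantity being analyzed (the symbols \beta_n, p_{0,n}, p_{1,n}, V_n, and E_n are illustrative notation, not the paper's): under the Neyman-Pearson criterion with a fixed type-I error bound \alpha, a Stein-lemma-type argument identifies the type-II error exponent with the Kullback-Leibler divergence rate between the densities p_{0,n} and p_{1,n} of the n measurements under the two hypotheses,

\[
K \;=\; -\lim_{n \to \infty} \frac{1}{n} \log \beta_n(\alpha)
  \;=\; \lim_{n \to \infty} \frac{1}{n}\, D\bigl(p_{0,n} \,\|\, p_{1,n}\bigr),
\]

where \beta_n(\alpha) is the minimum type-II error at sample size n. Assuming both hypotheses factorize over the same acyclic dependency graph with vertex set V_n and edge set E_n, the joint density is a product of node marginals and pairwise edge ratios, so the divergence decomposes into node and edge terms:

\[
D\bigl(p_{0,n} \,\|\, p_{1,n}\bigr)
  \;=\; \sum_{v \in V_n} D\bigl(p_0(x_v) \,\|\, p_1(x_v)\bigr)
  \;+\; \sum_{(u,v) \in E_n} \mathbb{E}_{p_0}\!\left[
        \log \frac{p_0(x_u, x_v)\, p_1(x_u)\, p_1(x_v)}
                  {p_1(x_u, x_v)\, p_0(x_u)\, p_0(x_v)} \right].
\]

This edge-sum form is the kind of graph functional whose per-node limit, as the nodes are placed i.i.d. in expanding areas, can be evaluated with the law of large numbers for graph functionals.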
