Error Exponents for Target-Class Detection in a Sensor Network

We study the target-class detection performance of a wireless sensor network with a structured node topology. The target is assumed to lie in the far field of the network at an angle θ, which may be known or unknown. The target produces a random signal field that is spatially correlated and depends on θ and on the target's class i, i ∈ {0, 1}. We analyze the Neyman-Pearson detection error exponent for this scenario using large deviations theory. When θ is known, we derive a closed-form analytic expression for the miss-probability error exponent and show that it is monotonically decreasing in the node spacing d and bounded as d → 0. When θ is unknown, we estimate it using the Generalized Likelihood Ratio Test (GLRT) and characterize the resulting error exponent through both analytic techniques and numerical simulations.
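For concreteness, the quantities referred to above can be written in a standard form; the notation below (n sensor observations y, class-conditional densities p_i, false-alarm level α, threshold τ) is illustrative and not taken from the paper itself. Under a Neyman-Pearson constraint P_FA ≤ α, the miss-probability error exponent is

\[ \mathcal{K} \;=\; \lim_{n\to\infty} -\frac{1}{n}\,\log P_{\mathrm{miss}}(n), \]

and when θ is unknown the GLRT decides between the two classes by maximizing each likelihood over θ:

\[ T_n(\mathbf{y}) \;=\; \frac{\max_{\theta}\, p_1(\mathbf{y}\mid\theta)}{\max_{\theta}\, p_0(\mathbf{y}\mid\theta)} \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \tau. \]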