The Divergence and Bhattacharyya Distance Measures in Signal Selection

Minimization of the error probability to determine optimum signals is often difficult to carry out. Consequently, several suboptimum performance measures that are easier to evaluate and manipulate than the error probability have been studied. In this partly tutorial paper, we compare the properties of an often-used measure, the divergence, with a new measure that we have called the Bhattacharyya distance. This new distance measure is often easier to evaluate than the divergence. In the problems we have worked, it gives results that are at least as good as, and often better than, those given by the divergence.
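For reference, the two measures compared in the paper are usually defined as follows; these are the standard definitions, stated here for convenience, and the symbols $p_0$, $p_1$, $J$, and $B$ are our notation rather than a quotation from the paper. For densities $p_0(x)$ and $p_1(x)$ under the two hypotheses,

\[
J(p_0, p_1) = \int \bigl[ p_1(x) - p_0(x) \bigr] \ln \frac{p_1(x)}{p_0(x)} \, dx \qquad \text{(divergence)}
\]

\[
B(p_0, p_1) = -\ln \int \sqrt{p_0(x)\, p_1(x)} \, dx \qquad \text{(Bhattacharyya distance)}
\]

The quantity $\rho = \int \sqrt{p_0 p_1}\, dx = e^{-B}$ is the Bhattacharyya coefficient. One standard reason such measures serve as surrogates for the error probability is the bound $P_e \le \tfrac{1}{2} e^{-B}$ for equally likely hypotheses, a special case of the Chernoff bound.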
