Summary

This paper gives an account of a new method of constructing measures of divergence between probability measures; the divergence measures so constructed are called information radius measures. They are information-theoretic in character, and are based on the work of Rényi  and of Csiszár [2, 3]. The divergence measure K1 can be used to measure dissimilarity in numerical taxonomy, and its application to this field is discussed in Jardine and Sibson ; it was this application which originally motivated the study of information radius. Other forms of information radius are related to the variation distance, and the normal information radius discussed in § 3 is related to Mahalanobis' D2 statistic. This paper is in part intended to lay the mathematical foundations for , but because information radius appears to be of some general interest, its properties are investigated here further than is needed for the applications discussed in .