The Edit Distance as a Measure of Perceived Rhythmic Similarity

The 'edit distance' (or 'Levenshtein distance') between two data sets is defined as the minimum number of editing operations (insertions, deletions, and substitutions) required to transform one data set into the other (Orpen and Huron, 1992). This measure of distance has been applied frequently and successfully in music information retrieval, but rarely in predicting human perception of distance. In this study, we investigate the effectiveness of the edit distance as a predictor of perceived rhythmic dissimilarity under simple rhythmic alterations. Approaching rhythms as sequences of pulses that are either onsets or silences, we study two types of alterations. The first experiment is designed to test the model's accuracy for rhythms that are relatively similar: whether rhythmic variations with the same edit distance to a source rhythm are also perceived as equally similar by human subjects. In addition, we observe whether the salience of an edit operation is affected by its metrical placement in the rhythm. Instead of using a rhythm that regularly subdivides a 4/4 meter, our source rhythm is a syncopated 16-pulse rhythm, the son. Results show a high correlation between the predictions of the edit distance model and human similarity judgments (r = 0.87); a higher correlation than for the well-known generative theory of tonal music (r = 0.64). In the second experiment, we seek to assess the accuracy of the edit distance model in predicting relatively dissimilar rhythms. The stimuli used are random permutations of the son's inter-onset intervals: 3-3-4-2-4. The results again indicate that the edit distance correlates well with the perceived rhythmic dissimilarity judgments of the subjects (r = 0.76). To gain insight into the relationships between the individual rhythms, the results are also presented by means of graphic phylogenetic trees.
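The edit distance described above can be computed with the standard dynamic-programming recurrence. The sketch below is a minimal illustration, not the authors' implementation: it treats each rhythm as a 16-character string of onsets ('1') and silences ('0'), an encoding assumed from the abstract's description of rhythms as pulse sequences; the son pattern shown is derived from the stated inter-onset intervals 3-3-4-2-4.

```python
def edit_distance(a, b):
    """Minimum number of insertions, deletions, and substitutions
    needed to transform string a into string b (Levenshtein distance),
    using a single-row dynamic-programming table."""
    m, n = len(a), len(b)
    # row[j] holds the distance between a[:i] and b[:j] as i advances
    row = list(range(n + 1))
    for i in range(1, m + 1):
        prev_diag = row[0]  # value of row[j-1] from the previous iteration
        row[0] = i
        for j in range(1, n + 1):
            cur = row[j]
            row[j] = min(
                row[j] + 1,                          # deletion
                row[j - 1] + 1,                      # insertion
                prev_diag + (a[i - 1] != b[j - 1]),  # substitution (or match)
            )
            prev_diag = cur
        # row now holds distances between a[:i] and every prefix of b
    return row[n]

# Son rhythm as 16 pulses: onsets at 0, 3, 6, 10, 12 (IOIs 3-3-4-2-4)
son = "1001001000101000"
# Hypothetical variation: one onset replaced by a silence
variation = "1001001000001000"
print(edit_distance(son, variation))
```

A single flipped pulse yields distance 1; moving an onset to an adjacent pulse involves two differing positions and yields distance 2, which is the kind of graded prediction the experiments compare against listeners' similarity ratings.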
