On Combining Dissimilarity Representations

For learning purposes, representations of real-world objects can be built using the concept of dissimilarity (distance). In such a case, an object is characterized in a relative way, i.e. by its dissimilarities to a set of selected prototypes. Such dissimilarity representations turn out to be more practical for some pattern recognition problems. When experts cannot decide on a single dissimilarity measure, a number of them may be studied in parallel. We investigate two possibilities: combining the dissimilarity representations themselves, or combining classifiers built on each of them separately. Our experiments on a handwritten digit set demonstrate that when the dissimilarity representations are of a different nature, a much better performance can be obtained by their combination than by the individual representations alone.
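As a concrete illustration of the two strategies, the Python sketch below builds two dissimilarity representations of the same data and compares (a) averaging the representations before training a single classifier with (b) averaging the posterior probabilities of classifiers trained on each representation separately. This is a minimal sketch, not the setup used in the paper: the Euclidean and city-block measures, the toy data, the prototype selection, and the linear classifier are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Two-class toy data (a hypothetical stand-in for the handwritten digit set).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 10)), rng.normal(1.0, 1.0, (100, 10))])
y = np.repeat([0, 1], 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

prototypes = X_tr[::10]  # small prototype (representation) set R

def represent(A, metric):
    """Dissimilarity representation D(A, R) under a given measure."""
    return cdist(A, prototypes, metric=metric)

# Two dissimilarity measures of a different nature (illustrative choice).
metrics = ("euclidean", "cityblock")
reps_tr = [represent(X_tr, m) for m in metrics]
reps_te = [represent(X_te, m) for m in metrics]

# (a) Combine the representations: average the scaled dissimilarity matrices
#     and train a single classifier in the combined dissimilarity space.
scales = [d.mean() for d in reps_tr]
avg_tr = np.mean([d / s for d, s in zip(reps_tr, scales)], axis=0)
avg_te = np.mean([d / s for d, s in zip(reps_te, scales)], axis=0)
clf = LinearDiscriminantAnalysis().fit(avg_tr, y_tr)
print("combined representations:", clf.score(avg_te, y_te))

# (b) Combine the classifiers: train one classifier per representation and
#     average their posterior probabilities (the mean combining rule).
posteriors = [
    LinearDiscriminantAnalysis().fit(d_tr, y_tr).predict_proba(d_te)
    for d_tr, d_te in zip(reps_tr, reps_te)
]
y_pred = np.mean(posteriors, axis=0).argmax(axis=1)
print("combined classifiers:    ", (y_pred == y_te).mean())
```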
