Supervised Classification With Variable Kernel Estimators

Assuming uniform losses, Bayes classification yields the lowest possible misclassification rate by definition. To carry out supervised classification in the Bayes sense, kernel density estimators with variable width are used. In their standard form, however, they require every learnt pattern to be stored, which often exceeds the available hardware resources. The proposed variable kernel algorithm is therefore based on clusters, whose parameters are determined so as to minimize the final misclassification rate. This rate is evaluated with a cross-validation-type procedure in order to avoid overfitting. Experimental results show that the number of clusters and their parameters can be determined while taking into account the hardware constraints that are unavoidable in neural implementations.
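As a rough illustration of the approach described above, the sketch below builds a per-class variable-width Gaussian kernel density estimate over a small set of cluster centres and classifies by the Bayes rule (argmax of prior times class-conditional density). This is only a minimal sketch under stated assumptions: the centres come from plain k-means rather than the paper's error-driven clustering, and each centre's bandwidth is set heuristically to the distance to its nearest neighbouring centre, which is not necessarily the paper's rule.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means used here only to pick cluster centres per class."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):          # keep old centre if cluster emptied
                centres[j] = pts.mean(0)
    return centres

class VariableKernelClassifier:
    """Bayes classifier from per-class variable-width Gaussian KDEs.

    Each class is summarised by at most k cluster centres; each centre
    gets its own bandwidth (a variable-kernel heuristic, assumed here:
    distance to the nearest other centre of the same class).
    """
    def __init__(self, k=5):
        self.k = k

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_, self.priors_ = {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            centres = kmeans(Xc, min(self.k, len(Xc)), seed=int(c))
            d = np.sqrt(((centres[:, None] - centres[None, :]) ** 2).sum(-1))
            np.fill_diagonal(d, np.inf)
            h = d.min(1)
            h[~np.isfinite(h)] = 1.0       # lone centre: unit bandwidth
            self.models_[c] = (centres, np.clip(h, 1e-3, None))
            self.priors_[c] = len(Xc) / len(X)
        return self

    def _density(self, X, c):
        centres, h = self.models_[c]
        dim = X.shape[1]
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        norm = (2 * np.pi) ** (dim / 2) * h ** dim
        return (np.exp(-0.5 * d2 / h ** 2) / norm).mean(1)

    def predict(self, X):
        post = np.stack([self.priors_[c] * self._density(X, c)
                         for c in self.classes_], axis=1)
        return self.classes_[post.argmax(1)]
```

Storing only k centres and bandwidths per class, instead of every training pattern, is what makes the estimator compatible with tight memory budgets.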