Incremental learning of mixture models for simultaneous estimation of class distribution and inter-class decision boundaries

In this paper, we propose a novel design for a high-performance Bayes classifier learned from a small number of observations. The two main challenges in building such a classifier are that the true functional form of the class-conditional densities is unknown and that there is too little data to estimate the classifier parameters accurately. Incremental learning of a Gaussian mixture model (GMM) is used to mitigate the first problem. Moreover, the classifier uses the training samples from all classes to evaluate how well a candidate mixture serves as the classifier for a specific class; this selection process eases the difficulty of accurate parameter estimation. The key trait of the proposed classifier is therefore its ability to simultaneously estimate the class-conditional densities and the inter-class decision boundaries to arbitrary precision. Our experimental results show that the proposed classifier not only outperforms conventional classifiers but also requires fewer parameters.
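The following is a minimal sketch of the general idea described above, not the paper's actual algorithm: each class density is modeled by a GMM whose size is grown one component at a time, and each candidate mixture is scored by the classification accuracy it yields on training samples from all classes when plugged into the Bayes decision rule. The incremental GMM update is approximated here by re-fitting scikit-learn's batch EM with a growing number of components, and names such as `fit_bayes_gmm` are hypothetical.

```python
# Sketch under stated assumptions: batch GaussianMixture stands in for the
# paper's incremental GMM learning; mixture size is selected per class by the
# accuracy of the resulting Bayes classifier on samples from all classes.
import numpy as np
from sklearn.mixture import GaussianMixture

def predict(models, priors, X):
    classes = sorted(models)
    # Bayes rule: pick the class maximizing log p(x | c) + log p(c).
    scores = np.column_stack([models[c].score_samples(X) + np.log(priors[c])
                              for c in classes])
    return np.asarray(classes)[np.argmax(scores, axis=1)]

def fit_bayes_gmm(X, y, max_components=5):
    classes = np.unique(y)
    priors = {c: float(np.mean(y == c)) for c in classes}
    # Start every class with a single Gaussian component.
    models = {c: GaussianMixture(n_components=1, random_state=0).fit(X[y == c])
              for c in classes}
    # Grow each class's mixture, keeping the size that best separates ALL classes.
    for c in classes:
        Xc = X[y == c]
        best_acc = np.mean(predict(models, priors, X) == y)
        for k in range(2, min(max_components, len(Xc)) + 1):
            gmm = GaussianMixture(n_components=k, random_state=0).fit(Xc)
            candidate = {**models, c: gmm}
            acc = np.mean(predict(candidate, priors, X) == y)
            if acc > best_acc:
                models[c], best_acc = gmm, acc
    return models, priors
```

A typical use would be `models, priors = fit_bayes_gmm(X_train, y_train)` followed by `predict(models, priors, X_test)`; evaluating candidate mixtures on all classes, rather than on each class in isolation, is what ties the density estimates to the inter-class decision boundaries.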