Interactive image retrieval by mental matching

In "query-by-visual-example," the standard scenario in image retrieval, the "query image" resides in the database and is matched by the system against other images. Suppose, instead, that the query image is "external" and the matching is "mental": for instance, the image resides only in the mind of the user, or exists as an actual object or photograph. The user may seek versions of the same image (e.g., the same face) or images belonging to the same class (e.g., similar landscapes), and responds to a sequence of machine-generated queries designed to accelerate the search; for example, the user declares which of several displayed images is "closest" to his query. These similarity decisions are entirely subjective and user-dependent. I will discuss an interactive search engine based on information theory and statistical inference. The display algorithm involves a Bayesian relevance-feedback model and an optimality criterion based on conditional entropy. Performance is measured by the expected number of iterations necessary to match the identity (target search) or the class (category search) of the query. Designing metrics and response models consistent with human behavior is essential for achieving practical results with large databases, as illustrated with art and face databases.
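The target-search loop described above can be sketched in code. This is a minimal illustrative simulation, not the engine itself: images are represented by feature vectors, the user's "closest" choice is modeled by a softmax over distances to the mental target (the `beta` consistency parameter, the random feature data, and the greedy display search over sampled candidates are all assumptions for the sketch), the posterior over targets is updated by Bayes' rule, and each display is chosen to approximately minimize the expected entropy of the updated posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

def response_likelihood(display, features, beta=2.0):
    """P(user picks display[j] | target t), for every candidate target t.
    Softmax over negative distances: an assumed response model in which
    beta controls how consistently the user picks the truly closest image."""
    # dists[t, j] = distance from candidate target t to displayed image j
    dists = np.linalg.norm(features[:, None, :] - features[display][None, :, :], axis=2)
    logits = -beta * dists
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    return probs / probs.sum(axis=1, keepdims=True)  # shape (N, k)

def expected_entropy(posterior, display, features, beta=2.0):
    """Expected entropy of the posterior after observing the user's choice
    (the conditional-entropy criterion for ranking candidate displays)."""
    lik = response_likelihood(display, features, beta)  # (N, k)
    p_choice = posterior @ lik                          # marginal choice probabilities
    total = 0.0
    for j in range(len(display)):
        post_j = posterior * lik[:, j]
        z = post_j.sum()
        if z > 0:
            post_j /= z
            nz = post_j[post_j > 0]
            total += p_choice[j] * (-(nz * np.log(nz)).sum())
    return total

def choose_display(posterior, features, k=2, candidates=50):
    """Greedy approximation: sample random k-image displays and keep the
    one with the smallest expected posterior entropy."""
    best, best_h = None, np.inf
    for _ in range(candidates):
        d = rng.choice(len(posterior), size=k, replace=False)
        h = expected_entropy(posterior, d, features)
        if h < best_h:
            best, best_h = d, h
    return best

def search(features, target, max_iters=30, beta=2.0):
    """Simulated target search; returns the number of iterations used.
    The simulated user deterministically picks the displayed image
    nearest to the mental target."""
    N = len(features)
    posterior = np.full(N, 1.0 / N)
    for it in range(1, max_iters + 1):
        display = choose_display(posterior, features)
        pick = int(np.argmin(np.linalg.norm(features[display] - features[target], axis=1)))
        posterior *= response_likelihood(display, features, beta)[:, pick]
        posterior /= posterior.sum()
        if posterior.argmax() == target and posterior[target] > 0.9:
            return it
    return max_iters

features = rng.normal(size=(40, 5))  # 40 synthetic images with 5-d features
iters = search(features, target=7)
print(iters)
```

Averaging `iters` over many random targets gives exactly the performance measure discussed above: the expected number of iterations needed to match the identity of the query.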