Content-based retrieval has become a popular and powerful paradigm for searching in multimedia collections, especially in large collections of images. However, such queries require that one or even several reference images be available before the search process starts. These reference images must be close to the desired result so that the user can use them to express her information need. If no such reference images are available, or if the information need is covered only by parts of the query object, the result usually does not meet the user's expectations. Therefore, more flexible user interfaces are needed that allow users to sketch a query image by hand and to dynamically select regions of interest from a given query image. In this paper, we present a novel approach to query by sketch in which interactive paper and image similarity search are seamlessly combined. It is based on the iPaper/iServer system of ETH Zurich and the ISIS/OSIRIS content-based image retrieval system of the University of Basel. The paper presents the integrated system, which has already been applied very successfully to the development of an interactive museum catalogue. Moreover, it reports on ongoing activities that aim at extending the system to support handwritten sketches, gestures and/or dynamic region selection in order to make the retrieval process more flexible and less dependent on existing query objects.
Springmann, M., Ispas, A., Schuldt, H., Norrie, M.C. and Signer, B.: "Towards Query by Sketch", Second DELOS Conference on Digital Libraries, Pisa, Italy, December 2007.