Interactive search fusion methods for video database retrieval
Abstract
In this paper, we investigate a new method for video database retrieval based on interactive search fusion. Recent video analysis techniques have enabled the extraction of a variety of descriptors, including features, concepts, clusters, classification results, speech and textual terms, and MPEG-7 metadata. However, given an information need, users face the daunting task of formulating queries over these multiple disparate data sources in order to retrieve the desired video content. We explore a novel approach based on search fusion in which the user interactively builds a query by sequentially choosing among the descriptors and data sources and by selecting from various combining and score aggregation functions to fuse the results of the individual searches. For example, the system allows the building of queries such as "retrieve video clips that have the color of beach scenes, the detection of sky, and the detection of water." We present the search fusion method and evaluate its performance on a large video database.
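To make the idea of score aggregation concrete, the following is a minimal sketch of fusing per-descriptor search results. The descriptor names, score ranges, and aggregation functions here are illustrative assumptions, not the paper's exact implementation.

```python
# Minimal sketch of score-level search fusion, assuming each individual search
# returns a dict of clip_id -> normalized score in [0, 1]. Names are hypothetical.

def fuse(result_lists, aggregate):
    """Fuse several search result lists by aggregating per-clip scores."""
    per_clip = {}
    for results in result_lists:                     # one dict per descriptor search
        for clip_id, score in results.items():
            per_clip.setdefault(clip_id, []).append(score)
    # Aggregate each clip's scores (e.g., max, min, or average) and rank descending.
    combined = {clip: aggregate(scores) for clip, scores in per_clip.items()}
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)

# Example: combine a beach-color search with sky and water concept detections.
color_beach  = {"clip_01": 0.9, "clip_02": 0.4, "clip_03": 0.7}
sky_detect   = {"clip_01": 0.8, "clip_03": 0.6, "clip_04": 0.5}
water_detect = {"clip_01": 0.7, "clip_02": 0.9, "clip_03": 0.5}

ranked = fuse([color_beach, sky_detect, water_detect],
              aggregate=lambda s: sum(s) / len(s))   # average as the aggregation function
print(ranked)                                        # clips ranked by fused score
```

In an interactive setting, the user would re-run this fusion step after each new search, choosing a different aggregation function (e.g., max for an OR-like combination, min for an AND-like one) as the query is refined.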