Distilled Collections from Textual Image Queries

Hadar Averbuch-Elor, Yunhai Wang*, Yiming Qian, Minglun Gong, Johannes Kopf, Hao Zhang, Daniel Cohen-Or

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


We present a distillation algorithm that operates on a large, unstructured, and noisy collection of internet images returned from an online object query. We introduce the notion of a distilled set, which is a clean, coherent, and structured subset of inlier images. In addition, the object of interest is properly segmented out throughout the distilled set. Our approach is unsupervised, built on a novel clustering scheme, and solves the distillation and object segmentation problems simultaneously. In essence, instead of distilling the collection of images, we distill a collection of loosely cutout foreground "shapes", which may or may not contain the queried object. Our key observation, which motivated our clustering scheme, is that outlier shapes are expected to be random in nature, whereas inlier shapes, which tightly enclose the object of interest, tend to be well supported by similar shapes captured in similar views. We analyze the commonalities among candidate foreground segments, without aiming to analyze their semantics, simply by clustering similar shapes and considering only the most significant clusters representing non-trivial shapes. We show that when tuned conservatively, our distillation algorithm is able to extract a near-perfect subset of true inliers. Furthermore, we show that our technique scales well in the sense that the precision rate remains high as the collection grows. We demonstrate the utility of our distillation results with a number of interesting graphics applications.
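To make the shape-clustering idea concrete, here is a minimal sketch of the outlier-rejection step. This is not the authors' method: the paper solves distillation and segmentation jointly, whereas the sketch assumes candidate foreground masks are already given, uses a simple occupancy-grid descriptor as a stand-in for the paper's shape comparison, and substitutes off-the-shelf density-based clustering (DBSCAN). The helper names (shape_descriptor, distill) and all parameter values are hypothetical.

import numpy as np
from sklearn.cluster import DBSCAN

def shape_descriptor(mask, grid=32):
    # Hypothetical stand-in descriptor: crop the binary foreground mask to
    # its bounding box, resample onto a fixed grid x grid raster, and
    # flatten, so shapes of different sizes become comparable vectors.
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return np.zeros(grid * grid)
    crop = mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1].astype(float)
    ry = np.linspace(0, crop.shape[0] - 1, grid).astype(int)
    rx = np.linspace(0, crop.shape[1] - 1, grid).astype(int)
    return crop[np.ix_(ry, rx)].ravel()

def distill(masks, eps=0.35, min_cluster=5):
    # Cluster candidate shapes: per the paper's key observation, outlier
    # shapes are random and should fall into no dense cluster, while
    # inlier shapes (similar views of the queried object) form
    # well-supported clusters. DBSCAN labels noise points as -1.
    X = np.stack([shape_descriptor(m) for m in masks])
    X /= np.linalg.norm(X, axis=1, keepdims=True) + 1e-9
    labels = DBSCAN(eps=eps, min_samples=min_cluster).fit_predict(X)
    # Keep only indices whose shape belongs to a significant cluster.
    return [i for i, label in enumerate(labels) if label != -1]

Tuning eps conservatively (smaller values) mirrors the paper's conservative setting: fewer images survive, but those that do are supported by many near-identical shapes, which is what drives the reported high precision.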

Original language: English
Pages (from-to): 131-142
Number of pages: 12
Journal: Computer Graphics Forum
Issue number: 2
State: Published - 1 May 2015


