Annotation Propagation in Large Image Databases via Dense Image Correspondence
Author(s)
Rubinstein, Michael; Liu, Ce; Freeman, William T.
Abstract
Our goal is to automatically annotate many images with a set of word tags and a pixel-wise map showing where each word tag occurs. Most previous approaches rely on a corpus of training images where each pixel is labeled. However, for large image databases, pixel labels are expensive to obtain and are often unavailable. Furthermore, when classifying multiple images, each image is typically solved for independently, which often results in inconsistent annotations across similar images. In this work, we incorporate dense image correspondence into the annotation model, allowing us to make do with significantly less labeled data and to resolve ambiguities by propagating inferred annotations from images with strong local visual evidence to images with weaker local evidence. We establish a large graphical model spanning all labeled and unlabeled images, then solve it to infer annotations, enforcing consistent annotations over similar visual patterns. Our model is optimized by efficient belief propagation algorithms embedded in an expectation-maximization (EM) scheme. Extensive experiments are conducted to evaluate the performance on several standard large-scale image datasets, showing that the proposed framework outperforms state-of-the-art methods. © 2012 Springer-Verlag.
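The paper optimizes a large graphical model with belief propagation inside an EM scheme over dense pixel-wise correspondences; that full pipeline is beyond a short sketch. As an illustrative simplification of the core propagation idea, the toy code below runs graph-based label propagation at the image level: labeled images clamp their label distributions, and unlabeled images repeatedly average their neighbors' beliefs, weighted by a hypothetical image-to-image similarity matrix (all data here is made up for illustration and is not from the paper).

```python
import numpy as np

def propagate_labels(adj, beliefs, labeled_mask, n_iters=50):
    """Propagate label distributions over an image-similarity graph.

    adj          : (N, N) symmetric nonnegative similarity matrix
                   (stand-in for dense-correspondence strengths).
    beliefs      : (N, K) initial per-image label distributions.
    labeled_mask : (N,) bool, True where the distribution is known.
    """
    beliefs = beliefs.copy()
    clamped = beliefs[labeled_mask].copy()
    # Row-normalize the adjacency so each update is a weighted average
    # of neighbor beliefs.
    row_sums = adj.sum(axis=1, keepdims=True)
    W = np.divide(adj, row_sums, out=np.zeros_like(adj), where=row_sums > 0)
    for _ in range(n_iters):
        beliefs = W @ beliefs
        beliefs[labeled_mask] = clamped  # re-clamp known annotations
        beliefs /= beliefs.sum(axis=1, keepdims=True)  # keep valid distributions
    return beliefs

# Toy example: 4 images in a chain; image 0 is labeled class 0,
# image 3 is labeled class 1, the middle two are unlabeled.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
beliefs = np.full((4, 2), 0.5)
beliefs[0] = [1.0, 0.0]
beliefs[3] = [0.0, 1.0]
labeled = np.array([True, False, False, True])
out = propagate_labels(adj, beliefs, labeled)
# Unlabeled image 1 ends up dominated by class 0 (its labeled
# neighbor), image 2 by class 1.
```

This captures only the consistency-enforcing intuition — annotations flow from images with strong evidence to visually similar images with weaker evidence — not the paper's actual pixel-level BP/EM optimization.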
Date issued
2012
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Publisher
Springer Nature
Citation
Rubinstein, Michael, Liu, Ce and Freeman, William T. 2012. "Annotation Propagation in Large Image Databases via Dense Image Correspondence."
Version: Author's final manuscript
ISSN
0302-9743
1611-3349