A Dataset of Multi-Illumination Images in the Wild
Author(s)
Murmann, Lukas; Gharbi, Michael Yanis; Aittala, Miika; Durand, Frederic
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike (Open Access Policy)
Abstract
Collections of images under a single, uncontrolled illumination have enabled the rapid advancement of core computer vision tasks like classification, detection, and segmentation. But even with modern learning techniques, many inverse problems involving lighting and material understanding remain too severely ill-posed to be solved with single-illumination datasets: the data simply does not contain the necessary supervisory signals. Multi-illumination datasets are notoriously hard to capture, so the data is typically collected at small scale, in controlled environments, using either multiple light sources or robotic gantries. This leads to image collections that are not representative of the variety and complexity of real-world scenes. We introduce a new multi-illumination dataset of more than 1000 real scenes, each captured in high dynamic range and high resolution under 25 lighting conditions. We demonstrate the richness of this dataset by training state-of-the-art models for three challenging applications: single-image illumination estimation, image relighting, and mixed-illuminant white balance.
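To illustrate why per-scene multi-illumination captures are useful for applications like relighting: light transport is linear in the illumination, so an image of the scene under any mixture of the captured sources is simply a weighted sum of the single-source basis images. The sketch below shows this principle in isolation; the array shapes and the toy data are assumptions for illustration, not the dataset's actual file format or API.

```python
import numpy as np

def relight(basis_images: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Synthesize a novel lighting condition from single-light basis images.

    Because light transport is linear in the illumination, an image under a
    mixture of the captured sources equals the weighted sum of the per-source
    basis images (assuming linear, i.e. HDR, pixel values).

    basis_images: (N, H, W, 3) array of linear images, one per light source.
    weights:      (N,) non-negative mixing weights, one per light source.
    """
    basis_images = np.asarray(basis_images, dtype=np.float64)
    weights = np.asarray(weights, dtype=np.float64)
    # tensordot contracts the light-source axis: result has shape (H, W, 3).
    return np.tensordot(weights, basis_images, axes=1)

# Toy example: 25 lighting conditions of a small 4x4 RGB "scene".
rng = np.random.default_rng(0)
basis = rng.random((25, 4, 4, 3))

# Turn on two of the sources at half strength each.
w = np.zeros(25)
w[3] = 0.5
w[17] = 0.5
mixed = relight(basis, w)
```

This linearity is what makes a 25-light basis per scene valuable: it spans a continuous space of lighting conditions, not just the 25 that were physically captured.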
Date issued
2019-11
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Proceedings of the IEEE International Conference on Computer Vision
Publisher
IEEE
Citation
Murmann, Lukas et al. "A Dataset of Multi-Illumination Images in the Wild." Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Korea (South), 27 Oct.–2 Nov. 2019. IEEE. © 2019 The Author(s)
Version: Author's final manuscript
ISSN
1063-6919