Demultiplexing illumination via low cost sensing and nanosecond coding
Author(s)
Kadambi, Achuta; Bhandari, Ayush; Whyte, Refael; Dorrington, Adrian; Raskar, Ramesh
Download: Raskar_Demultiplexing.pdf (3.268 MB)
Open Access Policy
Creative Commons Attribution-Noncommercial-Share Alike
Terms of use
Metadata
Abstract
Several computer vision algorithms require a sequence of photographs taken in different illumination conditions, which has spurred development in the area of illumination multiplexing. Various techniques for optimizing the multiplexing process already exist, but are geared toward regular or high speed cameras. Such cameras are fast, but code on the order of milliseconds. In this paper we propose a fusion of two popular contexts, time of flight range cameras and illumination multiplexing. Time of flight cameras are a low cost, consumer-oriented technology capable of acquiring range maps at 30 frames per second. Such cameras have a natural connection to conventional illumination multiplexing strategies as both paradigms rely on the capture of multiple shots and synchronized illumination. While previous work on illumination multiplexing has exploited coding at millisecond intervals, we repurpose sensors that are ordinarily used in time of flight imaging to demultiplex via nanosecond coding strategies.
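To make the multiplexing idea concrete, here is a minimal generic sketch (not the paper's nanosecond coding scheme, and not tied to any particular camera): each shot records a coded combination of light sources, and the per-source images are recovered by inverting the multiplexing matrix. The 0/1 code matrix below is a hypothetical example chosen only to be invertible.

```python
import numpy as np

# Hypothetical single-source images: n light sources, 16 pixels each.
n = 4
rng = np.random.default_rng(0)
x = rng.random((n, 16))

# 0/1 multiplexing code: rows = shots, columns = which sources are on.
# Chosen for illustration; real designs optimize SNR (e.g. Hadamard-based).
H = np.array([[1, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 1]], dtype=float)

y = H @ x                      # multiplexed measurements, one row per shot
x_hat = np.linalg.solve(H, y)  # demultiplex by inverting the code

assert np.allclose(x, x_hat)
```

In practice the recovery is only as good as the conditioning of the code matrix and the sensor noise model, which is exactly what multiplexing-optimization work (and, here, the choice of nanosecond-scale codes on time of flight hardware) is about.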
Date issued
2014-05
Department
Massachusetts Institute of Technology. Media Laboratory; Program in Media Arts and Sciences (Massachusetts Institute of Technology)
Journal
2014 IEEE International Conference on Computational Photography (ICCP)
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Kadambi, Achuta, Ayush Bhandari, Refael Whyte, Adrian Dorrington, and Ramesh Raskar. “Demultiplexing Illumination via Low Cost Sensing and Nanosecond Coding.” 2014 IEEE International Conference on Computational Photography (ICCP) (May 2014), Intel, Santa Clara, USA.
Version: Author's final manuscript
ISBN
978-1-4799-5188-8