Demultiplexing illumination via low cost sensing and nanosecond coding
Author(s): Kadambi, Achuta; Bhandari, Ayush; Whyte, Refael; Dorrington, Adrian; Raskar, Ramesh
Several computer vision algorithms require a sequence of photographs taken under different illumination conditions, which has spurred development in the area of illumination multiplexing. Various techniques for optimizing the multiplexing process already exist, but they are geared toward regular or high-speed cameras. Such cameras are fast, but their coded exposures operate on the order of milliseconds. In this paper we propose a fusion of two popular contexts: time of flight range cameras and illumination multiplexing. Time of flight cameras are a low cost, consumer-oriented technology capable of acquiring range maps at 30 frames per second. Such cameras have a natural connection to conventional illumination multiplexing strategies, as both paradigms rely on the capture of multiple shots and synchronized illumination. While previous work on illumination multiplexing has exploited coding at millisecond intervals, we repurpose sensors that are ordinarily used in time of flight imaging to demultiplex via nanosecond coding strategies.
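The multiplexing idea the abstract refers to can be sketched in a few lines. The following is a generic illustration of classic illumination multiplexing, not the paper's nanosecond coding scheme: with n light sources and an invertible n-by-n code matrix W (each row specifies which sources contribute to one shot), every capture is a coded mixture of the single-source images, and the per-source images are recovered by inverting the code. The matrix values and image sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4          # number of light sources (assumed for illustration)
pixels = 6     # toy image size (assumed)

# Ground-truth single-source images, one row per light source.
single_source = rng.random((n, pixels))

# Hadamard-style +1/-1 code matrix: well conditioned, so inversion
# amplifies noise less than turning on one source per shot.
W = np.array([[1,  1,  1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1, -1, -1,  1]], dtype=float)

captures = W @ single_source              # coded (multiplexed) shots
recovered = np.linalg.solve(W, captures)  # demultiplex

assert np.allclose(recovered, single_source)
```

In the paper's setting the coding happens at nanosecond timescales inside the time of flight sensor rather than across millisecond-scale exposures, but the recovery principle of inverting a known code matrix is the same.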
Department: Massachusetts Institute of Technology. Media Laboratory; Program in Media Arts and Sciences (Massachusetts Institute of Technology)
2014 IEEE International Conference on Computational Photography (ICCP)
Institute of Electrical and Electronics Engineers (IEEE)
Kadambi, Achuta, Ayush Bhandari, Refael Whyte, Adrian Dorrington, and Ramesh Raskar. “Demultiplexing Illumination via Low Cost Sensing and Nanosecond Coding.” 2014 IEEE International Conference on Computational Photography (ICCP) (May 2014), Intel, Santa Clara, USA.
Author's final manuscript