Show simple item record

dc.contributor.author	Noraky, James
dc.contributor.author	Sze, Vivienne
dc.date.accessioned	2021-03-09T16:52:26Z
dc.date.available	2021-03-09T16:52:26Z
dc.date.issued	2020-06
dc.identifier.issn	1051-8215
dc.identifier.issn	1558-2205
dc.identifier.uri	https://hdl.handle.net/1721.1/130109
dc.description.abstract	Depth sensing is useful in a variety of applications that range from augmented reality to robotics. Time-of-flight (TOF) cameras are appealing because they obtain dense depth measurements with minimal latency. However, for many battery-powered devices, the illumination source of a TOF camera is power hungry and can limit the battery life of the device. To address this issue, we present an algorithm that lowers the power for depth sensing by reducing the usage of the TOF camera and estimating depth maps using concurrently collected images. Our technique also adaptively controls the TOF camera and enables it when an accurate depth map cannot be estimated. To ensure that the overall system power for depth sensing is reduced, we design our algorithm to run on a low power embedded platform, where it outputs 640 × 480 depth maps at 30 frames per second. We evaluate our approach on several RGB-D datasets, where it produces depth maps with an overall mean relative error of 0.96% and reduces the usage of the TOF camera by 85%. When used with commercial TOF cameras, we estimate that our algorithm can lower the total power for depth sensing by up to 73%.	en_US
dc.publisher	Institute of Electrical and Electronics Engineers (IEEE)	en_US
dc.relation.isversionof	http://dx.doi.org/10.1109/tcsvt.2019.2907904	en_US
dc.rights	Creative Commons Attribution-Noncommercial-Share Alike	en_US
dc.rights.uri	http://creativecommons.org/licenses/by-nc-sa/4.0/	en_US
dc.source	Prof. Sze via Phoebe Ayers	en_US
dc.title	Low Power Depth Estimation of Rigid Objects for Time-of-Flight Imaging	en_US
dc.type	Article	en_US
dc.identifier.citation	Noraky, James and Vivienne Sze. "Low Power Depth Estimation of Rigid Objects for Time-of-Flight Imaging." IEEE Transactions on Circuits and Systems for Video Technology 30, 6 (June 2020): 1524-1534. © 2020 IEEE	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science	en_US
dc.relation.journal	IEEE Transactions on Circuits and Systems for Video Technology	en_US
dc.eprint.version	Author's final manuscript	en_US
dc.type.uri	http://purl.org/eprint/type/JournalArticle	en_US
eprint.status	http://purl.org/eprint/status/PeerReviewed	en_US
dspace.date.submission	2021-03-05T15:25:26Z
mit.journal.volume	30	en_US
mit.journal.issue	6	en_US
mit.license	OPEN_ACCESS_POLICY
mit.metadata.status	Complete

