Show simple item record

dc.contributor.author: Kellnhofer, Petr
dc.contributor.author: Didyk, Piotr
dc.contributor.author: Myszkowski, Karol
dc.contributor.author: Hefeeda, Mohamed M.
dc.contributor.author: Seidel, Hans-Peter
dc.contributor.author: Matusik, Wojciech
dc.date.accessioned: 2017-08-31T18:09:30Z
dc.date.available: 2017-08-31T18:09:30Z
dc.date.issued: 2016-07
dc.identifier.issn: 0730-0301
dc.identifier.uri: http://hdl.handle.net/1721.1/111081
dc.description.abstract: Producing a high-quality stereoscopic impression on current displays is a challenging task. The content has to be carefully prepared to maintain visual comfort, which typically comes at the cost of depth reproduction quality. In this work, we show that this problem can be significantly alleviated when the eye fixation regions can be roughly estimated. We propose a new method for stereoscopic depth adjustment that utilizes eye tracking or other gaze-prediction information. The key idea that distinguishes our approach from previous work is to apply gradual depth adjustments during eye fixation, so that they remain unnoticeable. To this end, we measure the limits imposed on the speed of disparity changes in various depth-adjustment scenarios and formulate a new model that can guide such seamless stereoscopic content processing. Based on this model, we propose a real-time controller that applies local manipulations to stereoscopic content to find the optimum between depth reproduction and visual comfort. We show that the controller is largely immune to the limitations of low-cost eye-tracking solutions. We also demonstrate the benefits of our model in offline applications, such as stereoscopic movie production, where skillful directors can reliably guide and predict viewers' attention, or where attended image regions are identified during eye-tracking sessions. We validate both our model and the controller in a series of user experiments, which show significant improvements in depth perception without sacrificing visual quality when our techniques are applied. [en_US]
dc.language.iso: en_US
dc.publisher: Association for Computing Machinery (ACM) [en_US]
dc.relation.isversionof: http://dx.doi.org/10.1145/2897824.2925866 [en_US]
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/ [en_US]
dc.source: Other univ. web domain [en_US]
dc.title: GazeStereo3D: seamless disparity manipulations [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Kellnhofer, Petr, et al. "GazeStereo3D: Seamless Disparity Manipulations." ACM Transactions on Graphics 35, 4 (July 2016): 1–13. © 2016 The Authors [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science [en_US]
dc.contributor.mitauthor: Kellnhofer, Petr
dc.contributor.mitauthor: Matusik, Wojciech
dc.relation.journal: ACM Transactions on Graphics [en_US]
dc.eprint.version: Author's final manuscript [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dspace.orderedauthors: Kellnhofer, Petr; Didyk, Piotr; Myszkowski, Karol; Hefeeda, Mohamed M.; Seidel, Hans-Peter; Matusik, Wojciech [en_US]
dspace.embargo.terms: N [en_US]
dc.identifier.orcid: https://orcid.org/0000-0003-0212-5643
mit.license: OPEN_ACCESS_POLICY [en_US]

