
dc.contributor.advisor      Lee, Chong U.
dc.contributor.advisor      Lim, Jae S.
dc.contributor.author       Tao, Julius L.
dc.date.accessioned         2024-09-03T21:13:49Z
dc.date.available           2024-09-03T21:13:49Z
dc.date.issued              2024-05
dc.date.submitted           2024-07-11T14:36:17.388Z
dc.identifier.uri           https://hdl.handle.net/1721.1/156638
dc.description.abstract     Eye contact is an essential social cue that conveys our attention to others but is difficult to maintain during video calls. Many existing methods to synthesize a gaze-corrected view involve estimating a 3D face model and projecting it into the desired camera view, which is too computationally expensive for most personal computers. By drawing inspiration from 2D methods of video frame interpolation, we wish to not only correct eye gaze but also better align the face towards the camera without this expensive 3D modeling. Our findings suggest that adding a second webcam opposite the first and interpolating between the two outer camera views can give realistic, gaze-aligned center views. We conclude that the prevailing approach of 3D modeling is surprisingly not necessary for gaze correction. Not only do 2D techniques suffice, but their synthesized frames can appear more natural than prior results. We believe that this work is a crucial step towards true-to-life viewpoint shift for live video conferences.
dc.publisher                Massachusetts Institute of Technology
dc.rights                   In Copyright - Educational Use Permitted
dc.rights                   Copyright retained by author(s)
dc.rights.uri               https://rightsstatements.org/page/InC-EDU/1.0/
dc.title                    Motion-Compensated Viewpoint Shift
dc.type                     Thesis
dc.description.degree       M.Eng.
dc.contributor.department   Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
mit.thesis.degree           Master
thesis.degree.name          Master of Engineering in Electrical Engineering and Computer Science

