Motion-Compensated Viewpoint Shift
Author(s)
Tao, Julius L.
Advisor
Lee, Chong U.
Lim, Jae S.
Abstract
Eye contact is an essential social cue that conveys our attention to others, but it is difficult to maintain during video calls. Many existing methods for synthesizing a gaze-corrected view estimate a 3D face model and project it into the desired camera view, which is too computationally expensive for most personal computers. Drawing inspiration from 2D methods of video frame interpolation, we aim not only to correct eye gaze but also to better align the face towards the camera without this expensive 3D modeling. Our findings suggest that adding a second webcam opposite the first and interpolating between the two outer camera views can produce realistic, gaze-aligned center views. We conclude that the prevailing approach of 3D modeling is, surprisingly, not necessary for gaze correction. Not only do 2D techniques suffice, but their synthesized frames can appear more natural than prior results. We believe that this work is a crucial step towards true-to-life viewpoint shift for live video conferences.
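The thesis implementation is not reproduced in this record. As a rough illustration of the idea the abstract describes (synthesizing a center view by 2D motion-compensated interpolation between two outer webcam views), the sketch below uses OpenCV's Farnebäck dense optical flow. It is a hypothetical, minimal stand-in, not the author's method: the function name, the flow parameters, and the half-flow backward-warping approximation are all assumptions made for illustration.

```python
import cv2
import numpy as np


def interpolate_center_view(left_bgr: np.ndarray, right_bgr: np.ndarray) -> np.ndarray:
    """Approximate the halfway (center) view between two outer camera frames.

    Illustrative sketch only; assumes the two frames are the same size and
    roughly time-synchronized.
    """
    left_gray = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right_gray = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)

    # Dense optical flow in both directions between the outer views.
    flow_lr = cv2.calcOpticalFlowFarneback(left_gray, right_gray, None,
                                           0.5, 3, 25, 3, 5, 1.2, 0)
    flow_rl = cv2.calcOpticalFlowFarneback(right_gray, left_gray, None,
                                           0.5, 3, 25, 3, 5, 1.2, 0)

    h, w = left_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                                 np.arange(h, dtype=np.float32))

    # Backward-warp each outer view halfway toward the middle viewpoint,
    # approximating the mid-to-outer flow as half of the outer-to-outer flow.
    warped_left = cv2.remap(left_bgr,
                            grid_x + 0.5 * flow_rl[..., 0],
                            grid_y + 0.5 * flow_rl[..., 1],
                            cv2.INTER_LINEAR)
    warped_right = cv2.remap(right_bgr,
                             grid_x + 0.5 * flow_lr[..., 0],
                             grid_y + 0.5 * flow_lr[..., 1],
                             cv2.INTER_LINEAR)

    # Blend the two half-warped frames to approximate the center view.
    return cv2.addWeighted(warped_left, 0.5, warped_right, 0.5, 0.0)
```

Averaging the two half-warped frames is the crudest way to handle occlusions at the synthesized viewpoint; the thesis presumably applies a more careful motion-compensated interpolation, but the sketch conveys the 2D, model-free character of the approach.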
Date issued
2024-05
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology