| dc.contributor.advisor | Fredo Durand. | en_US |
| dc.contributor.author | Gross, Lee, M. Eng. Massachusetts Institute of Technology | en_US |
| dc.contributor.other | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science. | en_US |
| dc.date.accessioned | 2016-12-22T15:18:41Z | |
| dc.date.available | 2016-12-22T15:18:41Z | |
| dc.date.copyright | 2016 | en_US |
| dc.date.issued | 2016 | en_US |
| dc.identifier.uri | http://hdl.handle.net/1721.1/106016 | |
| dc.description | Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016. | en_US |
| dc.description | This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. | en_US |
| dc.description | Cataloged from student-submitted PDF version of thesis. | en_US |
| dc.description | Includes bibliographical references (page 53). | en_US |
| dc.description.abstract | Drones are often used for aerial photography. The main way people currently control them is with a joystick that governs the pitch, yaw, roll, and throttle of the drone. With the joystick, a user has to think not only in terms of the desired image but also about how the drone needs to move in order to capture it. We implemented a gestural system that allows the user to control a drone in a more intuitive manner that abstracts away low-level motor controls. The idea is to let the user manipulate the current frame using touch gestures to indicate the intended shot. The four gestures we support are dragging, pinching, two-finger rotation, and two-finger drag. The feedback consists of the live feed from the camera, as well as a preview of the indicated image transformation. When the user lifts their fingers, the commands are sent to the drone so it can execute the movements that will result in the desired image. The system works but has some limitations imposed by the drone's flight-path API. These limitations concern the path and the heading used during flight. The drone's final view is qualitatively close to the desired image as indicated by the gesture. | en_US |
| dc.description.statementofresponsibility | by Lee Gross. | en_US |
| dc.format.extent | 53 pages | en_US |
| dc.language.iso | eng | en_US |
| dc.publisher | Massachusetts Institute of Technology | en_US |
| dc.rights | M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission. | en_US |
| dc.rights.uri | http://dspace.mit.edu/handle/1721.1/7582 | en_US |
| dc.subject | Electrical Engineering and Computer Science. | en_US |
| dc.title | Multi-touch through-the-lens drone control | en_US |
| dc.type | Thesis | en_US |
| dc.description.degree | M. Eng. | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
| dc.identifier.oclc | 965829205 | en_US |