Show simple item record

dc.contributor.advisor: Sertac Karaman (en_US)
dc.contributor.author: Guerra, Winter Joseph (en_US)
dc.contributor.other: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science (en_US)
dc.date.accessioned: 2020-03-24T15:36:04Z
dc.date.available: 2020-03-24T15:36:04Z
dc.date.copyright: 2019 (en_US)
dc.date.issued: 2019 (en_US)
dc.identifier.uri: https://hdl.handle.net/1721.1/124244
dc.description: This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. (en_US)
dc.description: Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019 (en_US)
dc.description: Cataloged from student-submitted PDF version of thesis. (en_US)
dc.description: Includes bibliographical references (pages 103-112). (en_US)
dc.description.abstract: In recent years, intensive research has centered on using small, perception-driven robotic systems (e.g., quadrotor vehicles) for complex tasks at operational speeds. Although much progress has been made in the fields of online planning, fast perception, and agile control, most robotic systems remain confined to controlled laboratory settings due to cost, safety, and repeatability concerns. In this thesis, we introduce several novel contributions that we believe can help the greater robotics community bring their robotic systems out of the lab and into the real world. First, we introduce FlightGoggles, a photorealistic sensor simulator for perception-driven robotic vehicles. FlightGoggles provides photorealistic exteroceptive sensor simulation using graphics assets generated with photogrammetry, and it can combine synthetic exteroceptive measurements generated in silico in real time with vehicle dynamics and proprioceptive measurements generated in motion by vehicle(s) in flight in a motion-capture facility. Second, we present the Blackbird Dataset, a large-scale dataset for UAV perception in aggressive flight, containing over 10 hours of flight data across 171 flights at velocities up to 13.8 m s⁻¹ in 5 environments with some dynamic elements. Third, we introduce a virtual reality framework for FlightGoggles that enables safe multi-agent or robot-human interaction by superimposing position data from multiple motion-capture spaces into a unified virtual reality environment. Fourth, we propose an extension of FlightGoggles that uses augmented reality for aircraft-in-the-loop experiments, aiming to aid sim-to-real transfer from simulated to real-world cameras. Lastly, we study applications of FlightGoggles in the greater robotics community through the AlphaPilot autonomous drone racing challenge and survey the approaches and results of the top AlphaPilot teams, which may be of independent interest. (en_US)
dc.description.statementofresponsibility: by Winter Joseph Guerra (en_US)
dc.format.extent: 112 pages (en_US)
dc.language.iso: eng (en_US)
dc.publisher: Massachusetts Institute of Technology (en_US)
dc.rights: MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source, but further reproduction or distribution in any format is prohibited without written permission. (en_US)
dc.rights.uri: http://dspace.mit.edu/handle/1721.1/7582 (en_US)
dc.subject: Electrical Engineering and Computer Science (en_US)
dc.title: Photorealistic sensor simulation for perception-driven robotics using virtual reality (en_US)
dc.type: Thesis (en_US)
dc.description.degree: M. Eng. (en_US)
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science (en_US)
dc.identifier.oclc: 1145019349 (en_US)
dc.description.collection: M.Eng. Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science (en_US)
dspace.imported: 2020-03-24T15:36:03Z (en_US)
mit.thesis.degree: Master (en_US)
mit.thesis.department: EECS (en_US)
