Show simple item record

dc.contributor.advisor	Torralba, Antonio
dc.contributor.author	Ren, Jordan
dc.date.accessioned	2023-07-31T19:37:35Z
dc.date.available	2023-07-31T19:37:35Z
dc.date.issued	2023-06
dc.date.submitted	2023-06-06T16:35:20.512Z
dc.identifier.uri	https://hdl.handle.net/1721.1/151411
dc.description.abstract	Embodied environments serve as tools for learning a wide range of control tasks. Within these simulators, realistic rendering and physics keep the sim2real gap for learned tasks small. Current embodied environments focus mainly on small-scale or low-level tasks, lack the capacity for learning large-scale, diverse tasks, and often lack the realism needed for a small sim2real gap. To address these shortcomings, we propose VirtualCity, a large-scale embodied environment that enables the learning of high-level planning tasks with photo-realistic rendering and realistic physics. To interact with VirtualCity, we provide a user-friendly Python API that allows modification, control, and observation of the environment and the agents within it. Building this realistic environment brings us closer to adapting models trained in simulation to solve real-world tasks.
dc.publisher	Massachusetts Institute of Technology
dc.rights	In Copyright - Educational Use Permitted
dc.rights	Copyright retained by author(s)
dc.rights.uri	https://rightsstatements.org/page/InC-EDU/1.0/
dc.title	Simulating Real-World Human Activities with VirtualCity: A Large-Scale Embodied Environment for 2D, 3D, and Language-Driven Tasks
dc.type	Thesis
dc.description.degree	M.Eng.
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
mit.thesis.degree	Master
thesis.degree.name	Master of Engineering in Electrical Engineering and Computer Science


