Simulating Real-World Human Activities with VirtualCity: A Large-Scale Embodied Environment for 2D, 3D, and Language-Driven Tasks
Author(s)
Ren, Jordan
Advisor
Torralba, Antonio
Abstract
Embodied environments are tools for learning a variety of control tasks. Within these simulators, realistic rendering and physics keep the sim2real gap for learned tasks small. Current embodied environments focus mainly on small-scale or low-level tasks, cannot support learning large-scale, diverse tasks, and often lack the realism needed for a small sim2real gap. To address these shortcomings, we propose VirtualCity, a large-scale embodied environment that enables the learning of high-level planning tasks with photo-realistic rendering and realistic physics. To interact with VirtualCity, we provide a user-friendly Python API for modifying, controlling, and observing the environment and the agents within it. Building this realistic environment brings us closer to adapting models trained in simulation to solve real-world tasks.
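To make the abstract's three API capabilities concrete (modify, control, observe), here is a minimal, purely hypothetical sketch of what such an interface could look like. None of these class or method names come from the thesis; VirtualCity's actual Python API may differ entirely.

```python
# Hypothetical toy stand-in illustrating the modify/control/observe
# pattern the abstract describes. Not the real VirtualCity API.
from dataclasses import dataclass, field


@dataclass
class Agent:
    name: str
    position: tuple = (0.0, 0.0, 0.0)


@dataclass
class CityEnv:
    """Toy stand-in for a large-scale embodied environment."""
    agents: dict = field(default_factory=dict)

    def add_agent(self, name):
        # Modify: add a new agent to the environment.
        self.agents[name] = Agent(name)
        return self.agents[name]

    def step(self, name, delta):
        # Control: move an agent by a (dx, dy, dz) offset.
        a = self.agents[name]
        a.position = tuple(p + d for p, d in zip(a.position, delta))

    def observe(self, name):
        # Observe: read back an agent's current state.
        return self.agents[name].position


env = CityEnv()
env.add_agent("pedestrian")
env.step("pedestrian", (1.0, 0.0, 2.0))
print(env.observe("pedestrian"))  # (1.0, 0.0, 2.0)
```

In a real large-scale simulator, `step` would advance physics and rendering and `observe` would return rich sensor data (e.g. RGB frames), but the control flow follows the same shape.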
Date issued
2023-06
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology