Team MIT Urban Challenge Technical Report
dc.contributor.advisor | John Leonard | en_US |
dc.contributor.author | Leonard, John | en_US |
dc.contributor.author | Barrett, David | en_US |
dc.contributor.author | How, Jonathan | en_US |
dc.contributor.author | Teller, Seth | en_US |
dc.contributor.author | Antone, Matt | en_US |
dc.contributor.author | Campbell, Stefan | en_US |
dc.contributor.author | Epstein, Alex | en_US |
dc.contributor.author | Fiore, Gaston | en_US |
dc.contributor.author | Fletcher, Luke | en_US |
dc.contributor.author | Frazzoli, Emilio | en_US |
dc.contributor.author | Huang, Albert | en_US |
dc.contributor.author | Jones, Troy | en_US |
dc.contributor.author | Koch, Olivier | en_US |
dc.contributor.author | Kuwata, Yoshiaki | en_US |
dc.contributor.author | Mahelona, Keoni | en_US |
dc.contributor.author | Moore, David | en_US |
dc.contributor.author | Moyer, Katy | en_US |
dc.contributor.author | Olson, Edwin | en_US |
dc.contributor.author | Peters, Steven | en_US |
dc.contributor.author | Sanders, Chris | en_US |
dc.contributor.author | Teo, Justin | en_US |
dc.contributor.author | Walter, Matthew | en_US |
dc.contributor.other | Robotics, Vision & Sensor Networks | en_US |
dc.date.accessioned | 2007-12-17T13:50:57Z | |
dc.date.available | 2007-12-17T13:50:57Z | |
dc.date.issued | 2007-12-14 | en_US |
dc.identifier.other | MIT-CSAIL-TR-2007-058 | en_US |
dc.identifier.uri | http://hdl.handle.net/1721.1/39822 | |
dc.description.abstract | This technical report describes Team MIT's approach to the DARPA Urban Challenge. We have developed a novel strategy for using many inexpensive sensors, mounted on the vehicle periphery, and calibrated with a new cross-modal calibration technique. Lidar, camera, and radar data streams are processed using an innovative, locally smooth state representation that provides robust perception for real-time autonomous control. A resilient planning and control architecture has been developed for driving in traffic, comprising an innovative combination of well-proven algorithms for mission planning, situational planning, situational interpretation, and trajectory control. These innovations are being incorporated in two new robotic vehicles equipped for autonomous driving in urban environments, with extensive testing on a DARPA site visit course. Experimental results demonstrate all basic navigation and some basic traffic behaviors, including unoccupied autonomous driving, lane following using pure-pursuit control and our local frame perception strategy, obstacle avoidance using kinodynamic RRT path planning, U-turns, and precedence evaluation amongst other cars at intersections using our situational interpreter. We are working to extend these approaches to advanced navigation and traffic scenarios. | en_US |
dc.format.extent | 26 p. | en_US |
dc.relation | Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory | en_US |
dc.subject | autonomous vehicle | en_US |
dc.subject | robotics | en_US |
dc.subject | DARPA Grand Challenge | en_US |
dc.subject | path planning | en_US |
dc.subject | machine perception | en_US |
dc.subject | tracking | en_US |
dc.title | Team MIT Urban Challenge Technical Report | en_US |
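The abstract above mentions lane following with pure-pursuit control. As a minimal illustrative sketch only (not the report's implementation; the function name, pose convention, and parameters are assumptions for illustration), the classic pure-pursuit steering law picks a lookahead point on the reference lane and steers toward it:

```python
import math

def pure_pursuit_steering(pose, goal, wheelbase, lookahead):
    """Illustrative pure-pursuit steering sketch (hypothetical helper, not Team MIT's code).

    pose: (x, y, heading) of the rear axle in a local frame, heading in radians
    goal: (gx, gy) lookahead point on the reference lane, same frame
    Returns a front-wheel steering angle in radians.
    """
    x, y, heading = pose
    gx, gy = goal
    # Angle from the vehicle heading to the lookahead point.
    alpha = math.atan2(gy - y, gx - x) - heading
    # Classic pure-pursuit relation: steer so the circular arc through the
    # lookahead point is followed, delta = atan(2 L sin(alpha) / Ld).
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)

# Example: vehicle at the origin heading along +x, lookahead point slightly to the left.
print(pure_pursuit_steering((0.0, 0.0, 0.0), (5.0, 0.5, ), 2.8, 5.0))
```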