<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
<title>Robotics Research Datasets</title>
<link href="https://hdl.handle.net/1721.1/62234" rel="alternate"/>
<subtitle/>
<id>https://hdl.handle.net/1721.1/62234</id>
<updated>2026-04-06T03:23:51Z</updated>
<dc:date>2026-04-06T03:23:51Z</dc:date>
<entry>
<title>usc_sal200_synthetic</title>
<link href="https://hdl.handle.net/1721.1/62292" rel="alternate"/>
<author>
<name>Vaughan, Richard</name>
</author>
<id>https://hdl.handle.net/1721.1/62292</id>
<updated>2019-04-05T22:33:50Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">usc_sal200_synthetic
Vaughan, Richard
A floorplan drawn from (human) measurements of the second floor of the SAL computer science building at USC. Door frames, etc. are not included. Real laser scan data from this environment is also available. Distributed with the Stage multi-robot simulator. This map has appeared in several published papers.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>albert-b-laser-vision</title>
<link href="https://hdl.handle.net/1721.1/62291" rel="alternate"/>
<author>
<name>Stachniss, Cyrill</name>
</author>
<id>https://hdl.handle.net/1721.1/62291</id>
<updated>2019-04-06T10:34:30Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">albert-b-laser-vision
Stachniss, Cyrill
The dataset was recorded with the B21r robot Albert in Building 79, University of Freiburg. I recorded the laser range data and vision data (320x240, ~2-3 fps, jpg, 65 deg fov), including the SIFT features extracted from the images. The dataset is the same as the "albert-b-laser" dataset but additionally contains the images from the camera.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>fr079</title>
<link href="https://hdl.handle.net/1721.1/62290" rel="alternate"/>
<author>
<name>Stachniss, Cyrill</name>
</author>
<id>https://hdl.handle.net/1721.1/62290</id>
<updated>2019-04-06T09:33:12Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">fr079
Stachniss, Cyrill
Date : 2003-11-18
Where : Univ of Freiburg, building 079, AIS-Lab
Robot type : Pioneer2 with 1 LMS laser
What : 1 robot (magnum with stayton board) was driving around in building 079
File format : carmen logger format (POS, FLASER); only laser and odometry data recorded, plus a scan-matched carmen log file
File(s) : fr079.log.gz fr079-sm.log.gz fr079-map.tgz
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>fr101-explored</title>
<link href="https://hdl.handle.net/1721.1/62289" rel="alternate"/>
<author>
<name>Stachniss, Cyrill</name>
</author>
<id>https://hdl.handle.net/1721.1/62289</id>
<updated>2019-04-08T08:11:12Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">fr101-explored
Stachniss, Cyrill
- Entrance Hall (Geb 101) at the Dept. of CS at the University of Freiburg
- Dataset recorded during autonomous exploration
- Pioneer 2 DX8 robot with one SICK laser range finder
- CARMEN was used
- date: 2003-10-16
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>robonaut_sensor</title>
<link href="https://hdl.handle.net/1721.1/62288" rel="alternate"/>
<author>
<name>Jenkins, Chad</name>
</author>
<id>https://hdl.handle.net/1721.1/62288</id>
<updated>2019-04-08T07:12:54Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">robonaut_sensor
Jenkins, Chad
Sensor data published from the NASA Robonaut during human teleoperation, originally collected by Alan Peters. Robonaut was teleoperated to perform 45 grasp trials of a vertically oriented wrench. The wrench was placed at 9 different locations in the robot workspace, with 5 trials at each location. The provided data contains readings from the various force and tactile sensors on the arm and hand of Robonaut used during grasping.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>intel_lab</title>
<link href="https://hdl.handle.net/1721.1/62287" rel="alternate"/>
<author>
<name>Fox, Dieter</name>
</author>
<id>https://hdl.handle.net/1721.1/62287</id>
<updated>2019-04-05T13:19:02Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">intel_lab
Fox, Dieter
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>fr_campus</title>
<link href="https://hdl.handle.net/1721.1/62286" rel="alternate"/>
<author>
<name>Grisetti, Giorgio</name>
</author>
<author>
<name>Stachniss, Cyrill</name>
</author>
<id>https://hdl.handle.net/1721.1/62286</id>
<updated>2019-04-08T08:11:12Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">fr_campus
Grisetti, Giorgio; Stachniss, Cyrill
log-file : fr-campus-20040714.carmen.log
format : raw robot data file in the carmen logger format (ODOM, FLASER, PARAM)
robot : Pioneer2AT (herbert) with one SICK LMS291-S05 (FLASER)
date : 14 July 2004
place : campus of the Department of Computer Science, Univ of Freiburg
size : approx. 250m x 250m; the robot traveled 1.750 km (~1.088 miles)
creators: Cyrill Stachniss &amp; Giorgio Grisetti
map(s) : carmen map files: fr_campus_100p_10cm.cmf (100 particles, 10cm grid resolution), fr_campus_100p_30cm.cmf (100 particles, 30cm grid resolution), fr_campus_30p_10cm.cmf (30 particles, 10cm grid resolution), fr_campus_30p_30cm.cmf (30 particles, 30cm grid resolution). Image files: fr_campus_100p_10cm.png, fr_campus_100p_30cm.png, fr_campus_100p_30cm_path.png (including the path of the robot), fr_campus_30p_10cm.png, fr_campus_30p_30cm.png. The maps were created with Giorgio Grisetti's great Rao-Blackwellized mapper. For more information see: Cyrill Stachniss, Giorgio Grisetti, Dirk Haehnel, and Wolfram Burgard. "Improved Rao-Blackwellized Mapping by Adaptive Techniques and Active Loop-Closure" (SOAVE04, SelfOrganization of AdaptiVE behavior 04). We generated the maps with two different particle sets. The first map was computed with 100 particles, which produces really nice results. Thanks to Giorgio's optimization, the map can also be built with only 30(!) particles, and the result is nearly as good as the one with 100 particles.
comments: This is an outdoor dataset with some moving objects (cars, people, bikes), some trees and bushes, a lot of free space, and of course not 100% planar surfaces.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>sal200_wifi</title>
<link href="https://hdl.handle.net/1721.1/62285" rel="alternate"/>
<author>
<name>Howard, Andrew</name>
</author>
<id>https://hdl.handle.net/1721.1/62285</id>
<updated>2019-04-08T07:24:22Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">sal200_wifi
Howard, Andrew
Wireless signal-strength data from the USC SAL200 building. Four 802.11b devices were placed in the environment to act as beacons, and two robots were used to gather signal strength values. The robots were localized using laser-based MCL to provide "ground truth" pose estimates. See [Howard, FSR03] for an experimental study of WiFi signal-strength-based localization.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>Longwood</title>
<link href="https://hdl.handle.net/1721.1/62284" rel="alternate"/>
<author>
<name>Roy, Nick</name>
</author>
<id>https://hdl.handle.net/1721.1/62284</id>
<updated>2019-04-07T01:53:17Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">Longwood
Roy, Nick
The Longwood at Oakmont nursing home. This map was constructed out of two separate data runs, where the data was collected using an old robot control software package called beeSoft. Two separate maps were created from the data, and then merged. The map resolution is 10cm.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>utk-claxton</title>
<link href="https://hdl.handle.net/1721.1/62283" rel="alternate"/>
<author>
<name>Bailey, Michael</name>
</author>
<id>https://hdl.handle.net/1721.1/62283</id>
<updated>2019-04-06T08:06:52Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">utk-claxton
Bailey, Michael
Several runs of the Claxton CS building at the University of Tennessee, Knoxville. The data was collected from an RWI ATRV-Mini equipped with a SICK scanning laser range finder, using the Player writelog driver. The laser was mounted approximately 30cm forward from the robot's center of motion.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>cogniron</title>
<link href="https://hdl.handle.net/1721.1/62282" rel="alternate"/>
<author>
<name>Krose, Ben</name>
</author>
<id>https://hdl.handle.net/1721.1/62282</id>
<updated>2019-04-06T10:20:51Z</updated>
<published>2011-02-25T00:00:00Z</published>
<summary type="text">cogniron
Krose, Ben
A dataset was recorded consisting of omnidirectional images, laser range data, sonar readings and robot odometry. The dataset is meant for testing and benchmarking the methods proposed in the papers sent to the workshop, as well as for stimulating discussion about spatial representations for mobile robots.

The acquisition of the dataset took place in a home environment constructed by UNET (http://www.unet.nl/, linked site is in Dutch). A Nomad Scout was driven around by tele-operation while data was collected on a mounted laptop. The robot was driven through the environment three times: one "clean" run with as low a noise level as possible, one "noisy" run with people walking through the environment, and one "home-tour" run in which a person guided the robot around the house. Each run took around 5 minutes. The data is not provided via Radish due to file sizes, but can be accessed at
http://www2.science.uva.nl/sites/cogniron/.

More file format information is available at
http://staff.science.uva.nl/~zivkovic/FS2HSC/dataset.html.
</summary>
<dc:date>2011-02-25T00:00:00Z</dc:date>
</entry>
<entry>
<title>usc-sal200-021120</title>
<link href="https://hdl.handle.net/1721.1/62281" rel="alternate"/>
<author>
<name>Howard, Andrew</name>
</author>
<id>https://hdl.handle.net/1721.1/62281</id>
<updated>2019-04-06T08:28:04Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">usc-sal200-021120
Howard, Andrew
These files contain the raw sensor data acquired by a robot during two separate tours of the second floor of the USC SAL building. The robot was equipped as follows: * Pioneer2DX (odometry) * Sick LMS 200 (laser range finder); mounted 8cm forward from the center of the robot, facing forwards. The robot was teleoperated by a human operator.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>isr-fctuc_lrf1_lrf2_cam_imu_carmen_dataset</title>
<link href="https://hdl.handle.net/1721.1/62280" rel="alternate"/>
<author>
<name>Davim, Luis</name>
</author>
<author>
<name>Ferreira, Filipe</name>
</author>
<author>
<name>Dias, Jorge Manuel Miranda</name>
</author>
<author>
<name>Prado, Jose</name>
</author>
<id>https://hdl.handle.net/1721.1/62280</id>
<updated>2019-04-06T08:48:18Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">isr-fctuc_lrf1_lrf2_cam_imu_carmen_dataset
Davim, Luis; Ferreira, Filipe; Dias, Jorge Manuel Miranda; Prado, Jose
Carmen log files of sequences of multi-sensor data for an indoor environment (approx. 80x30 m). Overlapping sequences of the environment were scanned to demonstrate the difficulty of registering data using odometry and to test the loop-closing ability of map-building algorithms. Sensor data includes range scans from a horizontal SICK LMS 200 (RAWLASER1), a vertical SICK LMS 200 (RAWLASER2), an Xsens MTI 400 IMU (XSENS), a Guppy camera (ISRCAM), and Segway RMP 200 pose data (ODOM &amp; SEGPOS). Images and thumbnail were created using the CARMEN module 'Vasco' ([Vasco]-corrected odometry data is included in the data log file).
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>aic1_sonar_data_simulated_noloc</title>
<link href="https://hdl.handle.net/1721.1/62279" rel="alternate"/>
<author>
<name>O'Sullivan, Shane</name>
</author>
<id>https://hdl.handle.net/1721.1/62279</id>
<updated>2019-04-06T09:32:37Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">aic1_sonar_data_simulated_noloc
O'Sullivan, Shane
This is sonar data taken during a simulated run around the AIC environment provided with the Saphira simulator using a Pioneer 1 robot. There are 4 files in the Zip file attached. aic1_SimulatedSonarData_NoError_NoLoc.run contains the sonar data. MapViewer_Tutorial 4_Building_Maps_From_Sonar_Data.pdf explains the file format. aic1_SimulatedSonarData_NoError_NoLoc_50mm.map contains the generated map in Carmen map format - http://www.cs.cmu.edu/~carmen, with a resolution of 50mm, meaning each grid cell is 50mm wide and 50mm high. This file can be opened using either Carmen or MapViewer, available at http://mapviewer.skynet.ie RobotRunFileHelper.h is a C++ class that can read and write the sonar data files. More information and sample usages of it can be found in the Tutorial4 zip file at http://mapviewer.skynet.ie The layout of the sonars is detailed in the .run file (this is also covered in the PDF). MapViewer can be used to build maps from this sonar data (at any resolution), and also to simply show the path taken by the robot in the environment. For the test run, odometry error was turned off, meaning localisation was not needed. However, error in the sonar readings is still present.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>cave_bitmap</title>
<link href="https://hdl.handle.net/1721.1/62278" rel="alternate"/>
<author>
<name>Vaughan, Richard</name>
</author>
<id>https://hdl.handle.net/1721.1/62278</id>
<updated>2019-04-08T07:15:23Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">cave_bitmap
Vaughan, Richard
A simple hand-drawn environment with a few rough obstacles. Distributed with the Stage multi-robot simulator. This map has appeared in several published papers.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>ualberta-csc-flr3-vision</title>
<link href="https://hdl.handle.net/1721.1/62277" rel="alternate"/>
<author>
<name>Klippenstein, Jonathan</name>
</author>
<id>https://hdl.handle.net/1721.1/62277</id>
<updated>2019-04-08T08:11:12Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">ualberta-csc-flr3-vision
Klippenstein, Jonathan
These images were recorded at the Computing Science Centre of the University of Alberta. They comprise a loop around the third floor of the building. Images were captured with a Dragonfly IEEE1394 digital camera from Point Grey Research, mounted on an iRobot Magellan Pro robot. An image was taken after an approximate 15 cm translation or 5 degree rotation, whichever came first. An upwards-pointing camera was used for observing the robot position by tracking a repeating pattern of ceiling tiles. As such, every image has an associated position and covariance matrix. Each of the 512 images has three associated files: imageNNNN.png - rectified image (radial distortion has been removed); imageNNNN.state - position the image was taken from, along with the covariance matrix; imageNNNN.sift - SIFT keys. More information can be found in the README file in the archive. For convenience, the SIFT keys are packaged in a separate file.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>A_building</title>
<link href="https://hdl.handle.net/1721.1/62276" rel="alternate"/>
<author>
<name>Vincent, Regis</name>
</author>
<id>https://hdl.handle.net/1721.1/62276</id>
<updated>2019-04-08T08:11:12Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">A_building
Vincent, Regis
Built using a P2 with an LRF alone. The main corridor is 123m long. This map will be extended and re-updated later as we test more and more for the Centibots project.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>vasche_library_floor1</title>
<link href="https://hdl.handle.net/1721.1/62275" rel="alternate"/>
<author>
<name>Tews, Ashley</name>
</author>
<id>https://hdl.handle.net/1721.1/62275</id>
<updated>2019-04-08T08:11:12Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">vasche_library_floor1
Tews, Ashley
This file was created from the floorplan of the Vasche library. The original image is included in the Files area below or can be downloaded at: http://www.csustan.edu/directories/Maps_n_Plans/Campus_Plans/Building/001-Lib-sml-1flr.html along with the 2nd and 3rd floor maps.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>mit-csail-3rd-floor</title>
<link href="https://hdl.handle.net/1721.1/62274" rel="alternate"/>
<author>
<name>Stachniss, Cyrill</name>
</author>
<id>https://hdl.handle.net/1721.1/62274</id>
<updated>2019-04-05T21:50:34Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">mit-csail-3rd-floor
Stachniss, Cyrill
location: MIT CSAIL, 3rd floor
time: 2005-12-17
robot: B21r
laser: SICK LMS, 80m
recorded by: Cyrill Stachniss
format: Carmen log file in the old and the new format (ODOM and FLASER as well as RAWLASER and ROBOTLASER1 messages)
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>kwing_wld</title>
<link href="https://hdl.handle.net/1721.1/62273" rel="alternate"/>
<author>
<name>Vincent, Regis</name>
</author>
<id>https://hdl.handle.net/1721.1/62273</id>
<updated>2019-04-06T12:23:28Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">kwing_wld
Vincent, Regis
SRI AIC K wing; map built from data from several robots and matched using Dr. Steffen Gutmann's ScanStudio.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>kenmore-pradoroof</title>
<link href="https://hdl.handle.net/1721.1/62272" rel="alternate"/>
<author>
<name>Bosse, Michael</name>
</author>
<id>https://hdl.handle.net/1721.1/62272</id>
<updated>2019-04-06T02:39:48Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">kenmore-pradoroof
Bosse, Michael
Laser range data collected from two SICK LMS lasers mounted on the roof of a Toyota Prado driving through suburban streets in Kenmore, QLD, Australia. The dataset is about 40 minutes long with a path length of about 18 km. There is no odometry or GPS for this data, but the laser scans are dense enough to build detailed maps and close large loops. The figure was generated from the optimized output of our scan-match-based SLAM algorithm under the Atlas framework, and manually overlaid with a satellite image.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>csc_mezzanine</title>
<link href="https://hdl.handle.net/1721.1/62271" rel="alternate"/>
<author>
<name>Howard, Andrew</name>
</author>
<id>https://hdl.handle.net/1721.1/62271</id>
<updated>2019-04-05T18:55:17Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">csc_mezzanine
Howard, Andrew
This file contains the raw sensor data acquired by a single robot during a tour of the mezzanine level of the California Science Center. The robot was equipped as follows: * Pioneer2DX (odometry) * Sick LMS 200 (laser range finder); mounted 8cm forward from the center of the robot, facing forwards. The robot was teleoperated by a human operator.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>acapulco_convention_center</title>
<link href="https://hdl.handle.net/1721.1/62270" rel="alternate"/>
<author>
<name>Roy, Nick</name>
</author>
<id>https://hdl.handle.net/1721.1/62270</id>
<updated>2019-04-06T00:03:06Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">acapulco_convention_center
Roy, Nick
This map was constructed by GRACE, a B21, using a Sick LMS laser range finder. The map file was constructed from the log file, and is 10cm resolution.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>mitstata3rdfloordreyfoos</title>
<link href="https://hdl.handle.net/1721.1/62269" rel="alternate"/>
<author>
<name>Missiuro, Patrycja Ewelina</name>
</author>
<id>https://hdl.handle.net/1721.1/62269</id>
<updated>2019-04-06T06:46:46Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">mitstata3rdfloordreyfoos
Missiuro, Patrycja Ewelina
The map represents the 3rd-floor common area of the MIT Stata Center, Dreyfoos side. Doors that are encountered are marked with red circles. The map was acquired with Carmen (the CMU robotics toolkit) running on a B21 robot (circular footprint, radius = 0.27 meters) equipped with laser range finders.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>ul_csis1_nonsimulatedsonardata</title>
<link href="https://hdl.handle.net/1721.1/62268" rel="alternate"/>
<author>
<name>O'Sullivan, Shane</name>
</author>
<id>https://hdl.handle.net/1721.1/62268</id>
<updated>2019-04-05T16:28:44Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">ul_csis1_nonsimulatedsonardata
O'Sullivan, Shane
This is sonar data taken during a real world run around the 1st floor of the CSIS building in the University of Limerick, Ireland using a Pioneer 1 robot. There are 4 files in the Zip file attached. UL_csis1_NonSimulatedSonarData.run contains the sonar data. MapViewer_Tutorial 4_Building_Maps_From_Sonar_Data.pdf explains the file format. UL_csis1_NonSimulatedSonarData_10mm.map contains the generated map in Carmen map format - http://www.cs.cmu.edu/~carmen, with a resolution of 10mm, meaning each grid cell is 10mm wide and 10mm high. This file can be opened using either Carmen or MapViewer, available at http://mapviewer.skynet.ie RobotRunFileHelper.h is a C++ class that can read and write the sonar data files. More information and sample usages of it can be found in the Tutorial4 zip file at http://mapviewer.skynet.ie The layout of the sonars is detailed in the .run file (this is also covered in the PDF). MapViewer can be used to build maps from this sonar data (at any resolution), and also to simply show the path taken by the robot in the environment.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>ap_hill_07b</title>
<link href="https://hdl.handle.net/1721.1/62267" rel="alternate"/>
<author>
<name>Howard, Andrew</name>
</author>
<id>https://hdl.handle.net/1721.1/62267</id>
<updated>2019-04-06T18:05:41Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">ap_hill_07b
Howard, Andrew
Experiments conducted at Fort AP Hill as part of the DARPA/IPTO SDR project. Monitor log files Wed 28 Jan 2004 14-36 Autonomous, followed by some manual intervention in attempt to find final target. Four robots, one breach point. Speed 0.40 m/sec. robot_intel0 = {'entrance' : 'A', 'ipose' : (0, 0, 0), 'fpose' : (-3.66, 0, 3.142)} robot_intel1 = {'entrance' : 'A', 'ipose' : (-1.22, +0.61, 0), 'fpose' : (-3.66, 0, 3.142)} robot_intel2 = {'entrance' : 'A', 'ipose' : (-2.44, 0, 0), 'fpose' : (-3.66, 0, 3.142)} robot_intel3 = {'entrance' : 'A', 'ipose' : (-3.66, +0.61, 0), 'fpose' : (-3.66, 0, 3.142)}
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>cmu_nsh_level_a</title>
<link href="https://hdl.handle.net/1721.1/62266" rel="alternate"/>
<author>
<name>Roy, Nick</name>
</author>
<id>https://hdl.handle.net/1721.1/62266</id>
<updated>2019-04-06T04:09:34Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">cmu_nsh_level_a
Roy, Nick
This map was constructed by GRACE, a B21, using a Sick PLS laser range finder. The two data files consist of a CARMEN log file, and a CARMEN map file. The map file was constructed from the log file, and is 10cm resolution. It is not a complete map of Level A, but a partial map around the RASL.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>albert-b-laser</title>
<link href="https://hdl.handle.net/1721.1/62265" rel="alternate"/>
<author>
<name>Stachniss, Cyrill</name>
</author>
<id>https://hdl.handle.net/1721.1/62265</id>
<updated>2019-04-08T07:12:48Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">albert-b-laser
Stachniss, Cyrill
B21r robot Albert with 1 SICK PLS moving through the AIS lab in building 079 at the University of Freiburg. Nothing special, but a small useful dataset. The robot traveled around 210 m.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>Edmonton_Convention_Centre</title>
<link href="https://hdl.handle.net/1721.1/62264" rel="alternate"/>
<author>
<name>Roy, Nick</name>
</author>
<id>https://hdl.handle.net/1721.1/62264</id>
<updated>2019-04-06T05:53:05Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">Edmonton_Convention_Centre
Roy, Nick
This is the map used by GRACE to navigate from the registration desk to the site of the robot talk. The two data files consist of a CARMEN log file, and a CARMEN map file. The map file was constructed from the log file, and is 10cm resolution.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>ubremen-cartesium</title>
<link href="https://hdl.handle.net/1721.1/62263" rel="alternate"/>
<author>
<name>Stachniss, Cyrill</name>
</author>
<id>https://hdl.handle.net/1721.1/62263</id>
<updated>2019-04-06T10:50:22Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">ubremen-cartesium
Stachniss, Cyrill
Experiment Description
----------------------
Date : 2006-09-12/14
Where : Univ of Bremen, Cartesium building
Robot type : Pioneer2 with 1 LMS laser
What : 1 robot (magnum), manually steered
Remarks : 3 runs, same environment, different points in time, partially different objects around
File format : carmen logger format (ODOM, ROBOTLASER, RAWLASER)
Remarks :
- the files with the extension .log.gz are the raw logs
- the files with the extension .gfs.log.gz are the logs corrected with our Rao-Blackwellized PF for SLAM; see: http://www.informatik.uni-freiburg.de/~stachnis/research/rbpfmapper
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>hospital_floorplan_fort_sam_houston</title>
<link href="https://hdl.handle.net/1721.1/62262" rel="alternate"/>
<author>
<name>Vaughan, Richard</name>
</author>
<id>https://hdl.handle.net/1721.1/62262</id>
<updated>2019-04-05T23:22:18Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">hospital_floorplan_fort_sam_houston
Vaughan, Richard
Bitmap map created from CAD model of the fourth floor of the hospital building at Fort Sam Houston, Texas. This building was the site of the DARPA TMR project demos in 1999, so several papers show data collected in this building. The floor layout is very big and varied, suitable for chopping up into interesting sections. The original AutoCAD model was obtained from $TMR_PARTICIPANT, then cleaned up by hand to remove door and other architectural symbols, toilet bowls, sinks, etc. This file is distributed with the Stage multi-robot simulator and has been used by several projects since 1999.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>DLR-Spatial_Cognition</title>
<link href="https://hdl.handle.net/1721.1/62261" rel="alternate"/>
<author>
<name>Hertzberg, Christopher</name>
</author>
<id>https://hdl.handle.net/1721.1/62261</id>
<updated>2019-04-06T20:13:41Z</updated>
<published>2011-02-24T00:00:00Z</published>
<summary type="text">DLR-Spatial_Cognition
Hertzberg, Christopher
There are two data sets in the package. One has artificial landmarks: white or black circles on the ground. The position of each landmark is given as a relative 2D coordinate in the robot's frame. The second data set is more difficult: the landmarks are natural vertical lines in the image, and their position is only given as an angle with respect to the robot. The data sets are preprocessed and provide geometric data as measurements. No computer vision or similar processing is necessary.
A more detailed description as well as the raw data is available here: http://www.informatik.uni-bremen.de/agebv/en/DlrSpatialCognitionDataSet.
Algorithms to extract the data from the raw data are available on request. The format of the data set is documented in the file itself. The archive includes Python code to import and display the data sets, which may be used for your own derivations. C++ code to extract and optimize the data is available at SLoM on openslam.org.
</summary>
<dc:date>2011-02-24T00:00:00Z</dc:date>
</entry>
<entry>
<title>ut_austin_aces3</title>
<link href="https://hdl.handle.net/1721.1/62260" rel="alternate"/>
<author>
<name>Beeson, Patrick</name>
</author>
<id>https://hdl.handle.net/1721.1/62260</id>
<updated>2019-04-05T21:03:12Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">ut_austin_aces3
Beeson, Patrick
3rd floor of the ACES building on the UT Austin campus. The environment is basically a square with a cross that divides it into 4 smaller squares. The exploration first takes the outer loop, then divides the loop into 2 rectangles, then goes back to the middle and divides it again into the 4 squares. The robot performs autonomous midline following and rotates around at intersections for our topological implementations. The outer square is ~40 meters across. Max laser reading is 50 meters (anything over 49.5 is maxed out). The robot is an iRobot Magellan Pro. The action model of our robot has been experimentally validated as a Gaussian with mean (0,0,0). Sigmas are as follows: for every 1 meter forward, sigma_forward = 0.4; for every 1 meter forward, sigma_normal = 0.2; for every radian turned, sigma_rotate = 0.8.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>department_diiga</title>
<link href="https://hdl.handle.net/1721.1/62259" rel="alternate"/>
<author>
<name>Pelusi, Gian Maria</name>
</author>
<id>https://hdl.handle.net/1721.1/62259</id>
<updated>2019-04-08T07:21:15Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">department_diiga
Pelusi, Gian Maria
This map was constructed by a Pioneer 3dx using a Sick LMS laser range finder. This is a log file of an indoor environment, the Department of DIIGA at the Engineering University in Ancona. Size is approximately 47x47 m. The four data files consist of a CARMEN log file, a CARMEN clf file corrected with the command vasco-tiny, a CARMEN map file, and an image of the map. log file: Department_DIIGA.log.tar.gz clf file: Department_DIIGA.clf.tar.gz map file: Department_DIIGA.map.tar.gz image file: Department_DIIGA.jpg.tar.gz
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>comparison_of_self-localization_methods_continued</title>
<link href="https://hdl.handle.net/1721.1/62255" rel="alternate"/>
<author>
<name>Gutmann, Steffen</name>
</author>
<id>https://hdl.handle.net/1721.1/62255</id>
<updated>2019-04-06T07:33:30Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">comparison_of_self-localization_methods_continued
Gutmann, Steffen
Comparison of Self-Localization Methods Continued
=================================================
This archive contains the data logs and evaluation tools that Dieter Fox and I used for the self-localization experiments reported in our paper at IROS 2002.

Data Log Files
--------------
The archive contains one directory 'dlogs' with the following data log files:
o normal/dlog.dat (base log file),
o sparse.*/dlog.dat (log files under different levels of sparseness),
o noise.*/dlog.dat (log files under different levels of outliers), and
o displace/dlog.dat (log file with parts removed to simulate kidnapping).
These log files were all generated from the same base log file. See our paper [Gutmann and Fox, IROS 2002] to understand what was done to the individual logs.

The format of each dlog.dat file is as follows. Each line contains a position estimate from odometry and observations obs:
frame x y theta num ( id1:id2 range bearing ) ...
where:
- frame is a timestamp (1 sec = 125 frames),
- x, y, theta is the robot pose estimated by odometry, measured in mm and deg,
- num is the number of landmarks seen,
- id1:id2 gives the id of the landmark seen,
- range and bearing are from the current robot pose (the kinematic chain of the head is already included) and are measured in mm and deg.
The reference point on the robot is at the neck joint connecting the robot body and head.

You also need the following information about the landmark positions:
cyan:magenta -1500 -1000
magenta:cyan -1500 1000
magenta:green 0 -1000
green:magenta 0 1000
yellow:magenta 1500 -1000
magenta:yellow 1500 1000
and
0 -&gt; green
1 -&gt; magenta
2 -&gt; yellow
3 -&gt; blue
The robot started in the center of the field facing in the positive x direction. It then cycled in an 8-shaped path through the following positions: (500, -500), (500, 0), (0, 0), (-1000, 0), (-1000, -500). At each of these positions the operator pressed a button, upon which a 'mark' was written to the data log.
Potentially there are errors in the ground truth (joysticking the robot exactly on a spot is difficult, observing it is exactly on a spot contains errors, and there can be a short time delay until the mark is written to the log). See fieldSetup.gif for a visualization of landmark and marker positions. Evaluation ---------- Basically, you can completely decide by your own how to evaluate your results. Here is how we did it. If your localization program outputs the pose of the robot at each mark in the following format: &lt; method-name &gt; x y th where x, y are in mm and the in deg, then you can use the accuracy.sh script for computing the distance to the ground truth locations and the mean_confidence program for obtaining mean and confidence of your estimates. E.g. for MLEKF we used: accuracy.sh MLEKF &lt; pose.log | mean_confidence Evaluation for kid-napping is a bit different. We used the output of one localization method as a reference path when processing the base log file (e.g. we used the SRL output but you are welcome to provide your own if you feel your results on the base log file are better). Your localization program should then output the robot pose at each time step in the following format: x[&lt; frame &gt;]=&lt; x &gt; y[&lt; frame &gt;]=&lt; y &gt; th[&lt; frame &gt;]=&lt; th &gt; for example: x[30048]=-102.502917; y[30048]=-4.598352; th[30048]=-105.912977; Your localization program should als copy the 'mark' lines to this output. You can then use the 'recoverTime' utility for computing the number of seconds your method needs for recovering from kid-napping: recoverTime ../reference-log/pose.log.SRL &lt; allpose.log | mean_confidence You find our reference pose log, scripts and utils in the evaluation folder. Papers ------ J.-S. Gutmann, W. Burgard, D. Fox, and K. Konolige. An Experimental Comparison of Localization Methods, International Conference on Intelligent Robots and Systems (IROS'98), Victoria, Canada, October 1998. J.-S. Gutmann and D. 
Fox, An Experimental Comparison of Localization Methods Continued, in: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'02), Lausanne, Switzerland, October 2002. S. Kristensen and P. Jensfelt. An Experimental Comparison of Localisation Methods, the MHL Sessions, in: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'03), 2003. Good luck! Steffen Gutmann, 6.5.2004
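As a sketch, one line of a dlog.dat file could be parsed in Python as follows. The helper name is hypothetical, and the assumption that each observation occupies five whitespace-separated tokens "( id1:id2 range bearing )" is inferred from the format description above, not from the archive itself:

```python
def parse_dlog_line(line):
    """Parse one dlog.dat line: frame x y theta num ( id1:id2 range bearing ) ..."""
    tokens = line.split()
    frame = int(tokens[0])                 # timestamp; 125 frames = 1 second
    x, y, theta = map(float, tokens[1:4])  # odometry pose in mm and deg
    num = int(tokens[4])                   # number of landmarks seen
    observations = []
    rest = tokens[5:]
    # each observation group is "( id1:id2 range bearing )" -> 5 tokens
    for i in range(num):
        group = rest[i * 5:(i + 1) * 5]
        landmark_id = group[1]             # e.g. "cyan:magenta"
        rng, bearing = float(group[2]), float(group[3])  # mm and deg
        observations.append((landmark_id, rng, bearing))
    return frame, (x, y, theta), observations
```

For example, parse_dlog_line("125 0.0 0.0 90.0 1 ( cyan:magenta 1802.8 -33.7 )") would return the frame, the odometry pose, and a one-element observation list; the landmark id can then be looked up in the position table given above.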
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>aic2_sonar_data_simulated_noloc</title>
<link href="https://hdl.handle.net/1721.1/62254" rel="alternate"/>
<author>
<name>O'Sullivan, Shane</name>
</author>
<id>https://hdl.handle.net/1721.1/62254</id>
<updated>2019-04-08T08:11:12Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">aic2_sonar_data_simulated_noloc
O'Sullivan, Shane
This is sonar data taken during a simulated run around the AIC environment provided with the Saphira simulator, using a Pioneer 1 robot. The attached Zip file contains 4 files. aic2_SimulatedSonarData_NoError_NoLoc.run contains the sonar data. MapViewer_Tutorial 4_Building_Maps_From_Sonar_Data.pdf explains the file format. aic2_SimulatedSonarData_NoError_NoLoc_1cm.map contains the generated map in Carmen map format (http://www.cs.cmu.edu/~carmen) with a resolution of 10mm, meaning each grid cell is 10mm wide and 10mm high. This file can be opened with either Carmen or MapViewer, available at http://mapviewer.skynet.ie RobotRunFileHelper.h is a C++ class that can read and write the sonar data files. More information and sample usages of it can be found in the Tutorial4 zip file at http://mapviewer.skynet.ie The layout of the sonars is detailed in the .run file (this is also covered in the PDF). MapViewer can be used to build maps from this sonar data (at any resolution), and also simply to show the path taken by the robot in the environment. For the test run, odometry error was turned off, meaning localisation was not needed. However, error in the sonar readings is still present.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>intel_oregon</title>
<link href="https://hdl.handle.net/1721.1/62253" rel="alternate"/>
<author>
<name>Batalin, Maxim</name>
</author>
<id>https://hdl.handle.net/1721.1/62253</id>
<updated>2019-04-08T08:11:11Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">intel_oregon
Batalin, Maxim
These files contain the raw sensor data acquired by a single robot during a tour of part of the Intel Lab in Hillsboro, Oregon. The robot was equipped as follows: * A Pioneer2DX (for odometry) * A Sick LMS 200 (laser range finder); mounted 8cm forward from the center of the robot, facing forwards.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>stanford-gates1</title>
<link href="https://hdl.handle.net/1721.1/62252" rel="alternate"/>
<author>
<name>Gerkey, Brian</name>
</author>
<id>https://hdl.handle.net/1721.1/62252</id>
<updated>2019-04-08T07:13:10Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">stanford-gates1
Gerkey, Brian
This is a 30-minute tour through the 1st floor of Stanford's Gates Computer Science Building. The robot is a Pioneer 2DX with a forward-pointing SICK LMS 200 mounted at or about the robot's center of rotation. The laser was running at high speed (75Hz scans) in the 10 mm, 1 degree mode.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
<entry>
<title>sdr_site_b</title>
<link href="https://hdl.handle.net/1721.1/62245" rel="alternate"/>
<author>
<name>Howard, Andrew</name>
</author>
<id>https://hdl.handle.net/1721.1/62245</id>
<updated>2019-04-07T03:35:16Z</updated>
<published>2010-12-07T00:00:00Z</published>
<summary type="text">sdr_site_b
Howard, Andrew
These files contain the raw sensor data acquired by a single robot during a tour of the SDR site B. The robot was equipped as follows: * Pioneer2DX (odometry) * Sick LMS 200 (laser range finder); mounted 8cm forward from the center of the robot, facing forwards. The robot was teleoperated by a human operator.
</summary>
<dc:date>2010-12-07T00:00:00Z</dc:date>
</entry>
</feed>
