
dc.contributor.author    Zhang, Zhengdong
dc.contributor.author    Henderson, Theia
dc.contributor.author    Karaman, Sertac
dc.contributor.author    Sze, Vivienne
dc.date.accessioned    2021-10-15T19:48:32Z
dc.date.available    2021-10-15T19:48:32Z
dc.date.issued    2020
dc.identifier.uri    https://hdl.handle.net/1721.1/133010
dc.description.abstract    © The Author(s) 2020. Exploration tasks are embedded in many robotics applications, such as search and rescue and space exploration. Information-based exploration algorithms aim to find the most informative trajectories by maximizing an information-theoretic metric, such as the mutual information between the map and potential future measurements. Unfortunately, most existing information-based exploration algorithms are plagued by the computational difficulty of evaluating the Shannon mutual information metric. In this article, we consider the fundamental problem of evaluating Shannon mutual information between the map and a range measurement. First, we consider 2D environments. We propose a novel algorithm, called the fast Shannon mutual information (FSMI). The key insight behind the algorithm is that a certain integral can be computed analytically, leading to substantial computational savings. Second, we consider 3D environments, represented by efficient data structures, e.g., an OctoMap, such that the measurements are compressed by run-length encoding (RLE). We propose a novel algorithm, called FSMI-RLE, that efficiently evaluates the Shannon mutual information when the measurements are compressed using RLE. For both the FSMI and the FSMI-RLE, we also propose variants that make different assumptions on the sensor noise distribution for the purpose of further computational savings. We evaluate the proposed algorithms in extensive experiments. In particular, we show that the proposed algorithms outperform existing algorithms that compute Shannon mutual information as well as other algorithms that compute the Cauchy–Schwarz quadratic mutual information (CSQMI). In addition, we demonstrate the computation of Shannon mutual information on a 3D map for the first time.    en_US
dc.language.iso    en
dc.publisher    SAGE Publications    en_US
dc.relation.isversionof    10.1177/0278364920921941    en_US
dc.rights    Creative Commons Attribution-Noncommercial-Share Alike    en_US
dc.rights.uri    http://creativecommons.org/licenses/by-nc-sa/4.0/    en_US
dc.source    arXiv    en_US
dc.title    FSMI: Fast computation of Shannon mutual information for information-theoretic mapping    en_US
dc.type    Article    en_US
dc.identifier.citation    Zhang, Zhengdong, Henderson, Theia, Karaman, Sertac and Sze, Vivienne. 2020. "FSMI: Fast computation of Shannon mutual information for information-theoretic mapping." International Journal of Robotics Research, 39 (9).
dc.contributor.department    Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science    en_US
dc.contributor.department    Massachusetts Institute of Technology. Department of Aeronautics and Astronautics    en_US
dc.contributor.department    Massachusetts Institute of Technology. Microsystems Technology Laboratories    en_US
dc.relation.journal    International Journal of Robotics Research    en_US
dc.eprint.version    Author's final manuscript    en_US
dc.type.uri    http://purl.org/eprint/type/JournalArticle    en_US
eprint.status    http://purl.org/eprint/status/PeerReviewed    en_US
dc.date.updated    2021-04-08T14:13:52Z
dspace.orderedauthors    Zhang, Z; Henderson, T; Karaman, S; Sze, V    en_US
dspace.date.submission    2021-04-08T14:13:53Z
mit.journal.volume    39    en_US
mit.journal.issue    9    en_US
mit.license    OPEN_ACCESS_POLICY
mit.metadata.status    Publication Information Needed    en_US
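
Note on the abstract above: it centers on evaluating the Shannon mutual information I(M; Z) between an occupancy map M and a single range measurement Z. As a minimal illustrative sketch only, and not the paper's FSMI or FSMI-RLE algorithms, the Python snippet below computes I(M; Z) for one sensor beam under the simplifying assumptions of a noise-free sensor and independent Bernoulli occupancy cells; in that idealized case Z is a deterministic function of the map, so I(M; Z) reduces to the entropy of the beam's hit-distance distribution. The function name and the example occupancy values are hypothetical.

    import numpy as np

    def beam_mutual_information(occ_probs):
        """Shannon MI I(M; Z) between the map cells along one beam and the
        range reading Z, assuming a noise-free sensor and independent cells.
        Illustrative sketch only; not the FSMI algorithm from the paper."""
        occ = np.asarray(occ_probs, dtype=float)  # P(cell i occupied), ordered from sensor outward
        free = 1.0 - occ
        # P(beam first terminates at cell i) = o_i * prod_{j<i} (1 - o_j)
        miss_before = np.concatenate(([1.0], np.cumprod(free)[:-1]))
        p_hit = occ * miss_before
        # P(max-range reading): the beam passes through every cell unobstructed
        p_z = np.append(p_hit, np.prod(free))
        # Noise-free sensor: Z is a deterministic function of the map,
        # so H(Z | M) = 0 and I(M; Z) = H(Z).
        p_z = p_z[p_z > 0.0]
        return float(-np.sum(p_z * np.log(p_z)))

    # Hypothetical beam crossing four cells with increasing occupancy belief.
    print(beam_mutual_information([0.1, 0.2, 0.5, 0.7]))

With sensor noise, as considered in the article, this reduction no longer holds and the evaluation involves an integral over the noisy measurement; the abstract's stated contribution is that FSMI computes such an integral analytically, and FSMI-RLE extends this to run-length-encoded 3D maps.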

