Context-based motion retrieval using vector space model

Author(s)
Zhang, Zhunping
Download
Full printable version (11.15 MB)
Other Contributors
Massachusetts Institute of Technology. Dept. of Electrical Engineering and Computer Science.
Advisor
Jovan Popović.
Terms of use
M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission. http://dspace.mit.edu/handle/1721.1/7582
Abstract
Motion retrieval is the problem of retrieving highly relevant motions in a timely manner. The principal challenge is to characterize the similarity between two motions effectively, which is tightly related to the gap between the motion data's representation and its semantics. Our approach uses the vector space model to measure similarities among motions, which are made discrete with a vocabulary technique and transformation invariant with a relational feature model. In our approach, relational features are first extracted from the motion data; these features are then clustered into a motion vocabulary. Finally, motions are turned into bags of words and retrieved using the vector space model. We implemented this new system and tested it on two benchmark databases composed of real-world data. Two existing methods, the dynamic time warping method and the binary feature method, were implemented for comparison. The results show that our system is comparable in effectiveness to the dynamic time warping system but runs 100 to 400 times faster. In comparison to retrieval with binary features, it is just as fast but more accurate and practical.

The success of our system points to several additional improvements. Our experiments reveal that velocity features improve the relevance of retrieved results, but more effort should be dedicated to determining the best set of features for motion retrieval. The same experiments should be performed on larger databases, in particular to test how this performance generalizes to motions outside the original database. Alternative vocabulary organizations, such as the vocabulary tree and the random forest, should be investigated because they can improve our approach by adding flexibility to the similarity scoring model and reducing the approximation error of the vocabulary. Because the bag-of-words model ignores the temporal ordering of key features, a wavelet model should also be explored as a mechanism to encode features across different time scales.
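The retrieval pipeline outlined above (cluster relational features into a vocabulary, quantize each motion into a bag of words, rank candidates in a vector space) can be illustrated with a minimal sketch. The function names, the naive k-means vocabulary construction, and the TF-IDF cosine ranking below are illustrative assumptions, not the implementation described in the thesis.

# Hypothetical sketch: vocabulary-based bag-of-words motion retrieval.
# Assumes per-frame relational feature vectors are already extracted.
import numpy as np

def build_vocabulary(features: np.ndarray, k: int, iters: int = 50, seed: int = 0) -> np.ndarray:
    """Naive k-means over per-frame feature vectors; returns k centroids ("motion words")."""
    rng = np.random.default_rng(seed)
    centroids = features[rng.choice(len(features), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each frame's feature vector to its nearest centroid.
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = features[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return centroids

def bag_of_words(motion_features: np.ndarray, centroids: np.ndarray) -> np.ndarray:
    """Quantize one motion's frames against the vocabulary and count word occurrences."""
    dists = np.linalg.norm(motion_features[:, None, :] - centroids[None, :, :], axis=2)
    words = dists.argmin(axis=1)
    return np.bincount(words, minlength=len(centroids)).astype(float)

def retrieve(query_bow: np.ndarray, database_bows: np.ndarray) -> np.ndarray:
    """Rank database motions by cosine similarity of TF-IDF-weighted histograms."""
    df = (database_bows > 0).sum(axis=0) + 1            # smoothed document frequency
    idf = np.log(len(database_bows) / df)
    q = query_bow * idf
    docs = database_bows * idf
    sims = docs @ q / (np.linalg.norm(docs, axis=1) * np.linalg.norm(q) + 1e-12)
    return np.argsort(-sims)                            # indices of best matches first

Because the histograms discard temporal ordering, this kind of scoring is fast but cannot distinguish motions that use the same poses in a different order, which is the limitation the abstract's proposed wavelet extension would address.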
 
 
Description
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2008.
 
Includes bibliographical references (p. 87-89).
 
Date issued
2008
URI
http://hdl.handle.net/1721.1/45856
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology
Keywords
Electrical Engineering and Computer Science.

Collections
  • Graduate Theses
