
Fusion of remote vision and on-board acceleration data for the vibration estimation of large space structures

Author(s)
Bilton, Amy M. (Amy Marlou)
Download: Full printable version (8.202 MB)
Other Contributors
Massachusetts Institute of Technology. Dept. of Aeronautics and Astronautics.
Advisor
Steven Dubowsky.
Terms of use
M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission. http://dspace.mit.edu/handle/1721.1/7582
Abstract
Future space structures such as solar power stations and telescopes are expected to be very large. These structures will require on-orbit construction. Due to the risks and costs of human extravehicular work, teams of robots will be essential for the on-orbit assembly of these large space structures. Such robotic construction presents a number of technical challenges. The structures will need to be made of lightweight materials and will be very flexible. Autonomous robots will require information about the vibrations of the flexible structures and their dynamic parameters in order to perform the construction efficiently. Because models of the structures are often imperfect, the magnitude of the structural vibrations must be estimated on-orbit. This thesis presents a method for estimating the shape and dynamic parameters of a vibrating large space structure. The technique is a cooperative sensing approach using remote free-flying robot observers equipped with vision sensors and structure-mounted accelerometers, and it exploits the complementary nature of the two types of sensors.
 
Vision sensors are able to measure structure deflections at high spatial frequency but are bandwidth limited. Accelerometers are able to make measurements at high temporal frequency but are sparsely located on the structure. The fused estimation occurs in three steps. First, the vision data is condensed by a modal decomposition that yields coarse estimates of the modal coefficients. In the second step, these coarse estimates are fused with the accelerometer measurements in a multi-rate nonlinear Kalman filter, resulting in a refined estimate of the modal coefficients and dynamic properties of the structure. In the final step, the estimated modal coefficients are combined with the mode shapes to provide a shape estimate of the entire structure. Simulation and experimental results demonstrate that the performance of this fused estimation approach is superior to the performance achieved when using only a single type of sensor.
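To make the three-step process described in the abstract concrete, the following is a minimal sketch in Python (NumPy) under simplifying assumptions that are not taken from the thesis: a 1-D structure with two known sinusoidal mode shapes, known modal frequencies and damping (so a linear multi-rate Kalman filter suffices here, whereas the thesis uses a nonlinear filter that also estimates the dynamic parameters), slow but spatially dense vision measurements, and fast but spatially sparse accelerometer measurements. All names and numerical values are illustrative.

```python
import numpy as np

# Structure model assumed for illustration: a pinned 1-D "beam" of length 10 m
# discretized at 50 points, with two sinusoidal mode shapes and known modal
# frequencies and damping ratios.
L_len, n_pts, n_modes = 10.0, 50, 2
x = np.linspace(0.0, L_len, n_pts)
Phi = np.column_stack([np.sin((i + 1) * np.pi * x / L_len) for i in range(n_modes)])
omega = 2 * np.pi * np.array([1.0, 2.5])      # modal frequencies [rad/s] (assumed)
zeta = np.array([0.01, 0.01])                 # modal damping ratios (assumed)

dt_acc, dt_vis = 0.01, 1.0                    # accelerometer (fast) / vision (slow) periods [s]
acc_idx = np.array([12, 37])                  # nodes carrying accelerometers (sparse)

# Modal state z = [q1, q2, qdot1, qdot2] with qddot = -2*zeta*omega*qdot - omega^2*q.
A_c = np.block([[np.zeros((2, 2)), np.eye(2)],
                [-np.diag(omega**2), -np.diag(2 * zeta * omega)]])
# Tustin (bilinear) discretization keeps the lightly damped modes numerically stable.
I4 = np.eye(4)
F = np.linalg.solve(I4 - 0.5 * dt_acc * A_c, I4 + 0.5 * dt_acc * A_c)
Q = 1e-6 * I4                                 # process noise covariance (tuning assumption)

# Measurement models: accelerometers see Phi_s @ qddot; vision (after step 1) gives coarse q.
Phi_s = Phi[acc_idx]
H_acc = np.hstack([-Phi_s @ np.diag(omega**2), -Phi_s @ np.diag(2 * zeta * omega)])
H_vis = np.hstack([np.eye(2), np.zeros((2, 2))])
R_acc, R_vis = 1e-3 * np.eye(2), 1e-2 * np.eye(2)

def kf_update(z, P, H, R, y):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return z + K @ (y - H @ z), (I4 - K @ H) @ P

rng = np.random.default_rng(0)
z_true = np.array([0.05, 0.02, 0.0, 0.0])     # "true" modal state of the vibrating structure
z_est, P = np.zeros(4), np.eye(4)

for k in range(int(5.0 / dt_acc)):            # 5 s of simulated free vibration
    z_true = F @ z_true                       # propagate the simulated structure
    z_est, P = F @ z_est, F @ P @ F.T + Q     # filter prediction at the fast rate

    # High-rate, spatially sparse accelerometer update.
    y_acc = H_acc @ z_true + 0.05 * rng.standard_normal(2)
    z_est, P = kf_update(z_est, P, H_acc, R_acc, y_acc)

    # Low-rate, spatially dense vision update.
    if k % int(dt_vis / dt_acc) == 0:
        w_vis = Phi @ z_true[:2] + 0.005 * rng.standard_normal(n_pts)  # noisy deflection "image"
        q_coarse, *_ = np.linalg.lstsq(Phi, w_vis, rcond=None)         # step 1: modal decomposition
        z_est, P = kf_update(z_est, P, H_vis, R_vis, q_coarse)         # step 2: fuse coarse modes

# Step 3: combine the estimated modal coefficients with the mode shapes.
shape_estimate = Phi @ z_est[:2]
print("estimated modal coefficients:", np.round(z_est[:2], 4))
```

The multi-rate structure appears in the loop: the accelerometer update runs at every fast step, while the vision-based modal decomposition and its fusion update run only when a new image arrives.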
 
Description
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2006.
 
Includes bibliographical references (leaves 81-84).
 
Date issued
2006
URI
http://hdl.handle.net/1721.1/35580
Department
Massachusetts Institute of Technology. Department of Aeronautics and Astronautics
Publisher
Massachusetts Institute of Technology
Keywords
Aeronautics and Astronautics.

Collections
  • Graduate Theses
