Show simple item record

dc.contributor.advisor: Wojciech Matusik and Tomas Palacios. [en_US]
dc.contributor.author: Luo, Yiyue (Computer scientist), Massachusetts Institute of Technology. [en_US]
dc.contributor.other: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science. [en_US]
dc.date.accessioned: 2020-11-24T17:31:58Z
dc.date.available: 2020-11-24T17:31:58Z
dc.date.copyright: 2020 [en_US]
dc.date.issued: 2020 [en_US]
dc.identifier.uri: https://hdl.handle.net/1721.1/128628
dc.description: Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, May, 2020 [en_US]
dc.description: Cataloged from student-submitted PDF version of thesis. [en_US]
dc.description: Includes bibliographical references (pages 59-63). [en_US]
dc.description.abstract: Humans perform complex tasks in the real world thanks to rich and constant tactile perceptual input. Being able to record such tactile data would allow scientists from various disciplines to study human activities more fundamentally and quantitatively. Moreover, capturing large and diverse datasets on human-environment interactions and coupling them with machine learning models would allow the development of future intelligent robotic systems that mimic human behavior. Here, we present a textile-based tactile learning platform that enables researchers to record, monitor, and learn human activities and the associated interactions. Realized with inexpensive piezoresistive fibers (0.2 USD/m) and automated machine knitting, our functional textiles offer dense coverage (> 1000 sensors) over large complex surfaces (> 2000 cm²). Further, we leverage machine learning for sensing correction, ensuring that our system is robust against potential variations from individual receptors. To validate the capability of our sensor, we capture diverse human-environment interactions (> 1,000,000 tactile frames). We demonstrate that machine learning techniques can be used with our data to classify human activities, predict whole-body poses, and discover novel motion signatures. This work opens up new possibilities in wearable electronics, healthcare, manufacturing, and robotics. [en_US]
dc.description.statementofresponsibility: by Yiyue Luo. [en_US]
dc.format.extent: 63 pages [en_US]
dc.language.iso: eng [en_US]
dc.publisher: Massachusetts Institute of Technology [en_US]
dc.rights: MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy, which is available through the URL provided. [en_US]
dc.rights.uri: http://dspace.mit.edu/handle/1721.1/7582 [en_US]
dc.subject: Electrical Engineering and Computer Science. [en_US]
dc.title: Discovering the patterns of human-environment interactions using scalable functional textiles [en_US]
dc.type: Thesis [en_US]
dc.description.degree: S.M. [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science [en_US]
dc.identifier.oclc: 1204268806 [en_US]
dc.description.collection: S.M. Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science [en_US]
dspace.imported: 2020-11-24T17:31:57Z [en_US]
mit.thesis.degree: Master [en_US]
mit.thesis.department: EECS [en_US]

