Show simple item record

dc.contributor.author: Xiao, Bei
dc.contributor.author: Bi, Wenyan
dc.contributor.author: Jia, Xiaodan
dc.contributor.author: Wei, Hanhan
dc.contributor.author: Adelson, Edward H
dc.date.accessioned: 2017-07-11T14:48:18Z
dc.date.available: 2017-07-11T14:48:18Z
dc.date.issued: 2016-02
dc.date.submitted: 2015-06
dc.identifier.issn: 1534-7362
dc.identifier.uri: http://hdl.handle.net/1721.1/110625
dc.description.abstract: Humans can often estimate tactile properties of objects from vision alone. For example, during online shopping, we can often infer material properties of clothing from images and judge how the material would feel against our skin. What visual information is important for tactile perception? Previous studies in material perception have focused on measuring surface appearance, such as gloss and roughness, and using verbal reports of material attributes and categories. However, in real life, predicting the tactile properties of an object might not require accurate verbal descriptions of its surface attributes or categories. In this paper, we use tactile perception as ground truth to measure visual material perception. Using fabrics as our stimuli, we measure how observers match what they see (photographs of fabric samples) with what they feel (physical fabric samples). The data show a significant main effect of color: removing color significantly reduces accuracy, especially when the images contain 3-D folds. We also find that images of draped fabrics, which reveal 3-D shape information, yield better matching accuracy than images of flattened fabrics. The data show a strong interaction between color and folding conditions on matching accuracy, suggesting that the visual system takes advantage of chromatic gradients to infer tactile properties in 3-D folding conditions but not in flattened conditions. Together, using a visual–tactile matching task, we show that humans use folding and color information in matching the visual and tactile properties of fabrics. [en_US]
dc.description.sponsorship: Google (Firm) (Faculty Grant) [en_US]
dc.language.iso: en_US
dc.publisher: Association for Research in Vision and Ophthalmology [en_US]
dc.relation.isversionof: http://dx.doi.org/10.1167/16.3.34 [en_US]
dc.rights: Creative Commons Attribution-NonCommercial-NoDerivs License [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/ [en_US]
dc.source: Journal of Vision [en_US]
dc.title: Can you see what you feel? Color and folding properties affect visual–tactile material discrimination of fabrics [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Xiao, Bei et al. “Can You See What You Feel? Color and Folding Properties Affect Visual–tactile Material Discrimination of Fabrics.” Journal of Vision 16.3 (2016): 34. © 2015 Association for Research in Vision and Ophthalmology [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences [en_US]
dc.contributor.mitauthor: Jia, Xiaodan
dc.contributor.mitauthor: Wei, Hanhan
dc.contributor.mitauthor: Adelson, Edward H
dc.relation.journal: Journal of Vision [en_US]
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/JournalArticle [en_US]
eprint.status: http://purl.org/eprint/status/PeerReviewed [en_US]
dspace.orderedauthors: Xiao, Bei; Bi, Wenyan; Jia, Xiaodan; Wei, Hanhan; Adelson, Edward H. [en_US]
dspace.embargo.terms: N [en_US]
dc.identifier.orcid: https://orcid.org/0000-0003-2222-6775
mit.license: PUBLISHER_CC [en_US]
mit.metadata.status: Complete

