Show simple item record

dc.contributor.author    Piovarči, Michal
dc.contributor.author    Levin, David I. W.
dc.contributor.author    Rebello, Jason
dc.contributor.author    Chen, Desai
dc.contributor.author    Ďurikovič, Roman
dc.contributor.author    Pfister, Hanspeter
dc.contributor.author    Matusik, Wojciech
dc.contributor.author    Didyk, Piotr
dc.date.accessioned    2017-10-03T19:05:41Z
dc.date.available    2017-10-03T19:05:41Z
dc.date.issued    2016-07
dc.identifier.issn    0730-0301
dc.identifier.uri    http://hdl.handle.net/1721.1/111686
dc.description.abstract    Everyone, from a shopper buying shoes to a doctor palpating a growth, uses their sense of touch to learn about the world. 3D printing is a powerful technology because it gives us the ability to control the haptic impression an object creates. This is critical for both replicating existing, real-world constructs and designing novel ones. However, each 3D printer has different capabilities and supports different materials, leaving us to ask: How can we best replicate a given haptic result on a particular output device? In this work, we address the problem of mapping a real-world material to its nearest 3D-printable counterpart by constructing a perceptual model for the compliance of nonlinearly elastic objects. We begin by building a perceptual space from experimentally obtained user comparisons of twelve 3D-printed metamaterials. By comparing this space to a number of hypothetical computational models, we identify those that can be used to accurately and efficiently evaluate human-perceived differences in nonlinear stiffness. Furthermore, we demonstrate how such models can be applied to complex geometries in an interaction-aware way, where the compliance is influenced not only by the material properties from which the object is made but also by its geometry. We demonstrate several applications of our method in the context of fabrication and evaluate them in a series of user experiments.    en_US
dc.language.iso    en_US
dc.publisher    Association for Computing Machinery (ACM)    en_US
dc.relation.isversionof    http://dx.doi.org/10.1145/2897824.2925885    en_US
dc.rights    Creative Commons Attribution-Noncommercial-Share Alike    en_US
dc.rights.uri    http://creativecommons.org/licenses/by-nc-sa/4.0/    en_US
dc.source    Other univ. web domain    en_US
dc.title    An interaction-aware, perceptual model for non-linear elastic objects    en_US
dc.type    Article    en_US
dc.identifier.citation    Piovarči, Michal et al. “An Interaction-Aware, Perceptual Model for Non-Linear Elastic Objects.” ACM Transactions on Graphics 35, 4 (July 2016): 1–13. © 2016 The Author(s)    en_US
dc.contributor.department    Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory    en_US
dc.contributor.department    Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science    en_US
dc.contributor.mitauthor    Chen, Desai
dc.contributor.mitauthor    Matusik, Wojciech
dc.relation.journal    ACM Transactions on Graphics    en_US
dc.eprint.version    Author's final manuscript    en_US
dc.type.uri    http://purl.org/eprint/type/ConferencePaper    en_US
eprint.status    http://purl.org/eprint/status/NonPeerReviewed    en_US
dspace.orderedauthors    Piovarči, Michal; Levin, David I. W.; Rebello, Jason; Chen, Desai; Ďurikovič, Roman; Pfister, Hanspeter; Matusik, Wojciech; Didyk, Piotr    en_US
dspace.embargo.terms    N    en_US
dc.identifier.orcid    https://orcid.org/0000-0003-2336-6235
dc.identifier.orcid    https://orcid.org/0000-0003-0212-5643
mit.license    OPEN_ACCESS_POLICY    en_US
mit.metadata.status    Complete

