Show simple item record

dc.contributor.author	Chin, Sam
dc.contributor.author	Fitz-Gibbon, Emmie
dc.contributor.author	Huang, Bingjian
dc.contributor.author	Tims, Carter
dc.contributor.author	Orzech, Gabrielle
dc.contributor.author	Thoo, Yong-Joon
dc.contributor.author	Paradiso, Joseph
dc.date.accessioned	2025-10-02T20:06:45Z
dc.date.available	2025-10-02T20:06:45Z
dc.date.issued	2025-09-27
dc.identifier.isbn	979-8-4007-2036-9
dc.identifier.uri	https://hdl.handle.net/1721.1/162879
dc.description	UIST Adjunct ’25, Busan, Republic of Korea	en_US
dc.description.abstract	Existing multi-actuator vibrotactile systems often require external hardware such as sound cards and haptic amplifiers, which limits portability and creates complexity for non-technical users. This presents a significant barrier for researchers and designers in fields like human factors and healthcare. We present Sound2Haptic, a vibrotactile toolkit that integrates a sound card and haptic amplifiers into a single device. The toolkit connects to laptops, phones, and XR headsets, enabling portable eight-channel multi-actuator interaction accessible to non-technical users. The toolkit features a novel mechanical design that reduces cross-actuator interference and enables form factor customization. We demonstrate the toolkit’s functional efficacy through psychophysical evaluation across three form factors, and its ease of use through three case studies: (1) a clinical application for tinnitus research, (2) a human factors study on speech prosody conducted with a human factors researcher, and (3) an exploration of spatial neglect rehabilitation using XR and haptics.	en_US
dc.publisher	ACM|The 38th Annual ACM Symposium on User Interface Software and Technology	en_US
dc.relation.isversionof	https://doi.org/10.1145/3746058.3758993	en_US
dc.rights	Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.	en_US
dc.source	Association for Computing Machinery	en_US
dc.title	Sound2Haptic: A Toolkit for Portable Multi-Channel Haptic Integration Across Multiple Form Factors and Devices	en_US
dc.type	Article	en_US
dc.identifier.citation	Sam Chin, Emmie Fitz-Gibbon, Bingjian Huang, Carter Tims, Gabrielle Orzech, Yong-Joon Thoo, and Joseph A. Paradiso. 2025. Sound2Haptic: A Toolkit for Portable Multi-Channel Haptic Integration Across Multiple Form Factors and Devices. In Adjunct Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology (UIST Adjunct '25). Association for Computing Machinery, New York, NY, USA, Article 54, 1–4.	en_US
dc.contributor.department	Massachusetts Institute of Technology. Media Laboratory	en_US
dc.identifier.mitlicense	PUBLISHER_POLICY
dc.eprint.version	Final published version	en_US
dc.type.uri	http://purl.org/eprint/type/ConferencePaper	en_US
eprint.status	http://purl.org/eprint/status/NonPeerReviewed	en_US
dc.date.updated	2025-10-01T07:50:56Z
dc.language.rfc3066	en
dc.rights.holder	The author(s)
dspace.date.submission	2025-10-01T07:50:56Z
mit.license	PUBLISHER_POLICY
mit.metadata.status	Authority Work and Publication Information Needed	en_US


Files in this item


This item appears in the following Collection(s)
