| dc.contributor.author | Chin, Sam | |
| dc.contributor.author | Fitz-Gibbon, Emmie | |
| dc.contributor.author | Huang, Bingjian | |
| dc.contributor.author | Tims, Carter | |
| dc.contributor.author | Orzech, Gabrielle | |
| dc.contributor.author | Thoo, Yong-Joon | |
| dc.contributor.author | Paradiso, Joseph | |
| dc.date.accessioned | 2025-10-02T20:06:45Z | |
| dc.date.available | 2025-10-02T20:06:45Z | |
| dc.date.issued | 2025-09-27 | |
| dc.identifier.isbn | 979-8-4007-2036-9 | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/162879 | |
| dc.description | UIST Adjunct ’25, Busan, Republic of Korea | en_US |
| dc.description.abstract | Existing multi-actuator vibrotactile systems often require external hardware such as sound cards and haptic amplifiers, which limits portability and creates complexity for non-technical users. This presents a significant barrier for researchers and designers in fields like human factors and healthcare. We present Sound2Haptic, a vibrotactile toolkit that integrates a sound card and haptic amplifiers into a single device. The toolkit connects to laptops, phones, and XR headsets, enabling portable eight-channel multi-actuator interaction accessible to non-technical users. The toolkit features a novel mechanical design that reduces cross-actuator interference and enables form factor customization. We demonstrate the toolkit’s functional efficacy through psychophysical evaluation across three form factors, and its ease of use through three case studies: (1) a clinical application for tinnitus research, (2) a human factors study on speech prosody conducted with a human factors researcher, and (3) an exploration of spatial neglect rehabilitation using XR and haptics. | en_US |
| dc.publisher | ACM|The 38th Annual ACM Symposium on User Interface Software and Technology | en_US |
| dc.relation.isversionof | https://doi.org/10.1145/3746058.3758993 | en_US |
| dc.rights | Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. | en_US |
| dc.source | Association for Computing Machinery | en_US |
| dc.title | Sound2Haptic: A Toolkit for Portable Multi-Channel Haptic Integration Across Multiple Form Factors and Devices | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Sam Chin, Emmie Fitz-Gibbon, Bingjian Huang, Carter Tims, Gabrielle Orzech, Yong-Joon Thoo, and Joseph A. Paradiso. 2025. Sound2Haptic: A Toolkit for Portable Multi-Channel Haptic Integration Across Multiple Form Factors and Devices. In Adjunct Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology (UIST Adjunct '25). Association for Computing Machinery, New York, NY, USA, Article 54, 1–4. | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Media Laboratory | en_US |
| dc.identifier.mitlicense | PUBLISHER_POLICY | |
| dc.eprint.version | Final published version | en_US |
| dc.type.uri | http://purl.org/eprint/type/ConferencePaper | en_US |
| eprint.status | http://purl.org/eprint/status/NonPeerReviewed | en_US |
| dc.date.updated | 2025-10-01T07:50:56Z | |
| dc.language.rfc3066 | en | |
| dc.rights.holder | The author(s) | |
| dspace.date.submission | 2025-10-01T07:50:56Z | |
| mit.license | PUBLISHER_POLICY | |
| mit.metadata.status | Authority Work and Publication Information Needed | en_US |