Show simple item record

dc.contributor.author: Chin, Sam
dc.contributor.author: Fitz-Gibbon, Emmie
dc.contributor.author: Huang, Bingjian
dc.contributor.author: Paradiso, Joseph
dc.date.accessioned: 2025-11-26T17:12:02Z
dc.date.available: 2025-11-26T17:12:02Z
dc.date.issued: 2025-10-22
dc.identifier.isbn: 979-8-4007-0676-9
dc.identifier.uri: https://hdl.handle.net/1721.1/164077
dc.description: ASSETS ’25, Denver, CO, USA [en_US]
dc.description.abstract: Age-related hearing loss is often caused by degradation of cochlear hair cells. This poses a challenge for hearing aids, which rely on sound amplification: once hearing ability at a specific frequency is lost, amplification alone provides little benefit. Previous haptic systems have addressed this with complete sensory substitution, converting audio signals such as phonemes into tactile patterns. However, these systems require a significant amount of time to learn and impose high cognitive load during haptic perception. Our system, HapticHearing, takes an alternative approach: it leverages a user’s residual hearing and complements it with tactile feedback. We present a custom multi-actuator haptic device designed to translate phonemic information from speech into tactile patterns customized to a user’s hearing loss and speech perception abilities. The system consists of a microphone for speech capture, four-band energy envelope extraction with vowel embedding, a custom USB-to-haptic driver PCB, and wearable devices containing eight vibrotactile actuators that deliver personalized tactile feedback based on the user’s audiogram. Psychophysical validation (n=9) showed that neck-worn devices achieved better spatial localization (67% vs. 53%), while bracelet and necklace devices had lower detection thresholds than the over-ear device (0.09 vs. 0.18). [en_US]
dc.publisher: ACM | The 27th International ACM SIGACCESS Conference on Computers and Accessibility [en_US]
dc.relation.isversionof: https://doi.org/10.1145/3663547.3759754 [en_US]
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. [en_US]
dc.source: Association for Computing Machinery [en_US]
dc.title: HapticHearing: A Haptic Feedback System for Complementing Auditory Speech Perception for Mild-to-Moderate Hearing Loss [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Sam Chin, Emmie Fitz-Gibbon, Bingjian Huang, and Joseph A. Paradiso. 2025. HapticHearing: A Haptic Feedback System for Complementing Auditory Speech Perception for Mild-to-Moderate Hearing Loss. In The 27th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ’25), October 26–29, 2025, Denver, CO, USA. ACM, New York, NY, USA, 5 pages. [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Media Laboratory [en_US]
dc.identifier.mitlicense: PUBLISHER_POLICY
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dc.date.updated: 2025-11-01T07:47:21Z
dc.language.rfc3066: en
dc.rights.holder: The author(s)
dspace.date.submission: 2025-11-01T07:47:21Z
mit.license: PUBLISHER_POLICY
mit.metadata.status: Authority Work and Publication Information Needed [en_US]
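The abstract above mentions four-band energy envelope extraction from captured speech as one stage of the HapticHearing pipeline. As a rough illustration only, here is a minimal sketch of per-band energy envelope extraction via a short-time FFT; the band edges, frame length, hop size, and function name are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def band_envelopes(signal, sr,
                   bands=((80, 500), (500, 1000), (1000, 2000), (2000, 4000)),
                   frame_len=1024, hop=512):
    """Return a (n_bands, n_frames) array of per-frame band energies.

    Each frame is Hann-windowed, transformed with an FFT, and the power
    spectrum is summed within each frequency band to form an envelope
    that could drive one vibrotactile channel per band.
    """
    n_frames = 1 + max(0, (len(signal) - frame_len) // hop)
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / sr)
    window = np.hanning(frame_len)
    env = np.zeros((len(bands), n_frames))
    for i in range(n_frames):
        frame = signal[i * hop: i * hop + frame_len] * window
        power = np.abs(np.fft.rfft(frame)) ** 2
        for b, (lo, hi) in enumerate(bands):
            mask = (freqs >= lo) & (freqs < hi)
            env[b, i] = power[mask].sum()
    return env

# Quick check: a 1 kHz tone should concentrate energy in the third band.
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 1000 * t)
env = band_envelopes(tone, sr)
dominant_band = int(np.argmax(env.sum(axis=1)))
```

In a real system each envelope would then be compressed and mapped to actuator drive levels according to the user's audiogram; that personalization step is specific to the paper's hardware and is not sketched here.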

