Personalized Animations for Affective Feedback: Generative AI Helps to Visualize Skin Conductance
Author(s)
Scheirer, Jocelyn; Picard, Rosalind; Cantrell, Aubrey
Terms of use
Creative Commons Attribution
Abstract
Biofeedback interfaces traditionally rely on abstract visualizations,
tones, or haptics to convey physiological states—but these often lack
personal relevance, emotional salience, and engagement. In this
paper, we present a novel system that bridges wearable sensing and
generative AI to create real-time, personalized animated
biofeedback experiences. In our system, users describe emotionally
meaningful objects or scenes to a language model, which then
generates customized Processing animations. These animations are
then dynamically driven by electrodermal activity (EDA) signals
from a wrist sensor. We co-design and evaluate the system with
autistic adults, many of whom have unique “special interests” that
are likely to engage them more than a one-size-fits-all
visualization. Many of these individuals also have difficulty with
interoception: feeling or sensing their own internal and
physiological state changes. We built this tool to transform passive
physiological monitoring into an interactive multimedia
experience, where the visual representation of the body is authored
by the user. We introduce a prompt-engineered GPT-based
interface that streamlines code generation, sensor mapping, and
iterative refinement, requiring no prior coding expertise. The
technical pipeline we built includes signal filtering, dynamic
parameter mapping, and natural language-based customization—
delivering a real-time, visually immersive feedback loop. We report
on initial case studies with 12 autistic adults using the system,
which highlight both the expressive potential and individual
variability of user responses, reinforcing the need for adaptable
multimedia frameworks in health technologies. By merging
real-time physiological data with generative animation and natural
language interaction, this work expands the creative frontier of
personalized affective biofeedback. We also address ethical
challenges arising from using AI with physiological sensors.
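
To make the pipeline concrete, here is a minimal Processing-style sketch of the kind of signal filtering and dynamic parameter mapping the abstract describes. The smoothing constant, the EDA range, and the readSensor() stand-in are illustrative assumptions, not the authors' implementation.

// Illustrative Processing sketch: smooth a raw EDA sample with a simple
// exponential moving-average filter, then map the filtered value onto
// animation parameters (size and hue of a pulsing circle).
float edaSmoothed = 0;    // filtered EDA value, in microsiemens
final float ALPHA = 0.1;  // smoothing factor (assumed, not from the paper)

void setup() {
  size(600, 600);
  noStroke();
}

void draw() {
  float edaRaw = readSensor();  // placeholder for the wrist-sensor stream
  edaSmoothed = ALPHA * edaRaw + (1 - ALPHA) * edaSmoothed;  // low-pass filter

  // Dynamic parameter mapping: an assumed 0.5-10 uS range drives the visuals.
  float diameter = map(edaSmoothed, 0.5, 10.0, 50, 400);
  float hue = map(edaSmoothed, 0.5, 10.0, 160, 0);

  colorMode(HSB, 360, 100, 100);
  background(230, 20, 15);
  fill(hue, 80, 95);
  ellipse(width / 2, height / 2, diameter, diameter);
}

// Stand-in for real serial/Bluetooth input from the wrist sensor.
float readSensor() {
  return 5 + 4 * sin(frameCount * 0.02);  // synthetic signal for testing
}

An exponential moving average is a common low-cost choice for smoothing electrodermal activity before mapping it onto visual parameters; the actual system's filtering and mappings are generated and refined through the GPT-based interface described above.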
Description
MRAC '25, October 27–28, 2025, Dublin, Ireland
Date issued
2025-10-26
Department
Massachusetts Institute of Technology. Media Laboratory
Publisher
ACM | Proceedings of the 3rd International Workshop on Multimodal and Responsible Affective Computing
Citation
Jocelyn Scheirer, Rosalind Picard, and Aubrey Cantrell. 2025. Personalized Animations for Affective Feedback: Generative AI Helps to Visualize Skin Conductance. In Proceedings of the 3rd International Workshop on Multimodal and Responsible Affective Computing (MRAC '25). Association for Computing Machinery, New York, NY, USA, 146–151.
Version: Final published version
ISBN
979-8-4007-2052-9