Systems of Visualization for Musical Futures
Author(s)
Naseck, Perry
Thesis PDF (49.11 MB)
Advisor
Paradiso, Joseph A.
Terms of use
Metadata
Abstract
This thesis investigates how large-scale visual systems can communicate the presence, agency, and foresight of improvising musical agents (human and AI) during live performance. We propose a framework for manifesting AI collaborators on stage through five principles: musical transparency, live improvisational reactivity, demonstrated virtuosity, communication for collaboration, and visual fit. Two public performances operationalize these ideas: an addressable-light sculpture that renders harmonic space, and a stage-sized kinetic sculpture built from novel, low-cost Generic Pan Tilt fixtures that visualize the AI’s planned “musical futures.” The latter combines a real-time, MIDI-conditioned, Transformer-based hand-motion model with deterministic, pattern-based mappings that signal states such as resting and regeneration. Audience surveys indicate that viewers perceived links between musical turns and kinetic gestures while requesting clearer explanatory cues. We document the open-source hardware, firmware, and control protocols of the Generic Pan Tilt platform and reflect on design tradeoffs for accessibility, reliability, and expressivity. Finally, we outline a real-time analysis toolchain (motif detection, parallelism, and continuous energy/tension estimators) that emits OSC triggers for lighting, media, kinetic, and spatial-audio systems, enabling reactive shows beyond timecode. Together, these systems advance performable visualizations of human-improvised and AI-driven musical futures.
Date issued
2025-09
Department
Program in Media Arts and Sciences (Massachusetts Institute of Technology)
Publisher
Massachusetts Institute of Technology