Arty: Expressive timbre transfer using articulation detection for guitar
Author(s)
Franjou, Sebastian
Download: Thesis PDF (2.150Mb)
Advisor
Egozy, Eran
Abstract
In this work, we propose a novel approach to timbre transfer. Timbre transfer is the transformation of one instrument's timbre to match the timbre of another instrument while preserving key musical information such as pitch and loudness. Current attempts tend to rely either on MIDI pitch and velocity information or on deep learning networks. The former approach requires discarding a great deal of information and hence suffers from a loss of expressivity, while the latter yields expressive but unstable and difficult-to-tune systems.
Arty aims to address this problem by augmenting the collected MIDI with expression data. By detecting instrument-specific playing techniques called articulations, and transcribing these articulations as MIDI data, Arty attempts to provide an expressive yet flexible alternative to the methods above for timbre transfer from guitar. The use of MIDI allows integration with other music performance systems and does not impose a particular sound synthesis method.
We created a new dataset, the Arty dataset, and used it in conjunction with existing data to train a model to classify right-hand and left-hand guitar playing techniques. We also implemented a website as a user interface that allows users to easily convert their guitar playing to MIDI. Arty achieved fairly high accuracy on the dataset, but the user study showed that Arty's real-world accuracy is much lower, in part because real-world data is different from, and more diverse than, our dataset. The user study did, however, reveal strong interest in such a system among advanced virtual instrument users.
Date issued
2022-09
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology