MyoSpat: a system for manipulating sound and light projections through hand gestures
Di Donato, Balandino and Dooley, James (2017) MyoSpat: a system for manipulating sound and light projections through hand gestures. In: Workshop on Intelligent Music Production 2017, 15th September, 2017, Salford, Manchester.
Abstract
MyoSpat is an interactive audio-visual system that aims to augment musical performances by empowering musicians to directly manipulate sound and light projections through hand gestures. We present the second iteration of the system, which draws on research findings that emerged from an evaluation of the first. MyoSpat 2 is designed and developed using the Myo gesture control armband as the input device and Pure Data (Pd) as the gesture recognition and audio-visual engine. The system is informed by human-computer interaction (HCI) principles: tangible computing and embodied, sonic and music interaction design (MiXD). This paper describes the system and the design of its audio-visual feedback. Finally, we present an evaluation of the system and its potential use in different multimedia contexts and in exploring embodied, sonic and music interaction principles.
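The abstract names the pipeline's two ends, Myo armband input and a Pd gesture-recognition and audio-visual engine, without detailing how gestures become sound parameters. Purely as an illustration of that kind of mapping, the sketch below classifies a synthetic Myo-style EMG frame by its mean absolute value and maps the result to a reverb mix. The feature, threshold, gesture labels, and mapping are all assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a gesture-to-sound mapping, standing in for the
# Pd patch the paper describes. Thresholds and mappings are illustrative
# assumptions, not MyoSpat's actual design.

def mean_absolute_value(emg_frame):
    """Mean absolute value (MAV), a standard EMG amplitude feature."""
    return sum(abs(s) for s in emg_frame) / len(emg_frame)

def classify_gesture(emg_frame, fist_threshold=0.4):
    """Label a frame 'fist' when muscle activation is high, else 'rest'."""
    return "fist" if mean_absolute_value(emg_frame) > fist_threshold else "rest"

def gesture_to_reverb_mix(gesture):
    """Map the recognised gesture to a wet/dry reverb mix in [0, 1]."""
    return 0.8 if gesture == "fist" else 0.1

# Synthetic 8-channel-style frames of normalised EMG samples.
clenched = [0.7, -0.6, 0.8, -0.5, 0.9, -0.7, 0.6, -0.8]
relaxed = [0.05, -0.02, 0.04, -0.03, 0.02, -0.05, 0.03, -0.04]

print(classify_gesture(clenched))                         # → fist
print(gesture_to_reverb_mix(classify_gesture(relaxed)))   # → 0.1
```

In a real deployment the frames would arrive as a live stream from the armband and the control value would be sent on to the audio engine (Pd accepts network messages, e.g. via its netreceive object), but the classify-then-map structure stays the same.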
Item Type: Conference or Workshop Item (Paper)
Dates:
Subjects: CAH25 - design, and creative and performing arts > CAH25-02 - performing arts > CAH25-02-02 - music
Divisions: Faculty of Arts, Design and Media > Royal Birmingham Conservatoire
Depositing User: James Dooley
Date Deposited: 21 Jun 2019 11:29
Last Modified: 12 Jan 2022 16:54
URI: https://www.open-access.bcu.ac.uk/id/eprint/5003