MYOSPAT: A SYSTEM FOR MANIPULATING SOUND AND LIGHT PROJECTIONS THROUGH HAND GESTURES

Di Donato, Balandino and Dooley, James (2017) MYOSPAT: A SYSTEM FOR MANIPULATING SOUND AND LIGHT PROJECTIONS THROUGH HAND GESTURES. In: Workshop on Intelligent Music Production 2017, 15th September, 2017, Salford, Manchester.


Abstract

MyoSpat is an interactive audio-visual system that aims to augment musical performances by empowering musicians to directly manipulate sound and light projections through hand gestures. We present the second iteration of the system, which draws on research findings from an evaluation of the first. MyoSpat 2 is designed and developed using the Myo gesture control armband as the input device and Pure Data (Pd) as the gesture recognition and audio-visual engine. The system is informed by human-computer interaction (HCI) principles: tangible computing and embodied, sonic and music interaction design (MiXD). This paper describes the system and its audio-visual feedback design. Finally, we present an evaluation of the system, its potential use in different multimedia contexts, and its role in exploring embodied, sonic and music interaction principles.

Item Type: Conference or Workshop Item (Paper)
Subjects: W300 Music
Divisions: Faculty of Arts, Design and Media > Royal Birmingham Conservatoire
Depositing User: James Dooley
Date Deposited: 21 Jun 2019 11:29
Last Modified: 21 Jun 2019 11:29
URI: http://www.open-access.bcu.ac.uk/id/eprint/5003
