
Bean: A Digital Musical Instrument for Use in Music Therapy

Nicholas J. Kirwan
Department of Architecture, Design & Media Technology
Medialogy section
Aalborg University Copenhagen
[email protected]
Winter 2014
 

Abstract
The use of interactive technology in music therapy is growing, and with good reason: the flexibility afforded by these technologies is substantial. Presented here are the initial steps in the development of a Digital Musical Instrument designed for use in a music therapy setting. An informal evaluation was performed, including both clients and therapists, in order to assess the current state of development and provide direction for improvement going forward. Both the strengths and the weaknesses of the design at the time of the evaluation were assessed. Using this information, the design has been updated, and is now closer to a state appropriate for further formal evaluation.

Introduction
A basic working definition of music therapy is the use of music as a tool in a therapeutic setting. Tailored to the individual needs of the client, this tool can be used to achieve therapeutic goals such as enabling communication or improving motor skills [1]. The flexible nature of a Digital Musical Instrument's (DMI)1 sonic output and control possibilities could be a powerful addition to the arsenal of a music therapist. Indeed, it has been shown that the use of electronic musical technologies has an impact on outcomes relating to communication and expression [2], while also enabling a sense of achievement and empowerment [3]. As mentioned, communication is a common goal in this form of therapy. Facilitating performance and ancillary gestures through tangible interaction could therefore lead to expressive communication when combined with music [4]. For some clients the "up to date technology" itself can be a positive and engaging factor in music therapy, in addition to the possibility of new and interesting sounds or "new sound worlds" [5]. The use of novel technologies in music therapy can, however, pose practical as well as design problems. For instance, can clients easily understand that the musical contribution is of their making? Is the control of these contributions intuitive and understandable? Is the experience of using these technologies engaging, with enough variance to hold interest? These issues are not specific to music therapy, but are universal factors in DMI design, for example the "ubiquitous mapping problem" [6]. Effective handling of these factors could be of even greater importance when the user has complex needs. In this paper, the iterative development of Bean, a novel tangible DMI, is presented. Bean was created to investigate problems like those mentioned above, and to help provide some answers. After outlining the background research relevant to the design, current popular technologies used in music therapy are discussed. After this, the design and construction process of Bean is elaborated on, covering the design, hardware and software. Next, an initial client evaluation is described, followed by a discussion and conclusion. Finally, future plans for the development of Bean are discussed.

1 "Digital musical instrument" is used here as described in the first paragraphs of [15].
Background
As mentioned, the use of technology in music therapy has the potential for many positive applications, but the therapist must have the required knowledge to use these technologies effectively in a therapeutic setting [7]. Research has been conducted investigating technology use in music therapy [7][8]. Interestingly, these studies make clear that distance sensing2 is the most frequently used sensing mode when sensing technologies are used. Tangible interface use is not as widespread in music therapy, which is understandable, since a percentage of clients have physical disabilities that could hinder such interaction. Despite this lack of total inclusivity, there is still a need for the option of tangible interaction for clients with the ability, ideally to enable an embodied musical experience.
A framework for the use of music therapy related technologies was developed through an investigation of music therapists' experience with technology use [2] (Figure 1). The data gathered there can, in part, be used to design technologies suited to this setting. On reflection, the first two points are aimed more at informing the therapist, and have little relevance to the design of instruments. In essence these points address the resources available to the therapist, and the evaluation of suitable sensor technologies to fit the individual client's challenges and needs. Of course, "understanding movement" could be seen as a relevant element of developing a gesture-based instrument. The context, as used by Magee and Burland (2008) in this framework, seems to describe a subjective evaluation of the client's physical needs by the therapist. The three last points can be intrinsically linked to the functionality and design of DMIs such as Bean. In the context of DMI design, however, these elements would be more intuitive in the following order: cause/effect and a sense of agency is the primary element; after this comes enabling the client through effective mapping, which should lead to musical play that holds the interest of the user.

 
Figure 1: Framework for technology use in music therapy, adapted to aid the design of therapeutically oriented DMIs.

2 Referring to infrared distance sensing, as used in Soundbeam: http://www.soundbeam.co.uk/

Cause and effect

"Musical relationship between client and therapist depends upon establishing the client's understanding that they are the agent of music making." [2]

This statement could be seen as alluding to a sense of agency3 [9]. Paine and Drummond (2009) categorize agency into two approaches in relation to DMI design. The first: the control of predetermined sequences of sounds, such as triggering sounds in sample-based software. The second: the creation of sound through real-time manipulation of software synthesis variables. Furthermore, when the creation paradigm is designed for, it is suggested that immediate agency should be facilitated, accounting for primary causality in the use of the DMI. Immediate agency and corresponding feedback could be seen as modeling the cause and effect cycle.

Enabling the client

Magee and Burland (2008) take a practical view of enabling the client, mentioning switch or sensor placement in relation to the client's difficulties, which is very similar to the aforementioned understanding movement element of the process. The focus is mostly on physical impairments. Mapping is nonetheless also briefly mentioned.
From an interactive interface design point of view, the main tool in enabling a client/user is the effective implementation of mapping, the importance of which has been investigated [10]. Mapping can be said to largely define the user interaction and experience [6]. An effective mapping strategy would enable the client to interact effectively with the musical content. A client with complex needs might also benefit from a transparent mapping strategy, which could be complemented by cross-modal feedback such as visual cues similar to those discussed in [11], p. 52. Transparency in this context can be defined as an easily understandable connection from action to audible change.
Musical Play
Playing music is inherent in music therapy, but aesthetic quality is secondary to the effectiveness of music as a tool to achieve a goal. Sound design and the aural feedback framework are, along with mapping, central to this topic. The effectiveness of the sound design, and the amount of control over these sounds, can influence the amount of time a client is willing to spend playing the instrument. Effective integration of these aspects could lead to sustained play. It is not necessarily the quality of the musical content, but rather the sustained interest in the content, that provides a tool to facilitate communication and expression in a therapeutic setting.

                                                                                                               
3 Agency as a term has many different meanings, but its use in this paper can be defined as the ability to act, and to understand the causal significance of one's actions. Feedback is paramount in facilitating agency in the context of DMI use [12].
Related work
In this section, some of the most popular interactive technologies currently in use in music therapy are examined. According to a survey of over 600 therapists from around the globe, Soundbeam was the most popular interactive technology in use [8], followed in second place by MIDIcreator4. As regards tangible interfaces, a notable commercially available example is the Skoog5 [7] (Figure 2).

Figure 2: Examples of technologies used in music therapy.

Soundbeam
The Soundbeam is a powerful, complete system, incorporating sample-based sound production and preset sequencer aspects. The method of interaction is through distance sensing and switches. It provides a particularly effective platform for clients with physical disabilities, where the smallest movement, even chest movement while breathing, can be translated into sound. Via MIDI6, this system can communicate with external MIDI-enabled hardware and software music devices. However, the system lacks an option for tangible, embodied interaction.

MIDIcreator
This system is very similar to Soundbeam in that it is a self-sufficient interactive music-making system. Various plug-and-play sensors can be used to trigger the preset sounds. MIDIcreator offers a more varied sensor choice; apart from motion sensing there is, for example, squeeze sensing, two-axis acceleration sensing and a cushion weight sensor. Like the Soundbeam, MIDIcreator also has MIDI pass-through functionality.

                                                                                                               
4 http://www.midicreator-resources.co.uk/
5 http://www.skoogmusic.com/
6 http://www.midi.org/aboutmidi/index.php
Skoog
The Skoog is a commercially available tangible interface aimed at novice musicians. It is also well suited to music therapy situations; in fact, preliminary case studies were carried out in which no adverse reactions to the Skoog were encountered7. The modes of interaction the Skoog provides are tapping, shaking, squeezing and twisting, which are assignable through the supplied software. There is a very interesting visual/colour cue system designed as a pedagogical aid. Structurally, the Skoog is a cube-like shape with five semi-spherical protruding buttons. Each button is a different colour, which is mirrored in the accompanying software. The interface can therefore be played by colour cues through visual feedback from the software. As with the other systems mentioned so far, the Skoog has MIDI out capabilities. There is no spatial change sensing; the Skoog is a static interface.

Bean
Bean is a novel, gesturally controlled digital musical instrument. The user interaction is minimalistic, consisting of spatial movement of the instrument along with two push buttons; the instrument is played by a combination of these two modes of interaction. This simplicity is an intentional design feature, with transparency in mind. Although primarily a musical instrument, Bean integrates some visual aspects. Direct visual feedback from the instrument itself is mirrored in accompanying software, where a 3D virtual representation of the instrument can be seen. These aspects were also developed with the aim of encouraging an immediate sense of agency. In essence, the instrument can be seen as having both a physical and a virtual segment. The construction of Bean, as well as an outline of the system that runs behind it, is elaborated on next.

Constructing Bean
Initial Design
Bean is ellipsoidal in shape, which innately fits well between two hands. The initial step in realizing this shape for rapid prototyping was the use of 3D imaging software. Meshmixer8 was first used to create a 3D model of an ellipsoid. After this, 123D Make9 was used. 123D Make is a powerful 3D modeling application that facilitates the segmenting of 3D shapes to provide laser-cuttable templates (Figure 3). These templates can then be cut and used to reconstruct the 3D shape in a press-fit format. The templates were then transferred to CorelDRAW10, the graphics software that drives the laser cutter. Using CorelDRAW, the template was modified to enable secure attachment of the internal hardware. Several iterations were cut during a fine-tuning process for both fit and size. The material used to manufacture the press-fit skeleton was 3 mm hardboard. CorelDRAW and the laser cutter were also used to cut the button tops from 3 mm acrylic sheet; these additions were needed to increase the pressable surface area of each button.
                                                                                                               
7 http://www.skoogmusic.com/community/case-study/watsonej-summary
8 http://www.meshmixer.com/
9 http://www.123dapp.com/make
10 http://www.coreldraw.com/rw/product/graphic-design-software/?hptrack=eu2bb1
 

 
Figure 3: The partially assembled press-fit structure, showing the modification for the attachment of hardware.

Finally, the outer surface covering consists of layers of PVC foil11 covered by a double layer of nylon from a pair of stockings. This covering has a dual purpose: the first is to restrict access to the internal hardware by enclosing the skeletal frame; the second is partly cosmetic, diffusing the internal light source and making Bean pleasing to the eye.

Hardware
Embedded computing is at the heart of Bean (Figure 4). A Teensy 3.012, a compact Arduino13-compatible USB microcontroller, is the "brain" of the physical segment of the instrument, i.e. the ellipsoid. The Teensy board powers up and initiates communication with the Wii Nunchuck14 board. It then receives all the sensor data, turns the relevant data into direct visual feedback, and transmits all the data onward over serial communication to the computer. For ease of connection, the Teensy was mounted on a custom-made circuit board, which allowed for reliable connection and disconnection of both the Nunchuck board and the LED.
 

                                                                                                               
11 Also known as cling film, commonly used for food storage.
12 https://www.pjrc.com/teensy/index.html
13 http://arduino.cc/
14 The Wii Nunchuck is a controller for use with the Nintendo Wii game console.
 
Figure 4: The internal layout of Bean.

The sensor unit is in fact a modified Wii Nunchuck, altered so that the original buttons could be extended away from the body of the Nunchuck and placed on the outer shell of the instrument. The main sensor is the Nunchuck's on-board accelerometer. This sensor enables movement tracking in both pitch (X-axis) and roll (Y-axis), and jolt detection vertically (Z-axis). The two buttons give access to extra control parameters.
The RGB, or multicolour, LED is an individually addressable LED containing a WS280115 control chip. The direct visual feedback mentioned earlier is provided by light from this on-board LED.

Software
A system overview can be seen in Figure 5, a data flow diagram that outlines the process of turning raw sensor data into aural and visual feedback. To facilitate this process, a number of software components were created.

Figure 5: A data flow diagram showing the sensor data, control paths and feedback of the system.

                                                                                                               
15 http://www.adafruit.com/datasheets/WS2801.pdf
Sensor Input
The first step in the development of the software used in the instrument was to program the Teensy microcontroller. An Arduino sketch was created that enables the Teensy to initialize the Wii Nunchuck using the I2C16 communication protocol, and to begin receiving the sensor data. The LED is also initialized by this sketch, and is communicated with using the SPI17 communication protocol. The sketch also directly maps certain sensor data to different colours produced by the LED. The final step is the formatting and transmission of the sensor data over the serial bus to the laptop.
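The firmware itself is not reproduced in this paper, but the steps just described map onto a short Arduino-style sketch. The following is a minimal illustration only: the Nunchuck init bytes and report layout are the widely documented ones (I2C address 0x52, six-byte reports, active-low buttons), while the colour mapping and serial format shown here are placeholder stand-ins, not Bean's actual firmware.

```cpp
#include <Wire.h>
#include <SPI.h>

const uint8_t NUNCHUCK_ADDR = 0x52;
uint8_t report[6];

void setup() {
  Serial.begin(115200);
  Wire.begin();
  SPI.begin();
  Wire.beginTransmission(NUNCHUCK_ADDR);   // "unencrypted" init sequence
  Wire.write(0xF0); Wire.write(0x55);
  Wire.endTransmission();
  Wire.beginTransmission(NUNCHUCK_ADDR);
  Wire.write(0xFB); Wire.write(0x00);
  Wire.endTransmission();
}

void loop() {
  Wire.beginTransmission(NUNCHUCK_ADDR);   // request the next 6-byte report
  Wire.write(0x00);
  Wire.endTransmission();
  Wire.requestFrom(NUNCHUCK_ADDR, (uint8_t)6);
  for (int i = 0; i < 6 && Wire.available(); i++) report[i] = Wire.read();

  uint8_t accX = report[2], accY = report[3], accZ = report[4];
  bool zPressed = !(report[5] & 0x01);     // buttons are active-low
  bool cPressed = !(report[5] & 0x02);

  // WS2801: shift out one RGB triplet, then idle >500 us to latch.
  // Mapping acceleration straight to colour is a placeholder here.
  SPI.transfer(accX); SPI.transfer(accY); SPI.transfer(accZ);
  delayMicroseconds(600);

  // Forward everything to the laptop as one comma-separated line.
  Serial.print(accX); Serial.print(',');
  Serial.print(accY); Serial.print(',');
  Serial.print(accZ); Serial.print(',');
  Serial.print(zPressed); Serial.print(',');
  Serial.println(cPressed);
  delay(10);
}
```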

Aural feedback
The concept behind the current implementation of aural feedback is that of harmonic backing chords, which shift autonomously. This harmony provides a musical setting, a starting point. Over this, the client has the opportunity to improvise using a solo voice, which is governed by certain rules enabling the client to easily find notes that fit with these chords. When "fit" is used here, it is with the understanding that music is subjective, and that people may have differing opinions on which notes successfully fit with certain chords. In the context of this paper, the word means that the notes available to the client are harmonically consonant with the backing chords.
The harmonic content of the chords is noncomplex in nature. The four chords are Cmaj9, Dmin9, Emin7 and Fmaj9. These chords use only notes from the C major scale, and are therefore relatively close, harmonically speaking. The major advantage of using these chords for the accompanying element of the aural feedback is that all the notes of the C major pentatonic scale18 fit with them. For this reason, the notes of the C pentatonic scale are used for the solo voice element of the aural feedback. To provide more content to choose from, two octaves are used, totaling 10 tones available to the user during a solo. Another group of tones is made available to the user when the instrument is shaken briefly; these notes constitute an A blues scale19. This new state lasts for 30 seconds, providing an option for tonal variance and possible dissonance in the solo, before the pentatonic tone mode is re-engaged.
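As a concrete reference, the two note pools and the 30-second timeout can be written out as follows. This is a sketch only: the actual tables live in the Pd sub-patches described below, and the octave placement (C4 to A5 for the pentatonic pool, A3 to A4 for the blues pool) is an assumption based on the note counts given above.

```cpp
#include <array>
#include <cstddef>

// 10 C-pentatonic tones over two octaves (solo voice pool), in Hz.
const std::array<float, 10> C_PENTATONIC = {
  261.63f, 293.66f, 329.63f, 392.00f, 440.00f,   // C4 D4 E4 G4 A4
  523.25f, 587.33f, 659.26f, 783.99f, 880.00f }; // C5 D5 E5 G5 A5

// 7-tone A blues scale engaged for 30 s after a jolt.
const std::array<float, 7> A_BLUES = {
  220.00f, 261.63f, 293.66f, 311.13f,            // A3 C4 D4 D#4
  329.63f, 392.00f, 440.00f };                   // E4 G4 A4

const unsigned long BLUES_WINDOW_MS = 30000;
unsigned long bluesUntil = 0;  // timestamp when blues mode expires

// Called when the Z-axis jolt is detected; re-triggering extends the window.
void onJolt(unsigned long nowMs) { bluesUntil = nowMs + BLUES_WINDOW_MS; }

float frequencyForIndex(std::size_t i, unsigned long nowMs) {
  if (nowMs < bluesUntil) return A_BLUES[i % A_BLUES.size()];
  return C_PENTATONIC[i % C_PENTATONIC.size()];
}
```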
Pure Data
Aural feedback was implemented using Pure Data, a graphical programming language. Bean.pd is the main hub, where the sensor data is received and formatted. Open Sound Control20 (OSC) is used to transmit the sensor data into this patch. Formatting, in this context, means that the accelerometer data and the current state of both buttons are transformed into data usable by the synthesizers and control elements; e.g. accelerometer roll data is received as numbers between 70 and 170, then scaled to a number between 0 and 1. This is done for practical reasons: a number between 0 and 1 is easier for the designer to assess relatively. There are also OSC control messages broadcast from Bean.pd. These messages are composed using the sub-patch OSCreturn.pd, and have the purpose of controlling certain aspects of the visual feedback. The reason for these messages is elaborated on in a later section of this paper.

16 http://www.i2c-bus.org/
17 http://arduino.cc/en/Reference/SPI
18 The pentatonic scale is perhaps best known from the black keys of a piano. The C pentatonic scale comprises the notes C, D, E, G, A and C at the octave.
19 The A blues scale is a common scale used in jazz improvisation; it has a dissonant note available in this context. The notes are A, C, D, D#, E, G and A at the octave, D# being an idiomatic dissonant tone.
20 http://opensoundcontrol.org/introduction-osc
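Written out as a plain function, the scaling described above is simply a clamp and rescale from the raw 70-170 range to 0-1 (the clamping at the edges is an assumption; the Pd patch may handle out-of-range values differently):

```cpp
// Normalise a raw accelerometer roll reading (nominally 70-170) to 0-1.
float normaliseRoll(int raw) {
  if (raw < 70)  raw = 70;
  if (raw > 170) raw = 170;
  return (raw - 70) / 100.0f;
}
```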
The frequencies equating to the notes of both the C pentatonic and the A blues scales are held in the sub-patches Cpenta.pd and Ablues.pd. When selected, a frequency is sent to the synthesizer sub-patch soloSynth.pd and the corresponding note is played. soloSynth.pd receives the frequency information from either Cpenta.pd or Ablues.pd and translates these frequencies into notes; the user's solo voice is composed of these notes. soloSynth.pd is a monophonic synthesizer. The method of sound creation is a combination of additive synthesis and frequency modulation synthesis. The additive synthesis comprises a fundamental and three partials. These partials are individually adjusted in amplitude to provide an element of timbre change. Frequency modulation is used to add complexity to the aural content of the user's solo. An ADSR21 envelope is also implemented here. This envelope enables amplitude shaping of the output from soloSynth.pd, which leads to a more responsive solo sound.
Another sub-patch within Bean.pd, namely Backing.pd, is where the accompanying harmonic element of the aural feedback is created. Contained in this patch is a bank of five additive synthesizers, one for each note in the harmony. Each of these synthesizers in turn produces a tone constructed from a fundamental and three partials. These tones combine to give a full yet somehow open sound. The four chords change randomly over time, with equally weighted probability for each. In the current implementation there is also an additional option for the user to intentionally change the accompanying chord.
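The chord selection logic lives in Backing.pd, but the equally weighted random choice it performs is equivalent to the following sketch (the enum names are just labels for this illustration):

```cpp
#include <random>

// Equal-probability selection among the four backing chords.
enum Chord { Cmaj9, Dmin9, Emin7, Fmaj9 };

Chord randomChord(std::mt19937 &rng) {
  std::uniform_int_distribution<int> pick(0, 3);  // each chord: p = 1/4
  return static_cast<Chord>(pick(rng));
}
```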

Mapping
The mapping strategy for Bean is generally one-to-one; in practice, however, some of these mappings combine naturally through gestures, which could be described as an extra mapping layer [10]. Considering the intended use of Bean, this strategy was considered a good starting point.
The selection of note in the solo voice is the most discernible aural change. This change is mapped to the pitch angle of the instrument (Figure 6). When the instrument is swiveled downwards on the X-axis the pitches fall, and conversely, when the instrument is swiveled upwards on this axis the pitches rise. The scope of measurable movement is divided into ten bands to accommodate the available notes.
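Assuming the pitch axis is read over the same 70-170 raw range as roll (an assumption; the paper does not state the range for pitch), the ten-band division reduces to integer bucketing:

```cpp
// Map a raw pitch-axis reading to one of the ten available notes.
// Ten equal bands of 10 raw units each over the 70-170 span, clamped
// so that a reading of exactly 170 still lands in the top band.
int noteIndexFromPitch(int rawPitch) {
  if (rawPitch < 70)  rawPitch = 70;
  if (rawPitch > 170) rawPitch = 170;
  int band = (rawPitch - 70) / 10;   // 0..10
  return band > 9 ? 9 : band;
}
```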
 

                                                                                                               
21 ADSR stands for Attack, Decay, Sustain, Release.

Figure 6: The accelerometer control data useful for mapping.

Change in roll is mapped to the aforementioned tonal variation. Rotational movement on the Y-axis to the left effectively gives a more bass-rich sound; this movement is mapped to a reduction in amplitude of the upper partials of the additive synthesis component in the solo voice. The opposite gesture, rotation to the right, produces a strong higher-frequency element in the sound. This is achieved by increasing the amplitude of the three upper partials. The increase is staggered from low to high in order to give a smooth timbral alteration. While moving Bean, it is envisaged that these movements, swivel up/down and rotate left/right, will become elements of dynamic gestures by the user. In informal observations, swivel and rotation movements alone were the exception rather than the rule. Dynamic gestures essentially create a combined mapping level, which could provide a sense of combined causality, with continuous control of both pitch and timbre. Ideally, just one method of control would be perceived, affecting multiple parameters.
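One way to realise the staggered brightening just described is to let each of the three upper partials fade in over a successively later slice of the normalised roll range. The onset points below are illustrative, not taken from soloSynth.pd:

```cpp
// Staggered gains for the three upper partials as a function of the
// normalised roll value (0 = full left, 1 = full right). Each partial
// fades in over a later slice of the range, so the timbre brightens
// smoothly from low to high.
void partialGains(float roll, float gains[3]) {
  const float onsets[3] = {0.0f, 0.33f, 0.66f};
  for (int i = 0; i < 3; i++) {
    float g = (roll - onsets[i]) / (1.0f - onsets[i]);
    gains[i] = g < 0.0f ? 0.0f : (g > 1.0f ? 1.0f : g);
  }
}
```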
The jolt gesture in the Z-axis currently functions as a trigger which, when activated, switches the scale from C pentatonic to the A blues scale. This change of scale automatically resets after 30 seconds if not re-triggered.
The two buttons can also have a major effect on the aural feedback. The button situated on the right of the instrument is assigned as a play button. When this button is pressed, the attack, decay and sustain stages of the envelope are engaged and the solo voice plays. When the button is released, the release stage of the envelope engages to taper off the amplitude. The initiation of sound is an intrinsic element of every instrument. Rather than a higher-level continuous control model, where the user would interact with a pre-existing sound framework, Bean is designed around the creative model discussed in [9]. The clear causal relationship between pressing the button and the solo voice creating sound should reinforce a sense of agency [12].
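The envelope behaviour of the play button can be summarised as a small state machine. The stage times and sustain level below are placeholders, not the values used in soloSynth.pd:

```cpp
// Linear ADSR gate: pressing the play button runs attack -> decay ->
// sustain; releasing it enters the release stage and tapers to zero.
struct Adsr {
  enum Stage { Idle, Attack, Decay, Sustain, Release } stage = Idle;
  float a = 0.01f, d = 0.10f, r = 0.30f;  // stage times in seconds
  float s = 0.7f;                         // sustain level
  float level = 0.0f;

  void buttonDown() { stage = Attack; }
  void buttonUp()   { stage = Release; }

  // Advance by dt seconds and return the current amplitude.
  float tick(float dt) {
    switch (stage) {
      case Attack:
        level += dt / a;
        if (level >= 1.0f) { level = 1.0f; stage = Decay; }
        break;
      case Decay:
        level -= dt * (1.0f - s) / d;
        if (level <= s) { level = s; stage = Sustain; }
        break;
      case Release:
        level -= dt / r;
        if (level <= 0.0f) { level = 0.0f; stage = Idle; }
        break;
      default: break;  // Idle and Sustain hold their level
    }
    return level;
  }
};
```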
In the current implementation, pressing the button on the left side of the instrument triggers a random change of chord in the harmonic accompaniment. Due to the stochastic nature of chord selection, the harmony could theoretically remain unchanged for 15 seconds or more, possibly leading to a sense of harmonic stagnancy. With this in mind, a method for manually changing the accompanying chord was introduced.

Visual Feedback
The LED installed inside the physical element of the instrument provides primary visual feedback. This feedback is mirrored by secondary visual feedback: a 3D virtual representation of Bean. The reason for this representation is to help reinforce the causal connection between physical movement and perceptual change in visual and aural feedback. Through iterative development, the functionality of this feedback has been revised and extended. Colour-to-note mapping was implemented to provide a form of visual cueing. To facilitate this virtual representation, an application was created using Processing22, a Java-based software development environment.
Visualbean
The Visualbean application facilitates both visual feedback and the throughput of data from the Teensy to Pure Data. The serial data from the Teensy is received by Visualbean, transformed into OSC data, and sent on to Pure Data.
The visual representation is a mirroring 3D ellipsoid, e.g. when the instrument is tilted on the X-axis, the image tilts on its X-axis. As mentioned earlier, colour-to-tone mapping is implemented in the physical part of the instrument. This feature is also mirrored in Visualbean: the ellipsoid changes colour to match the internal LED, representing the selected note (Figure 7). A possible method for assigning colour to musical tone would have been to study examples of chromesthesia23. These examples, however, tend to be subjective in nature, and colour combinations vary from person to person [13]. It was decided that a more logical and universal approach would be to transpose the equal temperament frequencies of the selected notes from the audible range into the visible frequency range. This method was used to assign colours to notes in the Bean system24. The resulting colour/note combinations can be seen in Table 1. The OSC messages received back from Pure Data control the colour of the virtual representation. Each message is a number between 0 and 9 corresponding to the currently selected note, which in turn is mapped to the assigned colour.
 

                                                                                                               
22 https://www.processing.org/
23 Chromesthesia is the most common form of synesthesia: hearing sound induces the visual sensation of colour [13].
24 http://www.lunarplanner.com/Harmonics/planetary-harmonics.html
 
Figure 7: Colour change in the virtual representation of Bean for the notes C, E and G in the higher octave.

Table 1: The colour-to-note combinations for the C pentatonic scale.

Note:    C      D     E       G    A
Colour:  Green  Blue  Violet  Red  Orange
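The transposition idea can be made concrete: doubling a note's frequency (i.e. transposing it up by octaves) roughly forty times lands it in the visible band, after which a wavelength can be read off. The sketch below uses coarse, illustrative wavelength bands, since the exact bands Bean uses are not given in the text, yet it reproduces the assignments in Table 1:

```cpp
// Transpose a note's frequency up by octaves into the visible band
// (roughly 400-790 THz), convert to wavelength, and name a colour.
// The band boundaries are illustrative.
const char* colourForNote(double hz) {
  double f = hz;
  while (f < 4.0e14) f *= 2.0;          // octave-double into visible light
  double nm = 2.99792458e8 / f * 1e9;   // wavelength in nanometres
  if (nm > 620) return "red";
  if (nm > 590) return "orange";
  if (nm > 565) return "yellow";
  if (nm > 500) return "green";
  if (nm > 450) return "blue";
  return "violet";
}
```

For example, A at 440 Hz doubled 40 times gives about 484 THz, roughly 620 nm, which falls in the orange band, consistent with Table 1.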

Informal Evaluation
An initial evaluation of the Bean system was carried out, in two sessions over two days.

Session 1: Clients

The first session took place in DORAS, a Cope Foundation25 adult training center in Cork city, Ireland. Two service users of the Cope Foundation (Participants A and B), along with a member of staff, agreed to take part in an informal evaluation and participatory design session. Both clients are male, were in their early twenties, and have mild/borderline intellectual disabilities. This was an informal setting, not therapeutic in nature. Nevertheless, it was a valuable opportunity to make an initial assessment of the instrument with a prospective target group, with a view to gathering information for further development.
The evaluation took between 30 and 35 minutes, with both participants in the room simultaneously. The meeting took the following form: the first 20 minutes were spent with the two participants taking turns in free play with the instrument, without any instruction. After this, there was a short discussion about the device, to gauge the participants' impressions and level of understanding. The session then continued with the participants and the staff member engaged in more free-play turn taking.

                                                                                                               
25 Cope Foundation is a non-profit organization which supports approximately 2000 people with intellectual disabilities in Cork city. http://www.cope-foundation.ie/


The prototype used for the evaluation was an earlier, less developed iteration. There was no outer covering and no internal LED, and the visual representation in the Visualbean app did not change colour when notes were changed. The aural feedback was, however, fully implemented at that stage.

Free play
Participant A was initially hesitant in using Bean. His interaction was exploratory, starting with simply moving the instrument in space and registering that the representation on screen mirrored the physical movements. Shortly after, he pressed the buttons, with resulting surprise when the solo voice engaged. Participant B was more direct, engaging the play button immediately. This was to be expected, as he had seen the first participant's use of the device. His gestures were slow and deliberate at the start, but quickly changed to moving the device more aggressively.

User Impressions
An open discussion followed the free play, in which some open questions were posed: What are your first impressions? Did you understand the control functionality? Was it interesting to use? How would you change/improve it?

What are your first impressions?
Both participants were positive about the instrument: it was different, but fun. Whether this fun factor arose because the technology is new, or because making music was facilitated in a new way, was unclear. Both were nevertheless eager to try the interface again.

Did you understand the control functionality?
Both participants understood that movement affected the sound, and that the play button had to be pressed to solo. The change chord button, however, was a mystery. Participant B triggered the jolt-controlled A blues scale; neither participant noticed the change in scale.

Was it interesting to use?
Both participants were positive about using the device. When asked, in connection with interest, whether they could see themselves using the instrument for a sustained time, both answered yes. As with the first question, it is unclear whether the opportunity to play music or the opportunity to play with new technology was the deciding factor.

How would you change/improve it?
Both participants agreed that a cover for the surface of the device would be a good idea. Participant A also felt that the device could be used for other purposes relating to computer control. The member of staff was likewise of the opinion that the device was very flexible and could serve other purposes.

Free play continued

After the discussion, the participants got another opportunity to play the instrument. During both free play sessions, contrasting styles of use could be observed. Participant A continued with a more methodical style, actively searching for certain notes and evaluating the sound changes. In contrast, Participant B (Figure 8) was more interested in moving the device as fast as possible, less selective about which notes he played, producing fast runs up and down the scales. For this participant, the movement of the virtual representation was possibly of more interest than the sound of the instrument. The member of staff helping with the evaluation also had the opportunity to play the instrument at this time. He put forward the opinion that the device could be very beneficial in a group music therapy setting.

 
Figure 8: Participant B playing Bean during the evaluation session.

Session 2: Therapists

There was also an opportunity to talk with a practicing music therapist and an art therapist. This too was informal, but it was an opportunity to get feedback from those who might use this tool in a therapeutic setting. The music therapist is an experienced musician and uses an improvisational approach to music therapy. He has some experience with the use of Soundbeam, but aside from that, limited experience of technology use in therapy. The art therapist also had limited experience with tangible technologies in therapy.
Both therapists had the opportunity to play the instrument. The music therapist was the first to use the interface, and immediately wanted more methods of control. The top of Bean, where his thumbs naturally rested in use, could be an optional placement for more buttons, he suggested. He also felt that the aural feedback lacked a rhythmic element or a "beat".
After considering the device's current state, he felt that the prototype could easily be destroyed by some of his users: if they became frustrated, the finger-sized gaps in the outer structure would provide a grip to pull the device apart.
The art therapist was positive about the applications a device like Bean could have in an art therapy setting if the visual feedback were more flexible, perhaps enabling drawing or other interesting visual effects; in effect, translating the currently implemented visual-cue-based feedback into a more visually creative virtual canvas.
Discussion/Conclusion
Although no empirical evidence was gathered during the two evaluation sessions, valuable information was obtained. This information has steered the further development of the instrument to its current state.
Observing the two participants' free play sessions suggested two interesting paths of development: refine the musical control of the instrument, which would likely improve the user experience in the case of Participant A; or, alternatively, promote the kinetic aspects of the instrument, which were a feature of Participant B's interaction. Given that music is the primary tool in music therapy, it was logical to focus on developing playability first. Participant A seemed very conscious of choosing individual notes, and of listening while manipulating the sound. After observing this, it was decided that some form of visual cueing would aid note selection. An idiosyncratic issue with Bean is that when the play button is not pressed, it is difficult to know which note is currently selected. Through practice, kinesthetic memory could possibly be built up to match position to note, but this is not a practical solution for music therapy. Therefore, the colour-to-note visual cue mentioned earlier was introduced. The LED in the device is activated only when the play button is pushed, but because the virtual representation matches the instrument's colour, the user can track the currently selected note even when the play button is not pressed, opening the possibility of conscious note selection.
Another suggestion, which both participants as well as the music therapist mentioned, was a surface cover. Although the press-fit structure is constructed from relatively fragile material, the outer covering does provide a certain amount of stability. Finger access is now restricted, along with the possible temptation to heave the device apart.
The implementation of extra control options, also mentioned by the music therapist, would have both positive and negative consequences. Of course, an extra dimension of control would increase the expressive possibilities of the instrument. This comes nonetheless with a caveat: the balance between control options and usability must be carefully maintained. Users with complex needs could have trouble conceptually managing more control options. The style of control implemented in Bean is purposefully minimalistic, with a focus on transparency. With this minimalistic style comes the risk of too little control content to maintain interest. This was not evident in the initial evaluation session in DORAS; both participants seemed engaged while using the device. A larger-scale formal evaluation would be needed to give more conclusive results on this problem, but the initial results are promising.
During the session in DORAS, when talking to the two participants, the fact that Bean was a new device using up to date technologies was clearly a positive influence. The participants were interested, one could even say motivated, by that fact alone, before interaction even took place. This adds weight to the claim that more technology use in music therapy could have positive effects, at least for a young male demographic similar to our participants, and possibly not exclusively for this demographic.
The rhythmic element suggested by the music therapist is interesting, and makes sense. Music is essentially a cultural practice [14], and in order to use music effectively as a tool, the musical content might have a bigger impact if it were socially relevant to the client. In some forms of contemporary music26 the "beat" could be seen as more important than the harmonic content. This suggestion is certainly food for thought going forward, and outlines a possible deficiency in the current musical content of Bean.
The visual interactivity changes proposed by the art therapist were interesting, and undoubtedly an avenue of development for a broader base of therapy options. However, after some preliminary attempts to implement basic visual art with Bean, it was decided to focus on music as the main therapeutic tool. Visual feedback is used solely as an aid in facilitating the effective use of this tool.
 
In conclusion, this section revisits the design questions raised in the introduction. The answers, if any, found during the initial evaluation process are described, while also outlining the deficiencies in the system.
Can clients easily understand that the musical contribution is of their making?
Bean has a very clear cause and effect element in the play button, and judging by the surprised reactions when the button was first engaged in the evaluation session, the participants did primarily understand, and feel agency over, the sound produced. In the case of Participant A, conscious movements exploring different notes demonstrated an understanding, at least partially, that his actions produced change both aurally and visually. Participant B initially showed conscious note selection, but it became clear over time that the visual representation held more interest, and the aural feedback became secondary; just a side effect of the movement. Neither participant recognized the full extent of the influence their actions had over the musical content, e.g. the chord change.
Is the control of these contributions intuitive and understandable?
There are mixed results in relation to the mapping strategy. In general, through observation and the subsequent discussion, both participants understood that their movement created change in the solo voice while the play button was pressed. Neither participant understood the change chord button. It may be that this change was too subtle, not helped by the chords already changing autonomously. It appears, at least at this point, that stagnancy in the harmony is not as great a problem as envisaged. This button could possibly be put to better use with another, more directly expressive, control change. The jolt scale change also seemed too subtle in implementation to be directly understood. The original idea with the jolt implementation was to provide extra note options to choose from. Perhaps other scales could be tried, or another use for this control mode would be advisable.
Is the experience of using these technologies engaging, with enough variance to hold interest?
As mentioned, the participants were observed to be interested in the up to date technology integral to the device, even before playing Bean. This could be a biasing factor in judging the functionality and effectiveness of the device. If culturally relevant music is preferable as a tool, then technology which sparks interest independently could also be seen as a good starting point for the development of an aid to music therapy. There were also clear signs during this evaluation that this was not solely a technology-for-technology's-sake scenario: there appeared to be clearly conscious perception27 of action, and engaged use of Bean.

26 In Drum and Bass music, for instance, the beat is a very prominent element.
This paper has outlined a vital initial step in the design and further development of a digital musical instrument, Bean, which is primarily designed for use as a novel tool in the arsenal of the music therapist. Research pertaining to the fields of music therapy practice, DMI/NIME design and human-computer interaction has guided the process. An initial informal evaluation of a functioning prototype by a possible target group and by professionals in the field has proved informative for the further development of Bean.

Future Plans
It is clear that much work is still needed on some aspects of the system, but it is safe to say there is a firm foundation to build on. The developments carried out since the evaluation have improved the device structurally, and the hope is that the instrument now has better playability following the introduction of visual cueing. Some aspects of the mapping strategy will also be reviewed, such as the change chord option. This could be changed to an option allowing extended range, similar to what some small MIDI keyboard controllers offer. The jolt option could also be revised, perhaps to apply a dynamic audio effect, such as a phaser, for a predetermined amount of time.
To provide more flexibility in sound choice, and a protocol familiar to music therapists, MIDI messaging could be implemented. The proliferation of MIDI device use in music therapy suggests it would be preferable to have some MIDI functionality integrated into the system. The Bean.pd patch could be developed further to facilitate flexibility with regard to MIDI communication.
There are plans to replicate the Bean system, in order to enable musically collaborative therapeutic group work. A larger-scale, more formal evaluation would, however, be the next step, to gather empirical data on how Bean would perform in a therapeutic setting; this initial evaluation was a vital step in preparing the instrument optimally for that test.

Acknowledgements
Many thanks go to the service users and staff members of the Cope Foundation for facilitating and participating in this evaluation. Thanks also to therapists Eoin Nash and Ed Kuczaj, who generously offered their professional opinions on Bean, and to Cumhur Erkut for valuable advice and guidance throughout the project. Last but not least, thanks go to my lovely wife, Louise, for bringing her considerable scientific writing knowledge to bear in a time of need.

                                                                                                               
27 Perception is used here to describe both audio and visual sensing in the use of Bean.
 

References
[1] K. E. Bruscia, Defining Music Therapy. Barcelona Publishers, 1998, p. 300.

[2] W. L. Magee and K. Burland, "An Exploratory Study of the Use of Electronic Music Technologies in Clinical Music Therapy," Nord. J. Music Ther., vol. 17, no. 2, pp. 124–141, Jul. 2008.

[3] K. Burland and W. Magee, "Developing identities using music technology in therapeutic settings," Psychol. Music, vol. 42, no. 2, pp. 177–189, Nov. 2012.

[4] M. Wanderley and B. Vines, "The musical significance of clarinetists' ancillary gestures: an exploration of the field," J. New Music Res., 2005.

[5] A. Hunt, R. Kirk, and M. Neighbour, "Multiple media interfaces for music therapy," IEEE Multimed., 2004.

[6] J. Malloch and M. Wanderley, "The T-Stick: From musical interface to musical instrument," in Proc. 7th Int. Conf. New Interfaces for Musical Expression (NIME), 2007.

[7] B. Farrimond, D. Gillard, D. Bott, and D. Lonie, "Engagement with Technology in Special Educational & Disabled Music Settings," Youth Music, 2011.

[8] N. D. Hahna, S. Hadley, V. H. Miller, and M. Bonaventura, "Music technology usage in music therapy: A survey of practice," Arts Psychother., vol. 39, no. 5, pp. 456–464, Nov. 2012.

[9] G. Paine and J. Drummond, "Developing an Ontology of New Interfaces for Realtime Electronic Music Performance," in Proc. Electroacoustic Music Studies Conf., 2009.

[10] A. Hunt, M. M. Wanderley, and M. Paradis, "The Importance of Parameter Mapping in Electronic Instrument Design," J. New Music Res., vol. 32, no. 4, pp. 429–440, Dec. 2003.

[11] S. Fels and M. Lyons, "NIME Primer," NIME 2011 Tutorial, University of British Columbia, 2011.

[12] P. Wyeth, "Agency, tangible technology and young children," in Proc. 6th Int. Conf. Interaction Design and Children (IDC '07), pp. 101–104, 2007.

[13] G. Rogers, "Four cases of pitch-specific chromesthesia in trained musicians with absolute pitch," Psychol. Music, 1987.

[14] A. Tanaka, "Interaction, experience and the future of music," in Consuming Music Together, K. O'Hara and B. Brown, Eds. Springer, 2006.

[15] J. Solis and K. Ng, "Input Devices and Music Interaction," in Musical Robots and Interactive Multimodal Systems, J. Solis and K. Ng, Eds. Springer, 2011.
