The Importance of Parameter Mapping in Electronic Instrument Design

Andy Hunt, Marcelo M. Wanderley & Matthew Paradis

Journal of New Music Research, 2003, Vol. 32, No. 4, pp. 429–440
Abstract
This paper presents a review of a series of experiments which have contributed towards the understanding of the mapping layer in electronic instruments. It challenges the assumption that an electronic instrument consists solely of an interface and a sound generator. It emphasises the importance of the mapping between input parameters and sound parameters, and suggests that this can define the very essence of an instrument. The terms involved with mapping are defined, and existing literature reviewed and summarised. A model for understanding the design of such mapping strategies for electronic instruments is put forward, along with a roadmap of ongoing research focussing on the testing and evaluation of such mapping strategies.

1. Introduction: Electronic instruments and the mapping layer

In an acoustic instrument, the playing interface is inherently bound up with the sound source. A violin's string is both part of the control mechanism and the sound generator. Since they are inseparable, the connections between the two are complex, subtle and determined by physical laws. With electronic and computer instruments, the situation is dramatically different. The interface is usually a completely separate piece of equipment from the sound source. This means that the relationship between them has to be defined (see Fig. 1). The art of connecting these two, traditionally inseparable, components of a real-time musical system (an art known as mapping) is not trivial. Indeed this paper hopes to stress that by altering the mapping, even keeping the interface and sound source constant, the entire character of the instrument is changed. Moreover, the emotional response elicited from the performer is shown to be determined to a great degree by the mapping. Whereas the input devices establish the physicality of the system and the synthesis methods govern the sound quality, the mapping somehow affects how the player reacts psychologically and musically to the instrument.

2. The importance of mapping

In this section we emphasise the dramatic effect that the style of mapping can have on "bringing an interface to life". We focus on our own experience in designing digital musical instruments and comment on several previous designs. An extensive review of the available literature on mapping in computer music has been presented by the authors (Hunt, 2000; Wanderley, 2001; Hunt & Wanderley, 2002). A special issue of the journal Organised Sound has also recently been devoted to mapping strategies in computer music (Wanderley, 2002) and we refer the reader to it for a varied review of different approaches to mapping.

2.1 Informal observations

The first author has carried out a number of experiments into mapping. The more formal of these have been presented in detail (Hunt & Kirk, 2000; Hunt, 2000), and are summarised later in this paper. We begin with some rather simpler, previously unpublished, observations that originally sparked interest in this subject. We have retained the first-person writing style to denote that these are informal, personal reflections.
Fig. 1. A generic model of the mapping problem.
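To make Figure 1 concrete, the following minimal sketch (in Python; every name, range and scaling here is our own invention rather than anything prescribed by the model) treats the mapping as an explicit, swappable layer between interface values and synthesis parameters:

    # The mapping is a replaceable function between the input device and
    # the sound source; exchanging it changes the instrument's character
    # even though the interface and the synthesiser stay the same.
    def one_to_one(controls):
        # slider 0 -> amplitude, slider 1 -> frequency (hypothetical ranges)
        return {"amp": controls[0], "freq": 200.0 + 800.0 * controls[1]}

    class Instrument:
        def __init__(self, mapping):
            self.mapping = mapping          # the mapping layer of Fig. 1

        def update(self, controls):
            # a real system would pass these values on to an oscillator
            return self.mapping(controls)

    print(Instrument(one_to_one).update([0.5, 0.25]))  # {'amp': 0.5, 'freq': 400.0}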
2.1.1 The accidental theremin

Several years ago, I was invited to test out some final university projects in their prototype form in the lab. One of them was a recreation of a Theremin with modern electronic circuitry. What was particularly unusual about this was that a wiring mistake by a student meant that the "volume" antenna only worked when your hand was moving. In other words the sound was only heard when there was a rate-of-change of position, rather than the traditional position-only control. It was unexpectedly exciting to play. The volume hand needed to keep moving back and forth, rather like bowing an invisible violin. I noted the effect that this had on myself and the other impromptu players in the room. Because of the need to keep moving, it felt as if your own energy was directly responsible for the sound. When you stopped, it stopped. The subtleties of the bowing movement gave a complex texture to the amplitude. We were "hooked." It took rather a long time to prise each person away from the instrument, as it was so engaging. I returned a week later and noted with disappointment that the "mistake" had been corrected, deleted from the student's notes, and the traditional form of the instrument implemented.

2.1.2 Two sliders and two sound parameters

The above observation caused me to think about the psychological effect on the human player of "engagement" with an instrument. To investigate this further, I constructed a simple experiment. The interface for this experiment consisted of two sliders on a MIDI module, and the sound source was a single oscillator with amplitude and frequency controls. In the first run of the experiment the mapping was simply one-to-one, i.e., one slider directly controlled the volume, and the other directly controlled the pitch (Fig. 2).

I let several test subjects play freely with the instrument, and talked to them afterwards. In the second experimental run, the interface was re-configured to emulate the above-mentioned "accidental Theremin." One slider needed to be moved in order to make sound; the rate of change of movement controlled the oscillator's amplitude. But I decided to complicate matters (on purpose!) to study the effect that this had on the users. The pitch, which was mainly controlled by the first slider, operated "upside-down" to most people's expectations (i.e., pushing the slider up lowered the pitch). In addition the second slider (being moved for amplitude control) was used to mildly offset the pitch – i.e., it was cross-coupled to the first slider (Fig. 3).

Fig. 3. Complex mapping for Experiment 2: the first slider is inverted before controlling the oscillator's frequency; the rate of change of the second slider controls the amplitude, and the second slider's value is also scaled and summed into the frequency.

A remarkable consistency of reaction was noted over the six volunteers who tried both configurations. With Experiment 1, they all commented within seconds that they had discovered how the instrument worked (almost like giving it a mental "tick"; "yes, this is volume, and this is pitch"). They half-heartedly tried to play something for a maximum of two minutes, before declaring that they had "finished." Problem solved.

With Experiment 2, again there was a noted consistency of response. At first there were grumbles. "What on earth is this doing?" "Hey – this is affecting the pitch" (implied cries of "unfair," "foul play"). But they all struggled with it – interestingly for several more minutes than the total time they spent on Experiment 1. After a while, their bodies started to move, as they developed ways of offsetting one slider against …
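In code, the difference between the two runs might be sketched as follows. This is our reconstruction from the description and Figure 3; the offset scale, the update interval and the normalised slider ranges are all assumed for illustration:

    # Experiment 1 (Fig. 2): direct one-to-one mapping.
    def mapping_expt1(s1, s2):
        return {"pitch": s1, "amp": s2}

    # Experiment 2 (Fig. 3): pitch is inverted and cross-coupled to the
    # second slider; amplitude follows the *rate of change* of that
    # slider, so sound is only made while the hand keeps moving.
    def make_mapping_expt2(offset_scale=0.2, dt=0.01):
        state = {"last_s2": None}
        def mapping(s1, s2):
            last = state["last_s2"]
            rate = 0.0 if last is None else abs(s2 - last) / dt
            state["last_s2"] = s2
            return {"pitch": (1.0 - s1) + offset_scale * s2,  # inverted + offset
                    "amp": min(1.0, rate)}                    # movement -> sound
        return mapping

    m = make_mapping_expt2()
    print(m(0.3, 0.50))  # no previous position yet: amp is 0.0
    print(m(0.3, 0.55))  # slider 2 has moved: amp rises, pitch shifts slightly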
[Figures: the interfaces compared in the study – on-screen sliders, one for each parameter, operated with the mouse; physical sliders on a MIDI fader box; and the mouse cursor on a blank screen – together with the score used in the tests.]
…parameter simultaneously. Surprisingly perhaps, they were nearly all extremely frustrated by the four physical sliders. Comments abounded such as "I should be able to do this, technically, but I can't get my mind to split up the sound into these four finger controls." Some users actually got quite angry with the interface and with themselves. The multiparametric interface, on the other hand, was warmly received – although not at the very beginning. At first it seemed a difficult interface for most users, but they rapidly warmed to the fact that they could use complex gestural motions to control several simultaneous parameters without having to "de-code" them into individual streams. Many users remarked how "like an instrument" it was, or how "expressive" they felt they could be with it, but that it needed practice.

3.2 Focusing purely on the effect of mapping

…users' (subjective) views on the comparative expressivity and learnability of each mapping, and the quality of musical control that could be achieved.

Whilst users were experimenting with these instruments an interesting trend was observed. The first (one-to-one) test did not seem to hold the users' attention. They treated it as a "task" to complete rather than an opportunity to experiment with a musical instrument. The second test received a more favourable response. The third test seemed to inspire the users to experiment and explore the interface to the point of composing and performing their own short melodies and musical gestures.

Users were asked to fill in a questionnaire relating to their feelings towards each experiment. The questionnaire asked each user to score the instruments in different areas such as: "How easy did you find it to control individual parameters?"
Fig. 10. The three mappings used in the clarinet simulation presented in Rovan (1997).
…compared to acoustic instruments such as the clarinet or saxophone. A common path to solving this problem involves improving the design of the controller by adding extra sensors. Rovan et al. (1997) discussed the fact that by altering the mapping layer in a digital musical instrument and keeping the interface (an off-the-shelf MIDI controller) and sound source unchanged, the essential quality of the instrument is changed regarding its control and expressive capabilities. Previous studies, notably by Buxton (1986a), presented evidence that input devices with similar characteristics (e.g., number of degrees of freedom) could lead to very different application situations depending on the way these characteristics were arranged in the device. In that study, however, the devices were not the same mechanically (one had two separate one-dimensional controllers and the other one two-dimensional controller), so the situation is different to using exactly the same input devices with different mapping strategies.

Another point became clear in this process: even if the WX7 was a reasonably faithful model of a saxophone providing the same types of control variables (breath, lip pressure and fingering), these variables worked totally independently in the MIDI controller, whereas they are cross-coupled in acoustic single-reed instruments. This natural cross-coupling is the result of the physical behaviour of the reed; the variables were simply independent since the equivalent "reed" in the controller was a plastic piece that did not vibrate and, moreover, was not coupled to an air column.

Based on these decisions and facts, the authors proposed different mappings between the WX7 variables and the synthesis parameters. The first was basically a one-to-one relationship, where variables were independent. The second was a model where the "virtual airflow" through the reed (loudness) was a function of both the breath and lip pressure (embouchure), as in an acoustic instrument. The third was a model that took into account both the "virtual airflow" and the relationship of spectrum content to breath and embouchure; a model that would match even more closely the real behaviour of the acoustic instrument.

The two most important consequences of this work were:
1. By just changing the mapping layer between the controller and the synthesis algorithm, it was indeed possible to completely change the instrumental behaviour and thus the instrument's feel to the performer. Depending on the performer's previous experience and expectations, different mappings were preferred.
2. By deconstructing the way that the reed actually works, it was noted that the choice of mapping could be important as a pedagogical variable. Indeed, in stark contrast with acoustic instruments where the dependencies between parameters are unchangeable, cross-coupling between variables can easily be created or destroyed in digital musical instruments. This means that performers could focus on specific aspects of the instrument by explicitly defining its behaviour.
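The contrast between the first two of these mappings might be sketched as follows (the combining formula is an illustrative stand-in, not the function used by Rovan et al., and all variables are assumed normalised to the range 0..1):

    # First mapping: the WX7 variables stay independent (one-to-one).
    def mapping_one_to_one(breath, lip, fingering):
        return {"loudness": breath, "brightness": lip, "pitch": fingering}

    # Second mapping: a "virtual airflow" makes loudness a function of
    # both breath and embouchure, imitating the cross-coupling that a
    # physical reed imposes.  The formula below is an invented placeholder.
    def mapping_virtual_airflow(breath, lip, fingering):
        airflow = breath * (1.0 - 0.5 * lip)
        return {"loudness": airflow, "brightness": lip, "pitch": fingering}

    for m in (mapping_one_to_one, mapping_virtual_airflow):
        print(m.__name__, m(0.8, 0.4, 0.5))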
4. Analysis of experiments

In this section we set the results of the above music-interface experiments in the context of work by other authors and from mainstream Human Computer Interaction literature.

The experiments in section 3.1 reveal results which are at once challenging and yet understandable. Experience with acoustic musical instruments, and their in-built complex cross-couplings of control parameters, would tend to support our results that the complex, multi-parametric interface performed significantly better. Our natural expectations of having to practice hard to play an instrument are reflected in the multiparametric interface's slow start; it takes time to get used to a complex mapping. However, it is notable that nobody questions the need to practice an instrument, yet in the same breath everybody demands "easy-to-use" computer interfaces. Much of the HCI literature concentrates on making interfaces more direct, and less hard to learn. At first glance this would seem a laudable goal, even an obvious one. But are we perhaps throwing out the proverbial baby with the bathwater (in this case the bathwater represents interface complexity and learning time, and the baby represents the rewards of a flexible interface)?

So, why should a complex and cross-coupled interface completely outperform a more standard sliders-only one? Most users of the multiparametric interface describe a moment when they "stopped thinking" about the interface and began to "just play" with it – almost as if their body was doing it for them, and their conscious mind was somehow "not in control." This is remarkably close to what acoustic musicians describe as the "flow" experience. In a fascinating web-site for musicians, violinist and psychologist A. Burzik (2002) describes the four principles of practicing in flow as:

1. A special physical contact with the instrument;
2. The development of a subtle feeling for sound;
3. A feeling of effortlessness in the body;
4. A playful and free-spirited handling of the material studied.

"Flow" is the term coined by M. Csikszentmihalyi (1975) to describe a holistic state of consciousness where actions happen continuously, accompanied by sharpened senses acting in a unified manner. It is also associated with feelings of success and being uplifted – which certainly concurs with the experiences of the users of the multiparametric interface.

This contrasts greatly with the anger and frustration experienced by users of the parallel sliders interface. This may come as a surprise to some, especially considering that such banks of physical sliders are known the world over as the basis of the mixing desk. Fitzmaurice et al. (1997) conclude that slider banks (and other space-multiplexed graspable interfaces) outperform time-multiplexed interfaces, such as the mouse. Does our work contradict their findings? No, but it does qualify them. One possible interpretation of Fitzmaurice et al. is that we should abandon the mouse and provide separate graspable interfaces, but our results show this is not necessarily true. Where a mouse is used to time-multiplex (e.g., move sequentially between different on-screen areas such as icons and menus) then it will be less efficient than an interface where the same parameters are physically graspable. Our work supports this: for the simplest tests, the physical sliders outperform the time-multiplexed "mouse" interface. However, in our final interface the mouse is used in a much more creative way – to provide continuous multiparametric control. Users of the sliders interface had to mentally split down the holistic sound into its four sonic components before being able to derive the motor controls for each individual slider. Users were happy to do this when only one parameter changed. For two simultaneous parameters it was much more difficult, but still just possible. With three and four parameters most people simply gave up, or tried until they were angry with frustration. This would seem to imply that space-multiplexed systems begin to fail when more than two parameters need to be changed simultaneously.

Somehow the multiparametric interface is aiding the users to cope with handling simultaneous sonic variables. The work by Jacob et al. (1994) on Integrality and Separability of input devices sheds light on how this may be happening. Jacob's work shows that an interface system works best where the structure of the input devices matches the perceptual structure of the task being performed by the user. In other words it is not just about which device is used (e.g., a mouse) but how that device is mapped onto the system parameters, and how well that helps the users to think about the task. In our case the users were hearing a holistic sound, yet the sliders interface forced them to decode the sound into four separate control streams – a very hard and frustrating task. In contrast the multiparametric interface allowed users to think spatially about the simultaneous control of pitch, timbre and volume. Users reported listening to a sound and "just knowing what shape that was" in terms of their required movements on the interface.

A further contribution to the success of the multiparametric interface could be the way in which the user's two hands are used. Kabbash et al. (1994) describe how a well-designed bimanual input can greatly outperform a single-handed control, but one hand is usually the dominant control …
…in the other sense: for the same controller and the same set of parameters, multiple synthesis techniques could be used by just adapting the second mapping layer, the first being held constant. Specifically in this case, the choice of synthesis algorithm is transparent to the user.

The two-layered model has been expanded to include three mapping layers (Hunt & Wanderley, 2002; Arfib et al., 2002). Figure 12 depicts a model containing these three mapping layers.

Fig. 12. A multi-layer mapping model, relating controller parameters to synthesis parameters via intermediate "meaningful parameters".

These works support the idea that, by using multi-layered mappings, one can obtain a level of flexibility in the design of instruments and, moreover, that these models can indeed accommodate the control of different media, such as sound and video, in a coherent way. Section 5.1 describes in more detail the justification for a multi-layer mapping model.
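A sketch of the two-layer idea (all names and scalings below are invented; see Hunt & Wanderley, 2002, and Arfib et al., 2002, for the models themselves) shows how only the engine-specific layer changes when the synthesis technique is swapped:

    # Layer 1: controller values -> perceptually "meaningful" parameters.
    def layer1(controls):
        return {"loudness": controls[0], "brightness": controls[1]}

    # Layer 2 (engine-specific): meaningful parameters -> synthesis
    # parameters.  Swapping only this layer changes the synthesis
    # technique transparently; layer 1 is held constant.
    def to_additive(p):
        return {"gain": p["loudness"], "n_partials": int(1 + 15 * p["brightness"])}

    def to_subtractive(p):
        return {"gain": p["loudness"], "cutoff_hz": 200.0 + 8000.0 * p["brightness"]}

    meaningful = layer1([0.7, 0.3])
    print(to_additive(meaningful))      # same gesture, additive engine
    print(to_subtractive(meaningful))   # same gesture, subtractive engine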
…requirements for the construction of such mappings by answering the following questions.

How do we define the mapping layer?

In order to fully understand the workings of the mapping layer it must firstly be fully defined. This will give researchers a clear structure and vocabulary with which to refer to elements of a system. Current research concentrates on the mapping layer as a high-level entity. Strategies are defined (one-to-one etc.) along with layers which can carry out specific functions. It is proposed that a mapping layer should be studied at a much lower level in order to fully understand the implications of a given parameter configuration.

What effect does a specific treatment of parameters …

…collection and analysis of data over time relating to the accuracy, speed, expressive control and suitability of interfaces for specific tasks.

How can we measure the performance of a mapping strategy?

How can the subjective data taken from a user's experience of an interface be related to actual performance data (data that can be recorded whilst a user is carrying out a specific task)? Can trends be observed in objective data that represent expressivity, or the enjoyment that a user experiences in their interaction?

In all interface design the end result is ultimately a user experience. Mapping strategies can provide a tailored experience dedicated to the task in hand, rather than using a global parameter association as can often be found with devices …
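One way such objective data might be captured is sketched below; the record layout, the error measure and the example task are all invented, since the paper does not specify any logging format:

    # Record timestamped control events so that accuracy and speed can
    # later be computed against a target gesture.
    def log_session(events, target):
        """events: list of (time, value); target: function time -> value."""
        records = []
        for t, value in events:
            error = abs(value - target(t))     # instantaneous accuracy
            records.append({"t": t, "value": value, "error": error})
        mean_error = sum(r["error"] for r in records) / len(records)
        return records, mean_error

    # Example task: track a ramp from 0 to 1 over ten seconds.
    target = lambda t: min(1.0, t / 10.0)
    events = [(0.0, 0.05), (2.0, 0.28), (4.0, 0.35)]
    records, mean_error = log_session(events, target)
    print(round(mean_error, 3))                # 0.06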
…interface effectiveness. We experimented with a multi-layer model to help designers to implement complex but usable mappings, and are carrying out a programme of research to look into the testing and evaluation of mapping strategies.

We are still in the early stages of understanding the complexities of how the mapping layer affects the perception (and the playability) of an electronic instrument by its performer. What we know is that it is a very important layer, and one that must not be overlooked by the designers of new instruments.

Acknowledgements

Many thanks are due to the following people for their collaboration in the work described in this paper: Ross Kirk, Butch Rovan, Shlomo Dubnov, Philippe Depalle, …

References

Hunt, A. (2000). Radical user interfaces for real-time musical control. DPhil thesis, University of York, UK. Available: http://www-users.york.ac.uk/~elec18/download/adh_thesis/
Hunt, A., & Wanderley, M.M. (Eds.) (2000). Mapping of control variables to musical variables. Interactive systems and instrument design in music working group. Website: www.igmusic.org
Hunt, A., & Kirk, R. (2000). Mapping strategies for musical performance. In: M. Wanderley & M. Battier (Eds.), Trends in Gestural Control of Music. IRCAM – Centre Pompidou.
Hunt, A., Wanderley, M.M., & Kirk, R. (2000). Towards a model for instrumental mapping in expert musical interaction. In: Proc. of the 2000 International Computer Music Conference. San Francisco, CA: International Computer Music Association, pp. 209–211.
Hunt, A., & Wanderley, M.M. (2002). Mapping performance parameters to synthesis engines. Organised Sound, 7, …
Wanderley, M.M., & Depalle, P. (1999). Contrôle gestuel de la synthèse sonore. In: H. Vinet & F. Delalande (Eds.), Interfaces Homme-Machine et Création Musicale. Hermès Science Publishing, pp. 145–163.
Wanderley, M.M. (Ed.) (2000). Interactive systems and instrument design in music workgroup. Website: www.igmusic.org. Last visited on 10/04/03.
Wanderley, M.M. (2001). Performer-instrument interaction. Application to gestural control of sound synthesis. PhD thesis, University Paris VI, France.
Wanderley, M.M. (Ed.) (2002). Mapping strategies for real-time computer music. Organised Sound, 7.
Wessel, D. (1979). Timbre space as a musical control structure. Computer Music Journal, 3, 45–52.