Screen-Based Musical Interfaces as Semiotic Machines
Keywords
Interfaces, interaction design, HCI, semiotics, actors, OSC,
mapping, interaction models, creative tools.
1. INTRODUCTION
In our work with ixi software [14][15], we have concentrated
on creating abstract screen-based interfaces for musical
performance on computers. These are graphical user interfaces
(GUIs) that do not necessarily relate to established conventions
in interface design, such as using buttons, knobs and sliders, nor
do they necessarily refer to musical metaphors such as the score
(timeline), the keyboard (rational/discrete pitch organisation) or
linear sequencing (such as in step sequencers or arpeggiators).
Instead we represent musical structures using abstract objects
that move, rotate, blink/bang or interact. The musician controls
those objects as if they were parts of an acoustic instrument,
using the mouse, the keyboard or other control devices. We
have created over 15 of these instruments – each exploring new modes of
interactivity where some of the unique qualities of the computer are
utilised in fun, inspirational and innovative ways. Qualities such as
remembering the musician's actions, following paths, interaction between
agents, generativity, randomness, algorithmic calculations and artificial
intelligence; all things that our beloved acoustic instruments are not
very good at.

Figure 1: StockSynth. Here the crosshair cursor serves as a microphone
that picks up sounds from the boxes that represent sound samples. The mic
has adjustable scope (the circle). The boxes are moveable and the mic
moves by drawn or automatic trajectories or by dragging it with the mouse.
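As a purely illustrative reading of the StockSynth behaviour described in
Figure 1, the sketch below (hypothetical Python, not the actual ixi
implementation) lets a moving microphone pick up only the sample boxes
that fall within its scope; scaling amplitude by distance is an assumption
made for the example.

    import math

    def pickup(mic_x, mic_y, scope, boxes):
        """Return (box, amplitude) pairs for the sample boxes inside the mic's scope."""
        heard = []
        for box in boxes:                      # each box: {"x": ..., "y": ..., "sample": ...}
            d = math.hypot(box["x"] - mic_x, box["y"] - mic_y)
            if d < scope:
                heard.append((box, 1.0 - d / scope))   # closer boxes sound louder
        return heard

    boxes = [{"x": 0.2, "y": 0.3, "sample": "drone.wav"},
             {"x": 0.8, "y": 0.9, "sample": "click.wav"}]
    for box, amp in pickup(0.25, 0.35, scope=0.2, boxes=boxes):
        print(box["sample"], round(amp, 2))

Dragging the microphone, or moving it along a trajectory, then amounts to
calling such a function with a new position on every animation step.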
Over the course of our work, we have developed a loose and informal
language for these instruments – a semiotics that suggests to the musician
what the functionality of each interface element is, and what it signifies
in a musical context. Human-Computer Interaction (HCI) research
[2][3][17][1][6] is usually concentrated on the chain of meaning from the
software to its user.

2. A SHORT NOTE ON INSTRUMENTS
"Even simple physical instruments seem to hold more mystery in their
bodies than the most elaborate computer programs" [10]

Both acoustic instruments and music software incorporate and define the
limits of what can be expressed with them. There are special qualities
found in both, but the struggle of designing, building and mastering an
acoustic instrument is different from the endeavour of creating musical
software. The acoustic instrument is made of physical material that
defines its behaviour in the form of both tangible and aural feedback.
These material properties are external to our thought and are something
that we fight with when we design and learn to play instruments. Such
features or characteristics of the material instrument are not to be found
in software. Software is by
definition programmed (etymology: "pro" = before, "graphein" = written);
its functionality is prewritten by a designer or an engineer, and the
decisions taken in the design process become the defining qualities of the
software, determining its expressive scope.

Different languages are based on different paradigms and lead to different
types of approaches to solve a given problem. Those who use a particular
computer language learn to think in that language and can see problems in
terms of how a solution would look in that language.1 [12]

This is not the place to go into the cognitive processes involved in
learning and playing an instrument. But we are faced with an important
question: what material (instruments) is the computer musician composing
for, and where does he or she get the ideas from? In other terms: where
does the thinking (or composing) of the computer musician or digital
instrument inventor take place? It happens most likely in the form and
structure of the programming language in which he or she is working. The
environment defines the possibilities and the limitations of what can be
thought. But what does it mean to "learn to think in a language"? What are
we gaining and what are we sacrificing when we choose an instrument or a
programming environment? And what are the reasons for some people
preferring one environment over another?

If again we talk about the locality where thinking takes place we have a
right to say that this locality is the paper on which we write or the
mouth which speaks. And if we talk of the head or the brain as the
locality of thought, this is using the 'locality of thinking' in a
different sense. [21]

If here I am attempting to find the "locus" of musical thinking/performing
in both acoustic instruments and screen-based digital instruments – a
discussion that is much deeper than can be delved into here – it is
important to consider the difference in embodiment and incorporated
knowledge of the player in those two types of instruments. When learning
an acoustic instrument, the motor memory does most of the job and your
learning "happens" as interaction with the body of the instrument. Due to
its material qualities, one can never master an instrument; it always
contains something unexplored, some techniques that can be taken further
and investigated. With software, however, it is more or less visual and
procedural memory that is involved, as software doesn't have a material
body that the musician learns to operate. The only "body" of the software
is in the form of its interface elements, and they (as opposed to the
indicative nature of physical material) are simple, contingent and often
arbitrary design decisions.3 The "body" of the software has to be created,
and it does not depend upon any material qualities, but rather on the
style and history of graphical user interface design.
3.2 The Semiotics of a Creative Tool
The most common of semiotic practices is to look at the signifying channel
from the sender to the receiver through some medium such as signs,
language, text, or film. [5][9] The "work" here is a static construction
that doesn't change after it has been published or released.5 By contrast,
computer-based works are interactive and can be changed or modified after
their release, either by users themselves or by updates. Interaction
becomes a new sign-feature. [2] Some studies have been done on this new
semiotic quality of the computer [1][2][3][7], but very few in the field
of music software or other creative software.

In music software, the user is at the same time the receiver and
interpreter of information from the designers of the software and the
sender of information in the form of the music being composed using the
tool. This dual semiotic stance is important in all tools (whether real or
virtual) but becomes vital in contingently designed tools such as music
software. Music software is a sign system in its own right, but the
important question here is: which are the relevant layers of signification
and communication, and where do they originate? This can be analysed into
strata of different practices: the hardware designers, the programmers of
the compilers, the language API and the software itself, the designers of
the interaction and the programmers of the interface. A creative tool has
a history of important design decisions, all shaping its scope and
potential. This is a complex structure, but the user is faced with the
question: what is the meaning conveyed in the interface? And is this
system of signification not essentially of a compositional nature? Who
took those decisions and by which criteria?

The contingency of design mentioned above in relation to the digital
medium is one of its most definable characteristics. We don't have this
"contingency problem" when designing acoustic instruments, as the
properties of the material we work with lead us in our design: closing a
hole in a flute increases the wavelength in the resonant tube and the tone
deepens; pressing the string against the fingerboard of a guitar –
shortening the wavelength – produces a note of higher pitch. When
designing screen-based computer interfaces we can choose to imitate
physical laws as known from the world of acoustic instruments, we can draw
from the reservoir of HCI techniques, or we can design something entirely
new. It is here that the interface design, the interaction design and the
mapping become very important factors in the creation of interesting
screen-based instruments for the computer.

4. INTERFACE ELEMENTS IN IXI
Most modern operating systems are graphical or allow for a graphical front
end. The WIMP (Window, Icon, Menu, Pointer) interface [4] has become a
standard practice and we have become used to the direct manipulation [20]
of graphical objects. The traditional method is to translate work
practices from the real world into the realm of the computer, and thus we
get the folders, the documents, the desktop and the trash. In music
applications we get representations of keyboards, buttons, knobs and
sliders, rack effect units and cables. This is also suitable where the aim
is to translate studio work practices into the virtual studio. But when we
are creating new instruments using the new signal processing capabilities
and artificial intelligence of the computer, there might not exist any
physical phenomena that we can use as a source for our interface
metaphors.6

4.1 Interaction Models
Each of the ixi applications is a prototype or a suggestion, and each
explores a specific mode of interaction. The whole of our software can be
grouped into a specific kind of interaction model: a language, a semiotics
or a design ideology that informs and en-forms the work. An interaction
model can be defined as more operational than an interaction paradigm
(computer as tool, partner or medium). [6] It can be evaluated according
to the descriptive, the evaluative and the generative power of the model.
These dimensions of evaluation are all important when creating an
interaction model. The descriptive power is the ability to describe a
significant range of existing interfaces; the evaluative power helps us to
assess multiple design alternatives; and the generative power is the
ability of the model to inspire and lead designers to create new designs
and solutions.

4.2 Interaction Instruments
It is the generative aspect of ixi's interaction model that is the subject
here. Beaudouin-Lafon's definition of instrumental interaction [7] is the
closest description the author has found that relates to our work with ixi
software. The interaction instrument is a tool that interfaces the user
with the object of interest. A scrollbar is an example of such an
instrument, as it gives the user the ability to change the state/view of
the document. A pen, brush or selection tool in a graphics package is also
an instrument of this type.

There are three design principles that define the methodology of
instrumental interaction: reification – the process by which concepts are
turned into objects; polymorphism – the property that enables a single
command to be applicable to objects of different types; and reuse – the
storing of previous input or output for another use. [8] When an ixi
application combines all three design principles into a successful
interface, we have what we could call a semiotic machine. The interface is
multifunctional and can be used in a variety of different contexts.
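To make these three principles concrete in a musical setting, the
following sketch (hypothetical Python, not taken from the ixi codebase)
reifies two musical concepts as objects, exposes one polymorphic scale
command that applies to both, and reuses a recorded gesture as the input
of a new object; all names are invented for illustration.

    class Wheel:
        """Reification: a rhythmic cycle becomes a manipulable object."""
        def __init__(self, pedals, size):
            self.pedals = pedals            # beats per cycle
            self.size = size                # mapped to volume elsewhere

        def scale(self, factor):
            self.size *= factor

    class Trajectory:
        """Another reified concept: a path an actor can follow."""
        def __init__(self, points):
            self.points = points

        def scale(self, factor):
            self.points = [(x * factor, y * factor) for x, y in self.points]

    def scale_selection(objects, factor):
        """Polymorphism: one command applies to wheels and trajectories alike."""
        for obj in objects:
            obj.scale(factor)

    # Reuse: a previously recorded gesture becomes the input of a new actor.
    recorded_gesture = [(0, 0), (10, 5), (20, 0)]
    scale_selection([Wheel(4, 1.0), Trajectory(recorded_gesture)], 1.5)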
4.3 The Terminology of ixi's Semantics
As explained in earlier papers, [14][15] most of the ixi software
applications are controllers that send and receive OSC (Open Sound
Control) [23] messages to and from sound engines written in other
environments such as SuperCollider [13] or Pure Data [18]. We separate the
interface from the sound engine in order to be able to reuse the control
structures of the abstract interface in other contexts, for example
allowing a sequencing interface to control parameters in synthesis if the
user configures it so. These controllers are all made from a common
ideology or an interaction model that we see as a semiotic system.
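A minimal sketch of this separation between graphical controller and sound
engine is given below. The python-osc library, the port number and the OSC
address are assumptions made for the example; they do not describe the
actual ixi protocol.

    from pythonosc.udp_client import SimpleUDPClient

    # A sound engine (e.g. SuperCollider or Pure Data) is assumed to listen on this port.
    engine = SimpleUDPClient("127.0.0.1", 57120)

    def send_actor_state(actor_id, x, y, size):
        """Send an actor's state; the receiver decides what the values mean."""
        engine.send_message("/ixi/actor", [actor_id, x, y, size])

    # The same controller can drive a sampler, a synth or an effect,
    # depending entirely on how the receiving engine maps these values.
    send_actor_state(1, 0.25, 0.8, 0.5)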
In our work with ixi software, the fundamental attention has been on the
interaction design and not the interface design. The design of interface
elements is often highly (but not exclusively) aesthetic and depends on
taste, whereas the interaction design deals with the fundamental structure
and ergonomic idea of the software. In SpinDrum [14], for example, the
wheels contain pedals controlling beats per cycle, the size of the wheel
signifies the volume, and the colour accounts for which sound is attached
to the object. Here the interaction design clearly affects the interface
design (size, number of pedals, colour), but the shape of the pedals
(whether a square, a circle or a triangle) is simply an aesthetic decision
and of little general importance.

5 Post-structuralist thought has rightly pointed out how interpretations
of the work change in different times and cultures, but the work itself
doesn't change – only people's interpretation and reception of it.

6 As we can derive from Peircian semiotics, an interface object can be
represented in various ways: iconically (where the representation is based
on resemblance to an object), indexically (where the representation is
influenced by an object) or symbolically (where the representation is
based on convention).
Figure 3: SpinDrum. Each wheel contains from 1 to 10 pedals. The wheels
rotate at various speeds, and when a pedal hits the top position (12
o'clock) it triggers the sample or sends out OSC info to the sound engine.
The X and Y location of the wheels can affect parameters such as pitch and
panning.
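The wheel behaviour described in Figure 3 can be read, in purely
illustrative terms, as the following Python sketch (hypothetical code, not
the shipped SpinDrum): a wheel advances its rotation on each animation
step and fires a trigger whenever a pedal crosses the twelve o'clock
position.

    import math

    class SpinWheel:
        def __init__(self, pedals, speed, x, y, size):
            self.pedals = pedals        # 1 to 10 pedals on the wheel
            self.speed = speed          # rotation speed in revolutions per second
            self.x, self.y = x, y       # screen position, mappable to pitch and panning
            self.size = size            # mapped to volume
            self.phase = 0.0            # current rotation, measured in revolutions

        def step(self, dt, trigger):
            """Advance the rotation by dt seconds and call trigger() for every
            pedal that passes the top (12 o'clock) position during this step."""
            old, self.phase = self.phase, self.phase + self.speed * dt
            for i in range(self.pedals):
                offset = i / self.pedals
                if math.floor(self.phase + offset) > math.floor(old + offset):
                    trigger(self, i)

    wheel = SpinWheel(pedals=4, speed=0.5, x=0.3, y=0.7, size=1.0)
    for _ in range(25):    # 2.5 seconds of animation steps
        wheel.step(0.1, lambda w, i: print(f"pedal {i} fired at x={w.x}, y={w.y}"))

In a real controller the trigger callback would play the attached sample
or send an OSC message to the sound engine rather than print.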
4.3.1 Actors
The ixi interfaces are pattern-generating machines with cogs and bolts of
varied significance. To sum up the basic design ideas of ixi software, we
could say that it is the reification of musical ideas into abstract
graphical objects as control mechanisms that act in time.7 We call these
abstract objects actors,8 as they are graphical representations of
temporal processes that act, enact and react to the user, to each other or
to the system itself in a complex network of properties, relations and
teleology (desired states or end goals). Beaudouin-Lafon calls graphical
interface tools "interaction instruments", but we cannot use that
metaphor, both because an ixi application is a musical instrument in its
own right and because of the different nature of the interface units of
ixi software. The feature under discussion here is how musical
applications differ from the ergonomically "single-threaded" or serial
task-processing applications used for painting, text editing, programming,
video editing or architecture. In contrast to these applications, a music
application is multi-threaded or parallel, i.e. there are many processes,
streams, layers or channels that run concurrently in every composition or
performance, all controlled by the user, but, in the case of ixi, usually
only one at a time.9,10

The interface units that we call actors – such as a picker, a spindrum or
a virus – are not instruments that the musician uses for some task and
then chooses another instrument for the next task. The actors in the ixi
software applications are put into use at some point in time and they
continue working in a temporal flow (rotating, moving through a trajectory
or interacting) until the musician decides to stop or pause their
activities.
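Read in programming terms, the actor concept might be sketched as below
(illustrative Python with invented names, not the ixi source): an actor is
a bundle of properties that keeps acting in a temporal flow until it is
paused, and reports each step to whoever listens – typically a mapping
layer or other actors.

    class Actor:
        """An interface element that acts, enacts and reacts over time."""
        def __init__(self, name, **properties):
            self.name = name
            self.properties = properties    # e.g. position, size, speed, colour
            self.running = True
            self.listeners = []             # mapping layer, other actors, the engine

        def act(self, dt):
            """One step of temporal behaviour (rotate, move, blink...);
            subclasses override this, the default does nothing."""

        def step(self, dt):
            if self.running:
                self.act(dt)
                for listener in self.listeners:
                    listener(self)

        def pause(self):
            self.running = False

    class Rotator(Actor):
        def act(self, dt):
            self.properties["angle"] = (self.properties.get("angle", 0.0)
                                        + self.properties["speed"] * dt) % 360.0

    wheel = Rotator("wheel1", speed=90.0)
    wheel.listeners.append(lambda a: print(a.name, a.properties))
    wheel.step(0.5)    # called repeatedly by the animation loop until pause()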
4.3.2 Context
All actors perform their task in a context. They are graphically
represented in a two- or three-dimensional space on the screen, and their
location might typically influence their properties. The actors move,
rotate or blink in this space and are therefore both spatially and
temporally active units. The space can have qualities such as temperature,
gravity, brightness, etc., which are all qualities that could affect the
actor's behaviour, or it can contain other actors of a different type that
influence the behaviour of the message-sending actors. Feedback from users
of ixi software has shown us that people find the metaphor of an actor
presented in time and space useful to represent musical actions and ideas.
What the feedback also shows is that people intuitively understand the
metaphor of having actors on a stage that perform some tasks that they –
the directors of the piece – are controlling.

Figure 4: Connector. This software uses generative algorithms to decide
where actors travel within a network of connectors. There are probability
charts that decide the next move of an actor, and when it enters a
connector it triggers a MIDI note and/or a sound sample that is a property
of the connector.

4.3.3 Network
When talking about the context and the environment of these actors, we
must note that the interface elements are not the only actors in the
context of an ixi instrument: the user is one actor, as are the control
hardware (a mouse, keyboard, sensor or controller), the soundcard, the
speakers and other communication channels such as virtual audio cables,
MIDI or OSC messages. The whole context of musical action and reaction is
the space of the actor, a space in which the heterogeneous network of
musical performance takes place. The meaning of the actor is its
functionality within the control context and the mapping context. The
actor has as many dimensions as it has control parameters and connections
for receiving or sending messages.

7 Musical idea here meaning any pattern-generating structure.

8 We thought about calling the active interface elements agents, but it
was too confusing as the term has very strong connotations in computer
science, especially within the field of artificial intelligence.

9 Another fact that divides those types of software is that the painting
software, the video software or the 3D package are not packages that are
used in live performance.

10 This is of course what people are working with in the research field
often known as NIME (New Interfaces for Musical Expression, www.nime.org),
where building physical interfaces to control sound on the computer allows
for multi-parameter mapping to one sound engine.
Figure 5: ParaSpace. This application interfaces with audio effects
written in SuperCollider (but can talk to any software that supports OSC).
Each audio effect has a variable number of parameters, which are
represented as small boxes in the control interface of ParaSpace. The
point here is that the parameters interact on the interface level with
automation, artificial life and artificial intelligence.

To clarify this idea of actors being all the elements that affect the
interaction in an instrument, let us have a look at the software
Connector. Here actors move in a system of connectors (a plumbing-like
system) and trigger sound samples or MIDI notes that are properties of the
connectors. The connectors are actors themselves, as they are the
receivers of an action and contain the information that yields the sound.
It is through the interaction of all the actors and their properties that
interaction takes place – interaction between elements within the
instrument and also with the musician using the instrument – and this
interaction is simply the automation that controls the various parts of
the music set into motion. In StockSynth (Figure 1) the microphone is one
such actor (with its properties of trajectory and scope) that interacts
with the sound objects that contain the information about the sound and
its properties.
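One possible reading of Connector's generative movement is sketched below
in Python; the probability table, the note values and the trigger callback
are invented for the example and do not reproduce the shipped algorithm.
Each connector stores a note and a table of weighted next moves, and an
actor walks the network, triggering whatever it enters.

    import random

    class ConnectorNode:
        def __init__(self, name, note, next_weights):
            self.name = name
            self.note = note                    # MIDI note or sample bound to this connector
            self.next_weights = next_weights    # {connector_name: probability weight}

    def walk(network, start, steps, trigger):
        """Move an actor through the network, triggering each connector it enters."""
        current = start
        for _ in range(steps):
            node = network[current]
            trigger(node)
            names = list(node.next_weights)
            weights = [node.next_weights[n] for n in names]
            current = random.choices(names, weights=weights, k=1)[0]

    network = {
        "a": ConnectorNode("a", 60, {"b": 0.7, "c": 0.3}),
        "b": ConnectorNode("b", 64, {"a": 0.5, "c": 0.5}),
        "c": ConnectorNode("c", 67, {"a": 1.0}),
    }
    walk(network, "a", 8, lambda node: print("trigger note", node.note))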
4.3.4 Semiotic elements and mapping
The actors and the contexts in which they function are all elements in a
semiotic language. This language has dialects, or rather idiolects (each
application is unique), where the meaning of an element can change, as in
Wittgenstein's concept of usage as the word's meaning [15][22] or as in
the Saussurian conception of the lack of a natural connection between a
signifier and the signified.11 [19] We provide a semiotics, or suggest
language games, where the behaviour of an actor maps onto some parameters
in a sound engine. For example, the vertical location of an actor could
signify the pitch of a tone or the playback rate of a sample. Size could
mean amplitude, rotation could mean triggering, and direction could mean a
tendency for some action. But it could also signify something entirely
different, as the controllers are open and it is up to the musician to map
the actor's behaviour onto a parameter in the sound engine.
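This kind of open mapping can be thought of as a user-editable table from
actor properties to sound-engine parameters. The sketch below is a
hypothetical Python illustration in the spirit of the text – the
addresses, ranges and property names are assumptions, not the ixi mapping
format.

    # One possible mapping; the musician is free to rewire it entirely.
    mapping = {
        "y":    lambda v: ("/synth/pitch", 100 + v * 900),     # vertical position -> pitch in Hz
        "size": lambda v: ("/synth/amp",   v),                  # size -> amplitude
        "rot":  lambda v: ("/synth/trig",  1 if v > 0 else 0),  # rotation -> trigger
    }

    def map_actor(actor_properties, send):
        """Translate an actor's properties into sound-engine messages."""
        for prop, value in actor_properties.items():
            if prop in mapping:
                address, argument = mapping[prop](value)
                send(address, argument)

    map_actor({"y": 0.5, "size": 0.8, "rot": 1.0}, lambda a, v: print(a, v))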
11 For Saussure, the meaning of signs is relational rather than
referential, i.e. the meaning lies in their systematic relation to other
signs in the semiotic system and not by direct reference to material
things, for example.

5. CONCLUSION
This paper has tried to show how the materials we work with when we design
instruments (digital or acoustic) are the defining factors of what can be
expressed with them.

6. FUTURE WORK
Future plans involve exploring the dimensional spaces of the screen-based
actors as the interface for musical interaction. The computer is becoming
quite good at imitating the properties of acoustic instruments, but it
excels as an interesting instrument in its own right, where interaction is
designed from the premise of the qualities of the computer and not by
imitation of real-world objects.

Our work involves experimenting with creating semiotic systems that can be
taken further and extended into new dialects and systems of meaning. This
system is not exclusive to one type of application, but can rather be seen
as a semiotic toolbox from which elements can be taken and reused in new
contexts. Computer music software is a highly interesting area in the
field of HCI as it is used in live performances and should contain depth
that can be explored and practised, thus allowing for musical virtuosity.
In semiotic interfaces such as the ixi software there is always the
filament of concurrent mappings or parallel streams of musical events
happening at any one time. The temporal aspect of computer music software
also makes it quite unique in relation to other types of software. Facing
these incredible demands and challenges of music software, we feel that we
are just starting our journey into the possibilities of new meaning
systems, metaphors, pattern generators and control of synthesis techniques
through the creation of semiotic machines in the form of interfaces.

7. ACKNOWLEDGEMENTS
The ixi software project (www.ixi-software.net) is a collaboration between
Enrike Hurtado Mendieta, Thor Magnusson and various musicians and artists
that are involved with our work. I would like to thank Chris Thornton of
the University of Sussex, Marcelo Wanderley and the people at the IDMIL
Lab at McGill University, Montreal, and Julian Rohrhuber at the University
of Cologne for good discussions and important feedback on this paper.

8. REFERENCES
[1] Andersen, Peter B. & May, Michael. "Instrument Semiotics" in
Information, Organisation and Technology: Studies in Organisational
Semiotics (eds. Liu, Kecheng; Clarke, Rodney J.; Andersen, Peter B.;
Stamper, Ronald K.). Kluwer: Boston/Dordrecht/London, 2001: 271-298.
[2] Andersen, Peter B. "What semiotics can and cannot do for HCI" in
Knowledge-Based Systems. Elsevier, 2001.
[3] Andersen, Peter B. "Computer semiotics" in Scandinavian Journal of
Information Systems. Vol. 4: 3-30. Copenhagen, 1992.
[4] Apple Computer Inc. Human Interface Guidelines: The Apple Desktop
Interface. Boston: Addison-Wesley, 1987.
[5] Barthes, Roland. Mythologies. London: Paladin, 1972.
[6] Beaudouin-Lafon, Michel. "Designing Interaction, not Interfaces" in
AVI Proceedings. ACM: Gallipoli, 2004.
[7] Beaudouin-Lafon, Michel. "Instrumental Interaction: An Interaction
Model for Designing Post-WIMP User Interfaces" in Proceedings of ACM Human
Factors in Computing Systems (CHI 2000). The Hague: ACM Press, 2000.
[8] Beaudouin-Lafon, Michel. "Reification, Polymorphism and Reuse: Three
Principles for Designing Visual Interfaces" in AVI Proceedings. ACM:
Palermo, 2000.
[9] Eco, Umberto. A Theory of Semiotics. London: Macmillan, 1976.
[10] Evens, Aden. Sound Ideas: Music, Machines, and Experience.
Minneapolis: University of Minnesota Press, 2005.
[11] Gohlke, Gerrit (ed.). Software Art – A Reportage about Source Code.
Berlin: The Media Arts Lab, 2003.
[12] McCartney, James. "A Few Quick Notes on Opportunities and Pitfalls of
the Application of Computers in Art and Music" in Ars Electronica, 2003.
[13] McCartney, James. "Rethinking the Computer Music Language:
SuperCollider" in Computer Music Journal, 26:4, pp. 61-68, Winter 2002.
[14] Magnusson, Thor. "ixi software: The Interface as Instrument" in
Proceedings of NIME 2005. Vancouver: University of British Columbia, 2005.
[15] Magnusson, Thor. "ixi software: Open Controllers for Open Source
Software" in Proceedings of ICMC 2005. Barcelona: University of Pompeu
Fabra, 2005.
[16] Magnusson, Thor. "Processor Art: Currents in the Process Orientated
Works of Generative and Software Art". Copenhagen, 2002.
www.ixi-software.net/thor
[17] Nadin, Mihai. "Interface Design: A Semiotic Paradigm" in Semiotica
69-3/4. Amsterdam: Mouton de Gruyter, 1988.
[18] Puckette, Miller. "Pure Data" in Proceedings of the International
Computer Music Conference. San Francisco: International Computer Music
Association, pp. 269-272, 1996.
[19] Saussure, Ferdinand de. Course in General Linguistics. London:
Duckworth, 1983. p. 121.
[20] Shneiderman, Ben. "Direct manipulation: a step beyond programming
languages" in IEEE Computer 16(8) (August 1983), 57-69.
[21] Wittgenstein, Ludwig. The Blue and Brown Books. Oxford: Blackwell,
1993. p. 6.
[22] Wittgenstein, Ludwig. Philosophical Investigations. Oxford:
Blackwell, 1994.
[23] Wright, M., Freed, A., Lee, A., Madden, A. and Momeni, A. "Open Sound
Control: State of the Art 2003" in Proceedings of NIME 2003. Montreal,
Canada, 2003.