Proceedings of the 2006 International Conference on New Interfaces for Musical Expression (NIME06), Paris, France

Screen-Based Musical Interfaces as Semiotic Machines


Thor Magnusson
Creative Systems Lab
University of Sussex
BN1 9QN, East Sussex, UK
[email protected]

ABSTRACT

The ixi software project started in 2000 with the intention to explore new interactive patterns and virtual interfaces in computer music software. The aim of this paper is not to describe these programs, as they have been described elsewhere [14][15], but rather to explicate the theoretical background that underlies the design of these screen-based instruments. After an analysis of the similarities and differences in the design of acoustic and screen-based instruments, the paper describes how the creation of an interface is essentially the creation of a semiotic system that affects and influences the musician and the composer. Finally, the terminology of this semiotics is explained as an interaction model.

Keywords
Interfaces, interaction design, HCI, semiotics, actors, OSC,
mapping, interaction models, creative tools.

1. INTRODUCTION
In our work with ixi software [14][15], we have concentrated
on creating abstract screen-based interfaces for musical
performance on computers. These are graphical user interfaces
(GUIs) that do not necessarily relate to established conventions
in interface design, such as using buttons, knobs and sliders, nor
do they necessarily refer to musical metaphors such as the score
(timeline), the keyboard (rational/discrete pitch organisation) or
linear sequencing (such as in step sequencers or arpeggiators).
Instead we represent musical structures using abstract objects
that move, rotate, blink/bang or interact. The musician controls
those objects as if they were parts of an acoustic instrument,
using the mouse, the keyboard or other control devices. We
have created over 15 of these instruments – each exploring new modes of interactivity where some of the unique qualities of the computer are utilised in fun, inspirational and innovative ways: qualities such as remembering the musician's actions, following paths, interaction between agents, generativity, randomness, algorithmic calculations and artificial intelligence; all things that our beloved acoustic instruments are not very good at.

Over the course of our work, we have developed a loose and informal language for these instruments – a semiotics that suggests to the musician what the functionality of each interface element is, and what it signifies in a musical context. Human Computer Interface (HCI) research [2][3][17][1][6] is usually concentrated on the chain of meaning from the software designer to the software user. The user is the receiver of information, and the aim of HCI is traditionally to make the interaction between the two systems (the human and the computer) intuitive, representational and task based (where the tasks are based on real-world tasks). What is lacking is a stronger discussion of the situation where the computer is used as a tool for artistic creation – an expressive instrument – and not a device for preparing, organising or receiving information. In artistic tools we have an important addition, where the signifying chain has been reversed: the meaning is created by the user, deploying software to achieve some end goals, but this very software is also a system of representational meanings, thus influencing and coercing the artist into certain work patterns.

Figure 1: StockSynth. Here the crosshair cursor serves as a microphone that picks up sounds from the boxes that represent sound samples. The mic has an adjustable scope (the circle). The boxes are moveable and the mic moves by drawn or automatic trajectories or by dragging it with the mouse.

2. A SHORT NOTE ON INSTRUMENTS

"Even simple physical instruments seem to hold more mystery in their bodies than the most elaborate computer programs" [10]

Both acoustic instruments and music software incorporate and define the limits of what can be expressed with them. There are special qualities found in both, but the struggle of designing, building and mastering an acoustic instrument is different from the endeavour of creating musical software. The acoustic instrument is made of physical material that defines its behaviour in the form of both tangible and aural feedback. These material properties are external to our thought and are something that we fight with when we design and learn to play instruments. Such features or characteristics of the material instrument are not to be found in software. Software is per definition programmed (etymology: "pro" = before, "graphein" = written); its functionality is prewritten by a designer or an engineer, and the decisions taken in the design process become the defining qualities of the software, determining its expressive scope.


Different languages are based on different paradigms and lead to different types of approaches to solve a given problem. Those who use a particular computer language learn to think in that language and can see problems in terms of how a solution would look in that language.1 [12]

This is not the place to go into the cognitive processes involved in learning and playing an instrument. But we are faced with an important question: what material (instruments) is the computer musician composing for, and where does he or she get the ideas from? In other terms: where does the thinking (or composing) of the computer musician or digital instrument inventor take place? It happens most likely in the form and structure of the programming language in which he or she is working. The environment defines the possibilities and limitations of what can be thought. But what does it mean to "learn to think in a language"? What are we gaining and what are we sacrificing when we choose an instrument or a programming environment? And what are the reasons for some people preferring one environment over another?

Figure 2: GrainBox. It can be hard to create interfaces for granular synthesis. The GrainBox is a suggestion of how to represent the complex parameters as boxes with X and Y dimensions in 2D space and with connections to other parameters such as reverb and random functions.

When musicians use software in their work, they have to shape their work process according to the interface or structure of the software. As with acoustic instruments, software defines the scope of potential expression. The musician is already tangled in a web of structured thinking, but the level of freedom or expressiveness depends on the environment in which he or she is working.2 To an extent, the musical thinking takes place at the level of the interface elements of the software itself.

It is misleading then to talk of thinking as of a 'mental activity'. We may say that thinking is essentially the activity of operating with signs. This activity is performed by the hand, when we think by writing; by the mouth and larynx, when we think by speaking; and if we think by imagining signs or pictures, I can give you no agent that thinks. If then you say that in such cases the mind thinks, I would only draw attention to the fact that you are using a metaphor, that here the mind is an agent in a different sense from that in which the hand can be said to be the agent in writing.

If again we talk about the locality where thinking takes place we have a right to say that this locality is the paper on which we write or the mouth which speaks. And if we talk of the head or the brain as the locality of thought, this is using the 'locality of thinking' in a different sense. [21]

If here I am attempting to find the "locus" of musical thinking/performing in both acoustic instruments and screen-based digital instruments – a discussion that is much deeper than can be delved into here – it is important to consider the difference in embodiment and incorporated knowledge of the player in those two types of instruments. When learning an acoustic instrument, the motor memory does most of the job and the learning "happens" as interaction with the body of the instrument. Due to its material qualities, one can never master an instrument; it always contains something unexplored, some techniques that can be taken further and investigated. With software, however, it is more or less visual and procedural memory that is involved, as software doesn't have a material body that the musician learns to operate. The only "body" of the software is in the form of its interface elements, and they (as opposed to the indicative nature of physical material) are simple, contingent and often arbitrary design decisions.3 The "body" of the software has to be created, and it does not depend upon any material qualities, but rather on the style and history of graphical user interface design.

3. HCI AND SEMIOTICS

Designing is essentially a semiotic act. Designing a digital instrument or programming environment for music is to structure a system of signs into a coherent whole that incorporates some compositional ideology (or an effort to exclude it). The goal is to provide the users with a system in which they can express themselves and communicate their ideas in a way that suits their work methods and sometimes provides new ways of thinking and working. But what kind of a tool is the computer and what kind of communication are we talking about here?

3.1 Interaction Paradigms

We can roughly define three primary interaction paradigms in computer software: computer-as-tool, computer-as-partner, and computer-as-medium. [6] Different research communities address these paradigms. The HCI field investigates the computer-as-tool paradigm, but the attention is mainly on how to design understandable and ergonomic software for the user of the tool. What is lacking is a better understanding of creativity itself and of how creative and experimental minds use software (and often have to misuse it to get their ideas across). We have learned from user feedback that there seems to be a general need for better sketching environments that can be modified according to the needs of the user. An interesting fact here is that many cutting-edge art works are created by hacking or modifying software, or simply by creating one's own tools. There are schools of artists that respond to the limitations of commercial software with their own software, in the form of software art.4 [11][16]

1 Try to replace "language" with "instrument" in McCartney's paragraph above – the same applies to musical instruments as well.
2 From this perspective SuperCollider and Pure Data are arguably more open and free than Logic, Pro Tools or Reason, to name but a few.
3 Often made by the wrong people: an engineer and not an ergonomist; a graphic designer and not a musician.
4 The www.runme.org repository is an excellent source for information and examples of what is happening in the field of software art and generative art. It is closely related to the ReadMe festival, which was the first software art festival.


3.2 The Semiotics of a Creative Tool

The most common of semiotic practices is to look at the signifying channel from the sender to the receiver through some medium such as signs, language, text or film. [5][9] The "work" here is a static construction that doesn't change after it has been published or released.5 By contrast, computer-based works are interactive and can be changed or modified after their release, either by the users themselves or by updates. Interaction becomes a new sign-feature. [2] Some studies have been done on this new semiotic quality of the computer [1][2][3][7], but very few in the field of music software or other creative software.

5 Post-structuralist thought has rightly pointed out how interpretations of the work change in different times and cultures, but the work itself doesn't change – only people's interpretation and reception of it.

In music software, the user is at the same time the receiver and interpreter of information from the designers of the software and the sender of information in the form of the music being composed using the tool. This dual semiotic stance is important in all tools (whether real or virtual) but becomes vital in contingently designed tools such as music software. Music software is a sign system in its own right, but the important question here is: which are the relevant layers of signification and communication, and where do they originate? This can be analysed into strata of different practices: the hardware designers, the programmers of the compilers, the language API and the software itself, the designers of the interaction and the programmers of the interface. A creative tool has a history of important design decisions, all shaping its scope and potential. This is a complex structure, but the user is faced with the question: what is the meaning conveyed in the interface? And is this system of signification not essentially of a compositional nature? Who took those decisions and by which criteria?

The contingency of design mentioned above in relation to the digital medium is one of its most definable characteristics. We don't have this "contingency problem" when designing acoustic instruments, as the properties of the material we work with lead us in our design: closing a hole in a flute increases the wavelength in the resonant tube and the tone deepens; pressing the string against the fingerboard of a guitar – shortening the wavelength – produces a note of higher pitch. When designing screen-based computer interfaces we can choose to imitate physical laws as known from the world of acoustic instruments, we can draw from the reservoir of HCI techniques, or we can design something entirely new. It is here that the interface design, the interaction design and the mapping become very important factors in the creation of interesting screen-based instruments for the computer.

4. INTERFACE ELEMENTS IN IXI

Most modern operating systems are graphical or allow for a graphical front end. The WIMP (Window, Icon, Menu, Pointer) interface [4] has become a standard practice and we have become used to the direct manipulation [20] of graphical objects. The traditional method is to translate work practices from the real world into the realm of the computer, and thus we get the folders, the documents, the desktop and the trash. In music applications we get representations of keyboards, buttons, knobs and sliders, rack effect units and cables. This is also suitable where the aim is to translate studio work practices into the virtual studio. But when we are creating new instruments using the new signal processing capabilities and artificial intelligence of the computer, there might not exist any physical phenomena that we can use as a source for our interface metaphors.6

6 As we can derive from Peircean semiotics, an interface object can be represented in various ways: iconically (where the representation is based on resemblance to an object), indexically (where the representation is influenced by an object) or symbolically (where the representation is based on convention).

4.1 Interaction Models

Each of the ixi applications is a prototype or a suggestion, and each explores a specific mode of interaction. The whole of our software can be grouped into a specific kind of interaction model: a language, a semiotics or a design ideology that informs and en-forms the work. An interaction model can be defined as more operational than an interaction paradigm (computer as tool, partner or medium). [6] It can be evaluated according to the descriptive, the evaluative and the generative power of the model. These dimensions of evaluation are all important when creating an interaction model. The descriptive power is the ability to describe a significant range of existing interfaces; the evaluative power helps us to assess multiple design alternatives; and the generative power is the ability of the model to inspire and lead designers to create new designs and solutions.

4.2 Interaction Instruments

It is the generative aspect of ixi's interaction model that is the subject here. Beaudouin-Lafon's definition of instrumental interaction [7] is the closest description the author has found that relates to our work with ixi software. The interaction instrument is a tool that interfaces the user with the object of interest. A scrollbar is an example of such an instrument, as it gives the user the ability to change the state/view of the document. A pen, brush or selection tool in a graphics package is also a type of such instrument.

There are three design principles that define the methodology of instrumental interaction: reification – the process by which concepts are turned into objects; polymorphism – the property that enables a single command to be applicable to objects of different types; and reuse – the storing of previous input or output for another use. [8] When an ixi application combines all three design principles into a successful interface, we have what we could call a semiotic machine. The interface is multifunctional and can be used in a variety of different contexts.

4.3 The Terminology of ixi's Semantics

As explained in earlier papers [14][15], most of the ixi software applications are controllers that send and receive OSC (Open Sound Control) [23] information to sound engines written in other environments such as SuperCollider [13] or Pure Data [18]. We separate the interface from the sound engine in order to be able to reuse the control structures of the abstract interface in other contexts, for example allowing a sequencing interface to control parameters in synthesis if the user configures it so. These controllers are all made from a common ideology or interaction model that we see as a semiotic system.
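As a concrete illustration of this separation between interface and sound engine, the following is a minimal sketch, not ixi's actual code: it assumes the third-party python-osc package and a sound engine listening on UDP port 57120 (SuperCollider's default), and the OSC address names are invented for the example.

```python
# Minimal sketch of a screen-based controller sending OSC to a separate
# sound engine. Assumes the third-party "python-osc" package; the address
# names and port are illustrative, not ixi's actual protocol.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 57120)  # e.g. a SuperCollider engine

def actor_moved(actor_id: int, x: float, y: float, size: float) -> None:
    """Forward an interface event; the sound engine decides what it means."""
    # The same control data could drive a sampler, a synth or an effect,
    # depending entirely on how the user has mapped it on the engine side.
    client.send_message("/ixi/actor/position", [actor_id, x, y])
    client.send_message("/ixi/actor/size", [actor_id, size])

actor_moved(actor_id=1, x=0.25, y=0.8, size=0.5)
```

Because the interface only emits abstract control messages, the same control structure can be pointed at a different engine, or at different parameters of the same engine, without changing the interface itself.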
In our work with ixi software, the fundamental attention has been on the interaction design and not the interface design. The design of interface elements is often highly (but not exclusively) aesthetic and depends on taste, whereas the interaction design deals with the fundamental structure and ergonomic idea of the software. In SpinDrum [14], for example, the wheels contain pedals controlling beats per cycle, the size of the wheel signifies the volume, and the colour accounts for which sound is attached to the object. Here the interaction design clearly affects the interface design (size, number of pedals, colour), but the shape of the pedals (whether a square, a circle or a triangle) is simply an aesthetic decision and of little general importance.
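To show how such an interaction design might translate into behaviour, here is a small sketch of SpinDrum-like wheel logic under an assumed per-frame update loop; the class and parameter names are invented for illustration and do not reflect ixi's implementation.

```python
# Minimal sketch of SpinDrum-like behaviour (see Figure 3): a wheel of
# pedals rotates, and each pedal triggers when it crosses the top position.
# Names and the update-loop structure are illustrative, not ixi's code.
import math

class Wheel:
    def __init__(self, pedals: int, speed: float, x: float, y: float, size: float):
        self.pedals = pedals          # beats per cycle
        self.speed = speed            # revolutions per second
        self.x, self.y = x, y         # could map to pitch and panning
        self.size = size              # could map to volume
        self.phase = 0.0              # current rotation, in revolutions

    def update(self, dt: float, trigger) -> None:
        """Advance the rotation by dt seconds; fire once per pedal crossing the top."""
        old, self.phase = self.phase, self.phase + self.speed * dt
        # Pedal i sits at i/pedals of a revolution; count top crossings in (old, phase].
        for i in range(self.pedals):
            offset = i / self.pedals
            if math.floor(self.phase - offset) > math.floor(old - offset):
                trigger(pitch=self.x, pan=self.y, volume=self.size)

wheel = Wheel(pedals=4, speed=0.5, x=0.3, y=0.7, size=0.8)
wheel.update(0.6, trigger=lambda **p: print("trigger", p))
```

The point of the sketch is that the musically meaningful decisions (pedal count, rotation speed, the mapping of X, Y and size) live in the interaction logic, not in how the wheel happens to be drawn.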


Figure 3: SpinDrum. Each wheel contains from 1 to 10 pedals. The wheels rotate at various speeds, and when a pedal hits the top position (12 o'clock) it triggers the sample or sends out OSC information to the sound engine. The X and Y location of the wheels can affect parameters such as pitch and panning.

4.3.1 Actors

The ixi interfaces are pattern generating machines with cogs and bolts of varied significance. To sum up the basic design ideas of ixi software, we could say that it is the reification of musical ideas into abstract graphical objects as control mechanisms that act in time.7 We call these abstract objects actors,8 as they are graphical representations of temporal processes that act, enact and react to the user, to each other or to the system itself in a complex network of properties, relations and teleology (desired states or end goals). Beaudouin-Lafon calls graphical interface tools "interaction instruments", but we cannot use that metaphor, both because an ixi application is a musical instrument in its own right and because of the different nature of the interface units of ixi software. The feature under discussion here is the difference between musical applications and the ergonomically "single-threaded" or serial task-processing applications used for painting, text editing, programming, video editing or architecture. In contrast to these applications, a music application is multi-threaded or parallel, i.e. there are many processes, streams, layers or channels that run concurrently in every composition or performance, all controlled by the user but, in the case of ixi, usually only one at a time.9, 10

The interface units that we call actors – such as a picker, a spindrum or a virus – are not instruments that the musician uses for some task before choosing another instrument for the next task. The actors in the ixi software applications are put into use at some point in time and they continue working in a temporal flow (rotating, moving through a trajectory or interacting) until the musician decides to stop or pause their activities.

7 Musical idea here meaning any pattern generating structure.
8 We thought about calling the active interface elements agents, but it was too confusing as the term has very strong connotations in computer science, especially within the field of artificial intelligence.
9 Another fact that divides those types of software is that the painting software, the video software or the 3D package are not packages that are used in live performance.
10 This is of course what people are working with in the research field often known as NIME (New Interfaces for Musical Expression, www.nime.org), where building physical interfaces to control sound on the computer allows for multi-parameter mapping to one sound engine.
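To make this terminology concrete, the following minimal sketch shows one way an actor could be modelled: an object with properties that acts continuously in time and reports its state to whatever is listening. The names are hypothetical and the sketch illustrates the concept rather than ixi's implementation.

```python
# Minimal sketch of an "actor": a graphical object with properties that
# acts continuously in time and emits control messages. Hypothetical names;
# an illustration of the concept, not ixi's implementation.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Actor:
    name: str
    properties: Dict[str, float] = field(default_factory=dict)   # e.g. x, y, size, rotation
    listeners: List[Callable[[str, str, float], None]] = field(default_factory=list)
    active: bool = True

    def connect(self, listener: Callable[[str, str, float], None]) -> None:
        """Anything interested in this actor (a sound engine, another actor) subscribes here."""
        self.listeners.append(listener)

    def act(self, dt: float) -> None:
        """Called every frame: the actor keeps working until stopped or paused."""
        if not self.active:
            return
        # Example of autonomous behaviour: a slow, continuous rotation.
        self.properties["rotation"] = (self.properties.get("rotation", 0.0) + 90.0 * dt) % 360.0
        for key, value in self.properties.items():
            for listener in self.listeners:
                listener(self.name, key, value)

spinner = Actor("wheel-1", {"x": 0.3, "y": 0.7, "size": 0.8})
spinner.connect(lambda name, key, value: print(f"{name}/{key} -> {value:.2f}"))
spinner.act(1 / 30)   # one animation frame at 30 fps
```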
4.3.2 Context

All actors perform their task in a context. They are graphically represented in a two- or three-dimensional space on the screen, and their location can typically influence their properties. The actors move, rotate or blink in this space and are therefore both spatially and temporally active units. The space can have qualities such as temperature, gravity, brightness, etc., which are all qualities that could affect an actor's behaviour, or it can contain other actors of a different type that influence the behaviour of the message-sending actors. Feedback from users of ixi software has shown us that people find the metaphor of an actor presented in time and space useful for representing musical actions and ideas. What the feedback also shows is that people intuitively understand the metaphor of having actors on a stage performing tasks that they – the directors of the piece – are controlling.

Figure 4: Connector. This software uses generative algorithms to decide where actors travel within a network of connectors. There are probability charts that decide the next move of an actor, and when it enters a connector it triggers a MIDI note and/or a sound sample that is a property of the connector.

4.3.3 Network

When talking about the context and the environment of these actors, we must note that the interface elements are not the only actors in the context of an ixi instrument: the user is one actor, as are the control hardware (a mouse, keyboard, sensor or controller), the soundcard, the speakers and other communication channels such as virtual audio cables, MIDI or OSC messages. The whole context of musical action and reaction is the space of the actor, a space in which the heterogeneous network of musical performance takes place. The meaning of the actor is its functionality within the control context and the mapping context. The actor has as many dimensions as it has control parameters and connections for receiving or sending messages.
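Continuing the sketch above, a context might be modelled as the space that holds the actors and applies its qualities to them on every frame; the names are again hypothetical, and the gravity example only illustrates how a space quality could affect an actor's behaviour.

```python
# Minimal sketch of a "context": the space in which actors live, whose
# qualities (e.g. gravity) affect their behaviour on each frame. Builds on
# the Actor sketch above; hypothetical names, not ixi's implementation.
from typing import Dict, List

class Context:
    def __init__(self, qualities: Dict[str, float]):
        self.qualities = qualities        # e.g. {"gravity": 0.1, "brightness": 0.8}
        self.actors: List["Actor"] = []   # the Actor class from the previous sketch

    def add(self, actor: "Actor") -> None:
        self.actors.append(actor)

    def tick(self, dt: float) -> None:
        """Advance the whole scene: the space acts on the actors, then they act."""
        for actor in self.actors:
            # A space quality influencing an actor property: gravity pulls y down,
            # and y might in turn be mapped to pitch or panning in the sound engine.
            falling = self.qualities.get("gravity", 0.0) * dt
            actor.properties["y"] = max(0.0, actor.properties.get("y", 0.0) - falling)
            actor.act(dt)

stage = Context({"gravity": 0.1})
stage.add(spinner)      # the actor created in the previous sketch
stage.tick(1 / 30)
```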


Figure 5: ParaSpace. This application interfaces with audio effects written in SuperCollider (but can talk to any software that supports OSC). Each audio effect has a variable number of parameters, and they are represented as small boxes in the control interface of ParaSpace. The point here is that the parameters interact on the interface level with automation, artificial life and artificial intelligence.

To clarify this idea of actors being all the elements that affect the interaction in an instrument, let us have a look at the software Connector. Here actors move in a system of connectors (a plumbing-like system) and trigger sound samples or MIDI notes that are properties of the connectors. The connectors are actors themselves, as they are the receivers of an action and contain the information that yields the sound. It is through the interplay of all the actors and their properties that interaction takes place – interaction between elements within the instrument and also with the musician using the instrument – and this interaction is simply the automation that controls the various parts of the music set into motion. In StockSynth (Figure 1) the microphone is one such actor (with its properties of trajectory and scope) that interacts with the sound objects that contain the information about the sound and its properties.
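The generative routing in Connector is essentially a weighted random walk over a small graph; the following sketch illustrates that idea with an invented network, probability chart and note assignment, and is not Connector's actual algorithm.

```python
# Minimal sketch of Connector-like routing (see Figure 4): an actor walks
# through a network of connectors, choosing its next step from a probability
# chart; entering a connector triggers the note that is its property.
# The network, weights and note numbers are invented for illustration.
import random

network = {                     # connector -> {neighbour: probability}
    "A": {"B": 0.7, "C": 0.3},
    "B": {"A": 0.2, "C": 0.8},
    "C": {"A": 0.5, "B": 0.5},
}
notes = {"A": 60, "B": 64, "C": 67}   # MIDI note attached to each connector

def step(position: str) -> str:
    """Pick the next connector according to the current connector's probability chart."""
    neighbours, weights = zip(*network[position].items())
    return random.choices(neighbours, weights=weights, k=1)[0]

position = "A"
for _ in range(8):
    position = step(position)
    print(f"enter connector {position}: trigger MIDI note {notes[position]}")
```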
4.3.4 Semiotic Elements and Mapping

The actors and the contexts in which they function are all elements in a semiotic language. This language has dialects, or rather idiolects (each application is unique), where the meaning of an element can change, as in Wittgenstein's concept of use as the word's meaning [15][22] or as in the Saussurean conception of the lack of a natural connection between the signifier and the signified.11 [19] We provide a semiotics, or suggest language games, where the behaviour of an actor maps onto some parameters in a sound engine. For example, the vertical location of an actor could signify the pitch of a tone or the playback rate of a sample. Size could mean amplitude, rotation could mean triggering, and direction could mean a tendency for some action. But it could also signify something entirely different, as the controllers are open and it is up to the musician to map the actor's behaviour onto a parameter in the sound engine.

11 For Saussure, the meaning of signs is relational rather than referential, i.e. the meaning lies in their systematic relation to other signs in the semiotic system and not in direct reference to material things, for example.
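One way to picture this openness is as a user-editable table from actor properties to sound-engine parameters, sent over OSC as in the earlier sketch; the property names, parameter addresses and scaling functions below are invented for illustration and are not ixi's mapping scheme.

```python
# Minimal sketch of open, user-definable mapping: which actor property
# drives which sound-engine parameter is just data, so the musician can
# rewire it. Property/parameter names and scalings are invented examples.
from typing import Callable, Dict, Tuple

# actor property -> (sound-engine parameter, scaling function)
mapping: Dict[str, Tuple[str, Callable[[float], float]]] = {
    "y":        ("pitch",     lambda v: 48 + v * 24),        # vertical position -> pitch
    "size":     ("amplitude", lambda v: v),                  # size -> amplitude
    "rotation": ("trigger",   lambda v: 1.0 if v < 10 else 0.0),
}

def on_property_change(prop: str, value: float, send) -> None:
    """Translate an interface event into a sound-engine message, if mapped."""
    if prop in mapping:
        parameter, scale = mapping[prop]
        send(f"/engine/{parameter}", scale(value))

on_property_change("y", 0.5, send=lambda addr, v: print(addr, v))
# The musician could remap "y" to playback rate instead:
mapping["y"] = ("rate", lambda v: 0.5 + v * 1.5)
```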
5. CONCLUSION

This paper has tried to show how the materials we work with when we design instruments (digital or acoustic) are the foundation for what can be expressed with the instrument. Whereas the expressive possibilities of an acoustic instrument are highly dependent upon the physical material it is built out of (wood, iron, strings, etc.), the situation is very different when we create digital instruments, especially screen-based ones. We have shown some examples of the semiotic system we are working towards in our work with ixi software and suggested a terminology of actors, context and network to better understand and modularise the interaction and interface design of virtual instruments. We have also illustrated how an interface can have its own meaning system independent of its relationship to the sound engine, where the interactive patterns of an instrument can be mapped in many different ways onto the parameters of the sound engine.

6. FUTURE WORK

Future plans involve exploring the dimensional spaces of the screen-based actors as the interface for musical interaction. The computer is becoming quite good at imitating the properties of acoustic instruments, but it excels as an interesting instrument in its own right when interaction is designed from the premise of the qualities of the computer and not by imitation of real-world objects.

Our work involves experimenting with creating semiotic systems that can be taken further and extended into new dialects and systems of meaning. This system is not exclusive to one type of application, but can rather be seen as a semiotic toolbox from which elements can be taken and reused in new contexts. Computer music software is a highly interesting area in the field of HCI, as it is used in live performance and should contain depth that can be explored and practised, thus allowing for musical virtuosity. In semiotic interfaces such as the ixi software there is always the filament of concurrent mappings or parallel streams of musical events happening at any one time. The temporal aspect of computer music software also makes it quite unique in relation to other types of software. Facing these incredible demands and challenges of music software, we feel that we are just starting our journey into the possibilities of new meaning systems, metaphors, pattern generators and control of synthesis techniques through the creation of semiotic machines in the form of interfaces.

7. ACKNOWLEDGEMENTS

The ixi software project (www.ixi-software.net) is a collaboration between Enrike Hurtado Mendieta, Thor Magnusson and the various musicians and artists that are involved with our work. I would like to thank Chris Thornton of the University of Sussex, Marcelo Wanderley and the people at the IDMIL Lab at McGill University, Montreal, and Julian Rohrhuber at the University of Cologne for good discussions and important feedback on this paper.

8. REFERENCES

[1] Andersen, Peter B. & May, Michael. "Instrument Semiotics" in Information, Organisation and Technology: Studies in Organisational Semiotics (eds. Liu, Kecheng; Clarke, Rodney J.; Andersen, Peter B.; Stamper, Ronald K.). Boston/Dordrecht/London: Kluwer, 2001: 271-298.
[2] Andersen, Peter B. "What Semiotics Can and Cannot Do for HCI" in Knowledge-Based Systems. Elsevier, 2001.
[3] Andersen, Peter B. "Computer Semiotics" in Scandinavian Journal of Information Systems, Vol. 4: 3-30. Copenhagen, 1992.


[4] Apple Computer Inc. Human Interface Guidelines: The Apple Desktop Interface. Boston: Addison-Wesley, 1987.
[5] Barthes, Roland. Mythologies. London: Paladin, 1972.
[6] Beaudouin-Lafon, Michel. "Designing Interaction, not Interfaces" in AVI Proceedings, ACM: Gallipoli, 2004.
[7] Beaudouin-Lafon, Michel. "Instrumental Interaction: An Interaction Model for Designing Post-WIMP User Interfaces" in Proceedings of ACM Human Factors in Computing Systems (CHI 2000). The Hague: ACM Press, 2000.
[8] Beaudouin-Lafon, Michel. "Reification, Polymorphism and Reuse: Three Principles for Designing Visual Interfaces" in AVI Proceedings, ACM: Palermo, 2000.
[9] Eco, Umberto. A Theory of Semiotics. London: Macmillan, 1976.
[10] Evens, Aden. Sound Ideas: Music, Machines, and Experience. Minneapolis: University of Minnesota Press, 2005.
[11] Gohlke, Gerrit (ed.). Software Art – A Reportage about Source Code. Berlin: The Media Arts Lab, 2003.
[12] McCartney, James. "A Few Quick Notes on Opportunities and Pitfalls of the Application of Computers in Art and Music" in Ars Electronica, 2003.
[13] McCartney, James. "Rethinking the Computer Music Language: SuperCollider" in Computer Music Journal, 26:4, pp. 61-68, Winter 2002.
[14] Magnusson, Thor. "ixi software: The Interface as Instrument" in Proceedings of NIME 2005. Vancouver: University of British Columbia, 2005.
[15] Magnusson, Thor. "ixi software: Open Controllers for Open Source Software" in Proceedings of ICMC 2005. Barcelona: University of Pompeu Fabra, 2005.
[16] Magnusson, Thor. "Processor Art: Currents in the Process Orientated Works of Generative and Software Art". Copenhagen, 2002. www.ixi-software.net/thor
[17] Nadin, Mihai. "Interface Design: A Semiotic Paradigm" in Semiotica 69-3/4. Amsterdam: Mouton de Gruyter, 1988.
[18] Puckette, Miller. "Pure Data" in Proceedings of the International Computer Music Conference. San Francisco: International Computer Music Association, pp. 269-272, 1996.
[19] Saussure, Ferdinand de. Course in General Linguistics. London: Duckworth, 1983. p. 121.
[20] Shneiderman, Ben. "Direct Manipulation: A Step Beyond Programming Languages" in IEEE Computer 16(8) (August 1983), 57-69.
[21] Wittgenstein, Ludwig. The Blue and Brown Books. Oxford: Blackwell, 1993. p. 6.
[22] Wittgenstein, Ludwig. Philosophical Investigations. Oxford: Blackwell, 1994.
[23] Wright, M.; Freed, A.; Lee, A.; Madden, A. and Momeni, A. "Open Sound Control: State of the Art 2003" in Proceedings of NIME 2003. Montreal, Canada, 2003.
