LIVE PATCH BUILDING AND PROGRAMMING
There is a continuum of changes of state possible in computer music performance that stretches from simple play and stop buttons to the building of a low-level DSP engine from scratch.
On the level of least interest to this paper is the manipulation of graphical interfaces built for you by a third party.
In graphical programming languages, where one has access to an infinite grammar of possible constructions, the capacity for live rewiring and construction is more readily apparent.
Whether in the process of creation and experimentation, or to correct an error spotted at the last minute, or to temporarily bypass some processing stage, Reaktor, Pd or MAX/MSP users have edited the structure of the signal graph as their patches run.
Moving beyond graphical programming languages to the command-line antics of interpreted text-based programming languages, the abstract potential of the system, along with its difficulty of use, continues to grow.
It is the domain of text-based, scripted, and command-line control of audio that fits most with the investigations in this article.
Whilst it is perfectly possible to use a cumbersome C compiler, the preferred option for live coding is that of interpreted scripting languages, giving an immediate code-and-run aesthetic.
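The code-and-run aesthetic rests on a property that interpreted languages share: names are resolved at call time, so a running process picks up redefinitions immediately. A minimal sketch in Python (a hypothetical toy, not any of the systems discussed here, with `voice` and `render` as invented names) shows a "patch" being rewritten while the rendering loop that calls it stays untouched:

```python
import math

SR = 44100  # assumed sample rate for this toy example

def voice(t):
    # original patch: a 440 Hz sine wave
    return math.sin(2 * math.pi * 440 * t / SR)

def render(n):
    # stands in for an audio callback; looks up `voice` afresh each sample
    return [voice(t) for t in range(n)]

before = render(4)  # sine samples

# live edit: rebind `voice` mid-performance, without stopping `render`
def voice(t):
    # new patch: a square wave at the same frequency
    return 1.0 if math.sin(2 * math.pi * 440 * t / SR) >= 0 else -1.0

after = render(4)  # square-wave samples from the very next call
```

A compiled C program would need a stop, recompile, and relaunch cycle to make the same change; here the rebinding takes effect on the next call, which is the immediacy the live coder exploits.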
We do not formally set out in this article the choice between scripting languages like Perl, Ruby, or SuperCollider, believing that decision to be a matter for the individual composer/programmer. Yet there is undoubtedly a sense in which the language can influence one’s frame of mind, though we do not attempt anything so ambitious as to track the influence of a language’s representational mindset on artistic expression. No program can be free of a priori strictures and assumptions, and this has been discussed in detail in the literature (Flores and Winograd 1995). There is a cult of coding itself, evidenced even in popular culture by movies about hackers and virtual reality or designers’ appropriation of computer iconography for record sleeves or T-shirts, and one may even speak of an aesthetics of generative code (Cox, McLean and Ward 2001).
We are not advocating a situation in which the programmer/composer rewrites man-years’ worth of support libraries.
It is hard to imagine beginning entirely from scratch to write a driver or DSP engine unless you’re working in the background in a venue over a number of nights, before finally emerging with a perfect heartfelt bleep on Sunday evening.
Some custom coders will want to write their own libraries for standard DSP functions well before they get on with the compositional algorithms at a gig.
Many will work with an established language for computer music, which comes ready specialized to the audio domain.
There have been many computer music languages over the years (Loy 1989; Roads 1996: 781–818; Lyons 2002) and we shall not enter into a discussion of their relative merits for live coding, most honestly because many of them are strictly non-real-time, and others remain virgin territory for live coding experiments, at least to the authors’ knowledge. Our two test cases, however, will contrast the use of general programming languages with work in the audio-specific programming language SuperCollider.
THE PRACTICE OF LIVE CODING
There are readers who are no doubt wondering why they would ever wish to attempt live coding in