George Lewis: Voyaging Towards Forager
3 July 2023
Professor George Lewis (RNCM PRiSM Artist in Residence, Chair of PRiSM International Advisory Board & Edwin H. Case Professor of American Music, Composition & Historical Musicology, Columbia University) is known internationally as one of the pioneers in the field of computer-driven interactive composition and performance practice.
A long-time researcher, performer-composer and programmer, Lewis first developed the Voyager system between 1985 and 1987 at the Studio for Elektro-Instrumentale Muziek (STEIM) in Amsterdam, building on the earlier Rainbow Family system he developed and premiered at IRCAM (Institute for Research and Coordination in Acoustics/Music) in Paris between 1982 and 1984.
During his RNCM PRiSM Residency 2021-22, Lewis worked closely with the whole PRiSM team, RNCM student performers, and New York-based technologist Damon Holzborn on extending Voyager’s listening capabilities. The work resulted in newly developed musical gesture recognition software (to be used either as a standalone tool or as an add-on to the existing Voyager framework), as well as a new composition entitled Forager (2022) for quintet and interactive pianist, which premiered at the RNCM in November 2022.
The project was recently featured in a New York Times article written by composer & music scholar Garrett Schumann. Revolving around Lewis’ collaboration with PRiSM, the work of PRiSM Associate Dr Robert Laidlow (Jesus College, University of Oxford), as well as many other creative research projects concerning classical music & artificial intelligence world-wide, the article provides insightful debate on the ethical and independent use of generative machine-learning tools in musical practices.
Read the New York Times article here: What Happens When A.I. Enters the Concert Hall
In the short film featured in this PRiSM Blog, Lewis gives an in-depth introduction to the background of Voyager and what led him to collaborate with PRiSM to ‘voyage’ even further. The Blog also features the recording of Forager (2022), taken from the world premiere event at the RNCM in Manchester on 25 November 2022.
Voyaging Towards Forager
By George Lewis
Produced by Zakiya Leeming & Transcribed by Bofan Ma
I am a long-time researcher, performer-composer, and programmer in certain kinds of AI-related enterprises, among other things. My interest in real-time experience has always been at the heart of my interest in machine intelligence: AI and machine learning. That has been true ever since I first decided that I could think about trying to make machines that could ‘improvise’.
I remember being at a party where a guy came up and said: “Well…um, do you think, George, you could get a computer to improvise?” At that time, I had not even programmed a computer, but I started telling him how easy it was. And all he said was, “Well, you should try it”, and then he went off. Someone then said to me: “You were just talking to Marvin Minsky! What were you guys talking about?”, to which I said: “…I was?!”
But then it turned out that I was part of a community, because – in fact – he (Minsky) was thinking about those questions himself. It was a great inspiration to be around somebody – over the years – who thought deeply about improvisation with artificial intelligence, and about its importance.
For me, this kind of work has larger societal implications. For example, we have the malign face recognition doodads that think all black people are criminals; or those self-driving cars that can’t recognise certain things. These are real-time improvisation problems. And I think that maybe the people who build these things could benefit from talking to artists – talking to improvisers about what they think.
But any kind of driving is basically an improvisation problem. And thus improvisation has been, for me, at the centre of being able to think about machine learning in all aspects. And that is why a place like PRiSM – open to that kind of thinking – becomes a place where we can really think on a high level about it.
Since the late 1980s, I have been working on a project called Voyager, following on from an even earlier piece called Rainbow Family. The idea was to have a kind of virtual orchestra – or virtual performer – that ‘improvises’. That said, when generating music, the machine does not really know if it is improvising or composing, and that is really a political and ideological decision. But if I make it an ‘improviser’, it means that I am obliged to have a response to real-time conditions, rather than it being something that does not respond.
Voyaging Towards Forager
I have been working with PRiSM for about a year and a half. Through this project I wanted to extend the capabilities of Voyager. Voyager can already do a lot. It can play with bands and orchestras and give a good account of itself. While it can hold its own already, there are still certain things which it doesn’t hear well. And so what I’m doing at PRiSM is to make a kind of enhanced real-time recogniser, similar to the recognition technology found in face recognition systems or self-driving cars.
The purpose of this is to be able to hear certain musical gestures – things that are harder to hear with Voyager’s current recognisers, and which the extended system is designed to remedy. We have set up these gestures in advance and trained the extended system on them. In performance, as a result, when Voyager hears these gestures, it can make real-time responses.
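The train-then-recognise pipeline Lewis describes – predefining a set of gestures, training a recogniser on them, then classifying what it hears in real time – could be sketched, very roughly, as follows. This is a toy illustration only: the gesture names, interval features and matching method are all hypothetical, and none of this is the actual Voyager/Forager code.

```python
# A toy sketch of the pipeline described above: gestures are set up in
# advance, and incoming notes are matched against them in real time.
# All gestures and features here are hypothetical illustrations.

# 'Trained' gestures, represented as sequences of semitone intervals.
GESTURES = {
    "ascending_run": [2, 2, 1, 2, 2],
    "descending_leap": [-7, -5],
    "trill": [1, -1, 1, -1, 1],
}

def features(notes):
    """Convert a sequence of MIDI pitches into interval features."""
    return [b - a for a, b in zip(notes, notes[1:])]

def distance(a, b):
    """Compare two interval sequences, penalising length mismatch."""
    n = min(len(a), len(b))
    if n == 0:
        return float("inf")
    return sum(abs(x - y) for x, y in zip(a, b)) / n + abs(len(a) - len(b))

def recognise(notes, threshold=2.0):
    """Return the best-matching trained gesture, or None if nothing is close."""
    feats = features(notes)
    best, best_d = None, float("inf")
    for name, proto in GESTURES.items():
        d = distance(feats, proto)
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= threshold else None

# An ascending major-scale fragment matches the 'ascending_run' prototype.
print(recognise([60, 62, 64, 65, 67, 69]))  # → ascending_run
```

In a real system the recogniser would of course run on an audio or MIDI stream and use a trained model rather than hand-written prototypes, but the shape is the same: a fixed vocabulary set up in advance, and a real-time matcher that triggers a response when something in the performance comes close enough to one of its entries.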
The new work is called Forager because that is basically what the machine is doing. Now that Voyager has this interesting front end extension (that can recognise certain gestures in a way that its own recognisers cannot do), it is enabled to go through the real-time musical environment of people improvising – looking for gems; looking for gestures; looking for things to sort of hang its musical hat on; looking for things to which this larger Voyager piano system can respond. So it allows the system to be more integrated with what the other musicians are doing: they are also improvising and foraging. Improvising and foraging have a certain real-time connection.
In fact, I think foraging is a form of improvisation.
They – the five performers and Voyager – are all performing together. At the beginning of the piece, there is some written music for the instrumentalists to perform. This music embodies the gestures that the system is supposedly recognising.
Then, during the middle of the piece, people start using these gestures to shape their performances and improvisations. They don’t have to play them literally – and they don’t have to make things up out of nothing. They can use what I’m calling databases.
Everyone has a database page which represents all the gestures and verses that the system is trained to recognise. Of course the performers are not limited to these gestures – they can do whatever they like. When the system recognises one of these gestures – or when other performers recognise them – they can also respond to each other.
The above is, in a nutshell, how this piece works. We don’t know where it is going to go. Lots of things can happen. But we are hoping, as with a lot of these pieces that involve improvisation, that the introduction of a set of gestures that they (the players) can have in common allows for a sense of unity of conception to emerge among them.
About the Residency
When I first heard about PRiSM from Emily Howard, I thought this would be an ideal place to realise my new work and also other kinds of new works: in particular the work we are doing with Julia Hyland Bruno on Virtual Birds.
It is a sound project rather than a music project. Using machine learning techniques that my associate Damon Holzborn and I have already built – a virtual zebra finch that can generate samples in real time that sound like a real bird – we are now hoping to figure out whether the birds think that it’s a real bird. And for that we need some sort of machine recognising system. It won’t help us to have something that just records a corpus of bird sounds and then plays it back. What we need is something with which the birds can dialogue – because that’s what they do.
So I think that PRiSM is the ideal place to do something like that as well, because people here are thinking about music in the most expansive way. I have learned a great deal from working with Chris Melen, David De Roure, Bofan Ma, and of course Emily Howard as well. It seems to me that this is an ideal spot for musicians, artists, composers and technologists to come together. And its being part of the RNCM is most important because that is the framework in which the whole thing sits. You get a larger community of artists working together to do innovative and important things that relate to sound, music, and society.
I have a lot of plans for PRiSM. I’d really like to continue my association with PRiSM, because there is really nowhere else like it. I feel that the openness, the brain trust, the association with the RNCM, and the possibility of working with live musicians, as well as the technological facilities and the congeniality of the place – all end up being very important for me. I feel this is the kind of environment we need in the future to do things successfully.
And I hope that it happens.
This work is supported by PRiSM, the Centre for Practice & Research in Science & Music at the Royal Northern College of Music, funded by the Research England fund Expanding Excellence in England (E3).
Forager Project Team:
Professor George Lewis (composer & lead researcher)
Professor Emily Howard (director PRiSM & project supervisor)
Professor David De Roure (computer scientist & research software engineer)
Dr Christopher Melen (research software engineer)
Dr Damon Holzborn (research software engineer)
Dr Bofan Ma (research assistant & project manager)
Forager Copyright Info:
George Lewis (2022): Forager
© 2022 by C. F. Peters Corporation, New York
Licensed courtesy of Peters Edition Limited, London.
All rights reserved. For non-commercial use for PRiSM only.