In the autumn of 1978, Giuseppe Di Giugno, a nuclear physicist, abandoned his research on matter-antimatter reactions to start making analogue synthesisers in a basement. His dream was a studio-sized synth, its walls and ceiling covered with a thousand dials: a panoply of sound sources that he could tune to create a massive noise. Di Giugno joined the faculty of the newly opened Institut de Recherche et Coordination Acoustique/Musique (IRCAM) in Paris, where he soon realised the studio-synthesiser would be impossible with analogue synths or with computers (then too slow). So he invented the first digital synthesiser, which eventually became the 4X system.
The problem was, no one at IRCAM knew what to do with the 4X. "That's when I found my calling," says Tod Machover. He had just arrived at IRCAM, fresh out of the elite Juilliard School of Music in New York. "I said, 'Wow, this is an instrument. And not only can you turn this programmable machine into any instrument, you can design the studio you need to make a piece. You can make it like a sculptor's studio where the sound is like clay, and whatever the basic material is for a piece, you can design it for this set of experiments.'" A year later, Light, the first piece composed on a digital synth, premiered in Metz. The next year, Machover, then 27, became director of music research at IRCAM.
Since then, he has created robotic operas, sci-fi compositions, software that lets anyone compose music, a musical exercise that can diagnose illness and the technology behind Guitar Hero, as well as augmented instruments for Yo-Yo Ma and Prince. He was one of the founding faculty members of the Media Lab in 1985 and has been there ever since, heading the Opera of the Future group. Machover has been finding the edges of music and technology for more than 30 years. Now, the 58-year-old wants everyone else to do the same.
His current project, A Toronto Symphony, is a collaboration with the 2.7 million residents of the city to compose a new work for symphony orchestra. His idea is to create "a new musical ecology", with input from experts and amateurs for the benefit of both. Machover started the project by publishing "launch music" -- a series of chords that serve as a "genetic code" for the project -- then invited locals to download the musical score, play around with it and send back their own variations. The piece will premiere in March 2013, with Machover arranging the various effluvia into a final composition. "It's not crowdsourcing, it's mass collaboration," he says. "The problem with crowdsourcing is that you can't tell where a particular sound ends up. I'm trying to build a Media Lab-like group with 50,000 people. Everyone has a role, but I guide it. It's not random. And I don't think anything quite like this has been done before, so we're inventing it as we go along."
A Toronto Symphony is ambitious, but it's only a midpoint on a trajectory that Machover has been following since the 80s and which extends into the next half-century: "Excellence and democracy have been obsessions from the beginning," the composer says.
Machover was born in 1953 in Mount Vernon, New York. His mother was a piano teacher and his father worked in the emerging field of computer graphics, but had "a sense of all things pop culture".
Sgt Pepper's Lonely Hearts Club Band, released in 1967, was a big influence. "The Beatles saw the art in multitrack recording, in remixing sources from all over the place and coming up with something new. That's when I started to wire my cello. I started thinking, how can you have the best of both worlds -- the technology and the spontaneity?"
For his 1987 opera Valis, based on Philip K Dick's novel, he wanted to "have an orchestra that could also rehearse like a rock band". Machover ended up with two musicians, on keyboard and percussion. "And that's when we invented hyperinstruments -- we've got two performers: how do we tell what they're playing, where they are in the music, what they're improvising, what the expression is, what the gesture is, and how do we use all that to add these layers?" Hyperinstruments -- Machover's first project at MIT -- turned out to be "a way to perform Sgt Pepper live. You needed an environment that understood what your intention was. And for that you needed new interfaces and new analysis software." The hypercello had sensors across its body and along the bow. By measuring the pressure, speed and angle of the player's bowing, it could interpret the nuances and emotion of the performance to create new, digital sounds.
Machover created the first hyperinstrument for Yo-Yo Ma. "We actually had a little accident," he says. An antenna measured the electricity in the air and fed this data into the software. But when Ma's hand went near the antenna, his body absorbed the electricity. "We'd basically invented a digital theremin." Machover reversed the process and turned the body into the instrument, creating the Sensor Chair. Its seat was a piece of metal: "When your butt touches the metal, the electricity goes through your body... It turned out that the physical techniques and software could measure natural gestures that anyone could make." People without musical training could make sophisticated music, just by moving their body (two of Machover's graduate students spun off much of that research into Guitar Hero and Rock Band). "I realised you could make a bridge to people who love music but are very passive."
So, in 1996, Machover came up with The Brain Opera, for which he built an orchestra out of hyperinstruments for the public to play. "It was virtuosity for non-professionals," he says. "The next step before mass collaboration." Hundreds of instruments littered the hall of [New York's] Lincoln Center. People wandered around playing them, with others participating online, creating sounds that were edited together for a stage performance. His Hyperscore project took this a step further, creating software that allowed non-musicians to compose entire symphonies by abandoning traditional music notation (you can download it at hyperscore.wordpress.com).
The mass collaboration of A Toronto Symphony is the next step, but it's nowhere near the end. "There are all kinds of things we haven't even begun to imagine that will be valuable in music. There'll always be a place for the perfectly crafted song, the definitive performance. But a large part of music is going to be some kind of collaboration. Up until today, we have judged music by how a combination of sounds appeals to the greatest number of people.
"One of the most important new branches is going to be to personalise music so that there is maximum impact for you, your genetics, your physiology, your psychology. Depending on how you're feeling, it plays differently. It may play on its own, but I also think there's going to be a role in between the basic music materials and the listener -- somebody in the middle fine-tuning it."
Machover is now interested in measuring responses to music that occur within the brain and body (he's working with a tissue specialist at MIT, along with MIT's Buddhist chaplain, to study how vibration travels through the body). "We're working on vocal techniques that everybody can do to send vibrations to different parts of your body... We're designing an experience where you start out singing in a series of private rooms and end up in a room with ten people."
Machover doesn't know how it will all shake out. "This is a 50- to 100-year progression. Something big could happen. The real question is how to make music itself -- something grand and good -- and how people participate in that, without making it cheesy."
This article was originally published by WIRED UK