Thursday, April 10, 2008

Performance and the Alphabet of the Brain

I recently posted on newmusicbox about music's slow, ideological shift away from virtuosic performance. I believe we're in the middle of a long transition away from affirmed virtuosity; part of this shift stems from a parallel shift in compositional ideals: the 'layman-as-artist,' repetition, computer software, etc. Another angle from which to examine the future of virtuosity is human-machine interaction. Perhaps this interaction will result in 'the return of the virtuosic.'

Duke University professor Miguel Nicolelis placed electrodes in a monkey's brain and represented the electrical signals produced by its thoughts as computer code - the 'alphabet of the brain.' He then programmed a computer to use this code to drive a robotic arm that moved just like the monkey's real arm, mapping the monkey's real-time thoughts onto the arm's motion - a live organism controlling a machine.

Now here's the craziest part: the monkey realized that it didn't have to move its own arm in order to move the robotic arm. It continued to move the mechanical arm with its brain while keeping its own arm stationary. The monkey had essentially become a computer, motionlessly controlling a machine's movements by means of self-produced code. [The monkey had actually been trained to use a joystick to play a video game, so at this stage, the monkey was playing a video game with its mind only.]
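The core move in this setup - turning a vector of neural firing rates into a machine command - can be sketched as a simple linear decoder. The numbers below are entirely invented (neuron counts, firing rates, and weights are all made up for illustration), and this is not Nicolelis's actual pipeline, though early brain-machine interface experiments did use linear models of this general kind:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 50 neurons' firing rates over 500 time steps, and the
# arm's 2-D position at each step. Everything here is invented.
n_neurons, n_steps = 50, 500
true_weights = rng.normal(size=(n_neurons, 2))
rates = rng.poisson(5.0, size=(n_steps, n_neurons)).astype(float)
positions = rates @ true_weights + rng.normal(scale=0.5, size=(n_steps, 2))

# "Training": a least-squares fit from firing rates to arm position,
# learned while the monkey moves its own arm.
weights, *_ = np.linalg.lstsq(rates, positions, rcond=None)

# "Playback": decode fresh brain activity into a robotic-arm command,
# with no physical movement required.
new_rates = rng.poisson(5.0, size=(1, n_neurons)).astype(float)
decoded_xy = new_rates @ weights
print(decoded_xy.shape)  # prints (1, 2): one x, y command for the arm
```

The point is just the shape of the computation: brain activity in, motor command out, with the mapping learned from examples of real movement - which is why the monkey could eventually skip the movement and keep only the thought.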

Scientists have also approached the organism-machine connection the other way around. One example is the 'roborat,' a rat similarly wired but controlled remotely by a scientist.

What would improved human-machine interfacing mean for music? It would certainly force a reevaluation of virtuosity. If we could power machines to play instruments with our brains in ways our physical bodies couldn't, why not? If machines could power our bodies to play what our own minds could not initiate, why not? Ensemble Robot at MIT has been programming robots to play acoustic instruments for a few years now, which is really cool, but as far as I know, they haven't done anything with human brain impulses. Of course, acoustic music produced by machines goes back at least to player pianos/pianolas. But maybe future advances in electro-brain technology will start a trend back toward the virtuosic ideal.

Check out this article on how a brain in a petri dish controlled a flight simulator.


Matthew said...

This is an interesting post. One thing that occurs to me is the potential limits of human-machine interaction. Let's return to the monkey/mechanical arm example: the monkey can control the mechanized arm because it already knows how to move its own arm--thus it uses its brain to "think" a knowable action, a familiar move. I could see a machine helping me to create musical materials I already know how to play, e.g., I could think through the moves necessary to play halfway decent bop lines on the trumpet. Could I think the musical gestures of a trumpet virtuoso...I'm not sure, because I've never actually been able to play those gestures on my own. It seems that this robotic/computerized/bionic technology maps "knowable" actions, initiated as thoughts, onto external machines. But could it convert impulses for which the thinker does not have a fully worked-out "code" (as you aptly put it) into a code which could then be realized as action by the robotic apparatus?

Just wondering...

Mr. Bacon said...

That's an interesting point. I guess I didn't think about it because in my experience, especially with improv, I usually find myself 'thinking' of licks my fingers can't actually execute because I just don't shed enough. I might know I want to play four downward thirds in series, each starting a major second above the one before, but sometimes my fingers just can't pull it off - I don't know if I'd be able to provide a machine with enough code to do it, though. On the other side, a computer could probably feed electrodes attached to my brain with all kinds of virtuosic licks and I'd play them, as long as my brain could read the code accurately (like the roborat) and my fingers/breath could technically keep pace.