Found poem. Reported in All Things Considered on December 22, 2009.
What
was billed
as the first intercontinental musical
interaction
between humans
and robots took place the
weekend of Dec. 17.
It involved humans in Japan
using an application called ZoozBeat on their
iPhones
and
a robot named Shimon in Atlanta.
According
to its makers,
unlike other robots that can play music,
Shimon
is perceptual.
The
robot can listen
to
what is played, analyze it
and then improvise.
And
it has been taught to improvise like some jazz masters.
Gil
Weinberg of Georgia Tech's music technology
program recently spoke to NPR's Robert Siegel
from Japan,
where he witnessed the historic interaction.
Weinberg says the result is music meant to
inspire people — not an effort
to turn our music-making over to robots.
"The
whole idea is to use computer algorithms
to create music in ways that humans will
never create," Weinberg says. "Our
motto is, 'Listen like a human,
but improvise like a machine.' "
Weinberg
programmed Shimon
to play like Thelonious Monk.
He says that, though he and his team
were
trying to teach the robot to play like a machine,
they first had to teach it how a human plays.
To do that, they used statistics
and analysis of Monk's improvisation.
Once
they had a statistical model
of
the pianist, they could program the robot
to improvise in that model.
Weinberg
says the robot
won't play everything exactly like the bebop
pianist —
or
any other jazz master —
would,
though he says,
"It
probably will keep the nature
and the character of [the musician's]
style."
"It's
difficult to predict exactly
what they would do in every single moment in
time,"
he says.
"But our algorithm pretty much
looks at the past several notes that it plays
and, based on that, it sees what is the
probability
of the next note to be, based on all of this
analysis
of a large corpus of transcribed
improvisation."
Some
musicians are harder to program than others.
Weinberg
says Ornette Coleman
would
require a much larger body
of transcribed work than Monk did.
"In
a sense, it kind of reduces music
to numbers and statistics," Weinberg
says.
Given enough tweaking to the algorithms
that the program uses, he says
he thinks they'll be able to create
something "very similar to the jazz
master."
But
Weinberg says
he doesn't think the robot
should
try to play just like a human.
"In
all the emotional
and expressive energy,
I don't think a robot can capture [it],"
Weinberg
says.
Maybe someday a computer program could,
but at least right now, Weinberg says,
"I don't think we have the math for that.
We
have some math to get the notes
and
the rhythm and the scales.
Whether
this can capture
the
genius of Thelonious Monk,
I
hope not.
But maybe."