Monkey think, robot do

Last year, brain signals from an owl monkey in a North Carolina laboratory were transmitted over the Internet to Massachusetts, where the impulses were used to manipulate a robot arm.

That long reach by one small primate, the result of work done by a collaboration of neurobiologists and computer scientists, may signal dramatic possibilities for the human/machine interface.

One primary goal of Miguel Nicolelis, an associate professor of neurobiology at Duke University; Mandayam Srinivasan, director of the Laboratory for Human and Machine Haptics at MIT; John Chapin, a researcher at the State University of New York Health Science Center at Brooklyn; and their teams is to develop technology that will enable people to control prosthetic devices with their minds, aided by some computational device. Their work may also have wider application, giving embedded processing an entirely new meaning.

“It was an amazing sight to see the robot in my lab move, knowing that it was being driven by signals from a monkey brain at Duke,” said Srinivasan. “It was as if the monkey had a 600-mile-long virtual arm.”

In the experiment, electrical signals were picked up by arrays of as many as 96 tiny electrodes implanted in the cerebral cortices of two owl monkeys. The signals were processed to find signatures representing the initiation of movement, and as those signatures were detected in real time, 3D motion commands were sent to a robotic arm.

When the instructions were sent over the Internet from Duke to MIT, the robotic arm faithfully reproduced the intended motions, despite the possibility of packet delay.
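One simple way a link like this can stay responsive under variable network delay is to timestamp each command and discard any that arrive too stale to be useful. This is an illustrative policy only; the article does not describe how the experiment actually handled delay, and the tolerance value below is assumed:

```python
import time

# Assumed tolerance: a command older than this is no longer worth
# executing for a reaching movement. Purely illustrative.
MAX_AGE = 0.3   # seconds

def fresh_commands(packets, now=None):
    """Keep only the (sent_time, command) pairs young enough to execute."""
    now = time.time() if now is None else now
    return [cmd for sent, cmd in packets if now - sent <= MAX_AGE]

# Example: two timely packets and one delayed past the tolerance.
t0 = time.time()
packets = [(t0 - 0.05, "move +x"), (t0 - 0.10, "move +y"), (t0 - 0.50, "move -z")]
print(fresh_commands(packets, now=t0))   # the 0.5-second-old packet is dropped
```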

Trained Monkeys

For two years, the owl monkeys had been trained to perform simple reaching tasks so that their brain activity during specific motions could be studied and a variety of predictive algorithms examined. A simple linear algorithm was found to work as well as more complicated ones. By using a weighted average of the number of impulses measured by each electrode over the previous second, Nicolelis’ team was able to find characteristic signatures for the initiation of motion.

Continually updating their averages and the weights used to produce them, the team was able to correctly predict the speed and direction of the monkeys’ arms.
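The linear read-out described above can be sketched in a few lines. Everything here is an illustrative stand-in for the team's actual method, not their code: the spike counts are synthetic, the bin sizes are assumed (ten 100 ms bins spanning the previous second), and the weights are fit by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the recordings: binned spike counts from 96
# electrodes over a one-second sliding window, paired with the 3D arm
# velocity at the end of each window. None of this is the Duke data.
n_samples, n_electrodes, n_bins = 2000, 96, 10
true_w = rng.normal(size=(n_electrodes * n_bins, 3))            # hidden mapping
X = rng.poisson(lam=2.0, size=(n_samples, n_electrodes * n_bins))
y = X @ true_w + rng.normal(scale=0.5, size=(n_samples, 3))     # noisy 3D velocity

# The "simple linear algorithm": one weight per electrode-bin, fit by
# least squares, then velocity predicted as a weighted sum of counts.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ w

# How well the linear read-out recovers the simulated trajectories.
r = np.corrcoef(pred.ravel(), y.ravel())[0, 1]
print(f"correlation between predicted and actual velocity: {r:.2f}")
```

In a running system the weights would be refit continually as new data arrived, which is what the article means by updating the averages and weights over time.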

Nicolelis and his team were surprised by their results. “We found two amazing things,” Nicolelis said. “One is that the brain signals denoting hand trajectory show up simultaneously in all the cortical areas we measured.”

That means that not only the motor cortex, which is thought to be mostly responsible for motion, but also other regions of the brain are involved in the initiation of movement.

“The second remarkable thing,” Nicolelis said, “is that the functional unit in such processing does not seem to be a single neuron. Even the best single-neuron predictor in our samples still could not perform as well as an analysis of a population of neurons.”

Another significant result was found in a prior study with rats, whose brains were also implanted with hollow electrodes, no larger in diameter than a human hair.

“We found that rats were able to use their neuronal population activity to control a robot arm, which they used to bring water to their mouths,” Chapin said. “At the beginning of the experiments, the animals had to press down a lever to generate the brain activity needed to move the robot arm. After continued training, however, their lever movements diminished while their brain activity remained the same.”

In other words, the rats learned that merely thinking about pressing the lever was enough to induce the machine to deliver water to them.

In the same way that sound waves can be characterized so that an automatic attendant can understand a bit of human conversation, it may be possible for a robot with a correctly programmed microprocessor to understand which movements it should initiate.

“One most provocative and controversial question,” Nicolelis said, “is whether the brain can actually incorporate a machine as part of its representation of the body. I truly believe that it is possible. The brain is continuously learning and adapting, and previous studies have shown that the body representation in the brain is dynamic. So if you created a closed feedback loop in which the brain controls a device and the device provides feedback to the brain, I would predict that as people or animals learn to use the device their brains will basically dedicate neuronal space to represent that device.”

Nicolelis intends to begin examining that question by providing feedback to experimental animals both visually and as pressure on their skin.

He has also proposed creating two types of dedicated neurochips. One, called the data acquisition neurochip, would be used to acquire, filter and digitize data from as many as 1,000 cortical motor neurons. The second neurochip in the chain would integrate signals from all the neurons and pass the information to a microprocessor, which would translate the information into robot trajectories.
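The proposed chain could be mocked up in software roughly as follows. The stage boundaries (acquire and digitize, integrate the population, translate to a trajectory) follow the article's description, but every function name and number below is a hypothetical stand-in, not a real neurochip interface:

```python
import numpy as np

rng = np.random.default_rng(1)

def acquire_and_digitize(n_neurons=1000):
    """First chip: sample and digitize cortical signals from up to
    1,000 motor neurons. Random voltages stand in for real recordings."""
    analog = rng.normal(size=n_neurons)                  # simulated electrode voltages
    return np.clip((analog * 32).astype(int), -128, 127)  # crude 8-bit samples

def integrate(samples, weights):
    """Second chip: combine all neurons into one compact population
    signal, here a weighted sum per spatial axis."""
    return weights @ samples

def to_trajectory(signal, gain=0.01):
    """Microprocessor: map the population signal to a 3D robot step.
    The gain is an arbitrary illustrative scale factor."""
    return gain * signal

weights = rng.normal(size=(3, 1000)) / 1000   # illustrative decoding weights
samples = acquire_and_digitize()
step = to_trajectory(integrate(samples, weights))
print("robot arm step (x, y, z):", step)
```

The point of the sketch is the division of labor: the first stage touches raw signals, the second reduces a thousand channels to a handful of numbers, and only the final stage knows anything about the robot.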

Through the mediation of a computer, or some kind of computational chip, it looks as if the race is on for people to control machines just by thinking about them. Telekinesis, that staple of science fiction, may be just around the corner in the wireless world.