What's going on in Taejon, at the Korea Research Institute, is a very basic example of what could be the most interactive technology of the future: brain-computer interfaces. Early computers were controlled by cardboard punch cards; the first PCs demanded typed DOS commands; the mouse gave us a graphical interface. Will we one day be able to enter the world of computing with no external mechanical intermediary whatsoever, in other words, just by thinking? Researchers around the globe are working on the problem. The Joint Research Centre of the European Commission, for example, has developed the Adaptive Brain Interface, a helmet and software program (like the one in Korea) intended to allow disabled people to operate appliances using thought commands. At the British government's Defence Evaluation and Research Agency, in Farnborough, the same techniques are helping fighter pilots fly jets with their minds. But the place where brains and computers are truly coming together is the lab of Miguel Nicolelis, associate professor of neurobiology at the Duke University Medical Center in North Carolina. He has trained two owl monkeys to control a robotic arm via brain signals, giving glimpses of how the virtual and physical worlds may merge.
Working with colleagues at Duke, M.I.T.'s Laboratory for Human and Machine Haptics (also known as the Touch Lab) and the State University of New York Health Science Center, Nicolelis implanted electrodes into the sections of the monkeys' brains in which the planning and execution of arm movements take place. When the brain instructs the body to make a motion, it fires off electric signals well before any action actually takes place; in other words, the body lags slightly behind the brain's intention to act. In effect, the brain warms up for an impending movement by directing specific clusters of neurons to fire, just as you might warm up your car's engine by pumping the gas pedal.
Nicolelis and his colleagues monitored the monkeys' brain signals as they warmed up for various tasks, like reaching for food, and isolated the signals that preceded the movements. Then they routed the monkeys' brain signals through a computer. As a monkey started to grasp for food, the computer picked up the neural traffic and forwarded it to a robotic arm called the Phantom. When the monkey extended its arm, the Phantom, using the neural signals from the monkey, precisely mimicked the action. Nicolelis even transmitted the brain signals over the Internet to the Touch Lab in Cambridge, Massachusetts, so the monkey's neural commands operated another Phantom 965 km away.
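The decoding step at the heart of this experiment — turning the firing of many neurons into a predicted arm position — can be approximated by a simple linear model fit to recorded data. The sketch below is purely illustrative and uses simulated numbers; it is not Nicolelis's actual code, and all names and values here are hypothetical assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: firing rates of 100 neurons across 500 time
# bins, recorded while the (simulated) arm visits known 3-D positions.
n_bins, n_neurons = 500, 100
true_weights = rng.normal(size=(n_neurons, 3))
rates = rng.poisson(lam=5.0, size=(n_bins, n_neurons)).astype(float)
positions = rates @ true_weights + rng.normal(scale=0.1, size=(n_bins, 3))

# Fit a linear decoder: find W minimizing ||rates @ W - positions||^2.
W, *_ = np.linalg.lstsq(rates, positions, rcond=None)

# "Online" use: a fresh vector of firing rates yields a predicted arm
# position, which would then be forwarded to the robotic arm.
new_rates = rng.poisson(lam=5.0, size=(1, n_neurons)).astype(float)
predicted_xyz = new_rates @ W
print(predicted_xyz.shape)
```

Because the decoded position is just a small array of numbers, it can be sent anywhere a network reaches — which is how the same neural traffic could drive a Phantom arm in Cambridge, 965 km from the monkey.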
Nicolelis is convinced that this system will work for humans, creating an interface that might allow paralyzed people to control their own biological limbs; such people generally have healthy limbs that they are unable to use because spinal cord damage prevents brain signals from reaching them. It could also give people extended senses, allowing them to have virtual limbs in cyberspace or robotic limbs in the physical world. "The brain knows that it has an arm and a hand because it is connected to these things and gets feedback from them," Nicolelis says. "The same could be true for robotic or virtual appendages. If you control a remote hand that senses objects and sends tactile sensations back to your brain, it behaves as if it's your own hand. It becomes part of you. Your body becomes extended beyond the surface of your skin."
What Nicolelis is describing is a reverse phantom limb. Instead of continuing to feel the presence of a limb that is no longer there, people equipped with a brain-computer interface could operate new appendages, and the brain would eventually come to regard these as its own. But what could a person do with a remote robotic or virtual limb? The possibilities range from the mundane to the otherworldly. In the virtual realm, these appendages would dispense with the bulky technology of conventional haptics and allow Web shoppers to squeeze a peach online to see if it's ripe. Video conferences and chats might start with actual handshakes. And of course, there's sex. Consenting adults could use the technology to engage in far more intimate embraces and manipulations. In the realm of robotics, devices could be sent to dangerous or inhospitable climes, like deep-sea thermal vents or the craters of active volcanoes.
For the human brain to truly incorporate prosthetics into its body map requires feedback: the brain will only become aware of its new limbs if they make their presence known. To see how the monkeys might respond to this kind of anatomical extension, Nicolelis is creating a feedback loop between the monkeys and the robotic arm. In the next experiments the monkeys will have sensors attached to their bodies, so that the robotic arm delivers tactile sensations directly to their skin. When the monkey's brain waves impel the robotic arm to grasp a piece of fruit, for example, the animal will be able to feel the fruit's texture. The monkeys will also be able to watch the robotic arm in action on a computer screen. This kind of tactile and visual feedback, Nicolelis hopes, will teach the monkeys to associate the arm's movements with their thoughts. Once they make that link, they might not take the trouble to stretch out their arms anymore. Why bother when a mere thought will move the robot arm?
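The feedback loop described above — brain signals drive the arm, and the arm's touch sensors drive actuators on the monkey's skin — amounts to a simple closed control loop. A minimal sketch, with every function and threshold here a hypothetical stand-in rather than the experiment's real components:

```python
from dataclasses import dataclass

@dataclass
class GraspEvent:
    force: float      # how hard the robotic hand is squeezing
    texture: float    # roughness reading from the fingertip sensor

def decode_intent(firing_rates):
    """Stand-in decoder: mean firing rate above a threshold means 'grasp'."""
    return sum(firing_rates) / len(firing_rates) > 5.0

def robotic_arm_step(grasp):
    """Stand-in arm: grasping an object returns touch-sensor readings."""
    return GraspEvent(force=0.8, texture=0.3) if grasp else None

def tactile_feedback(event, skin_actuators):
    """Drive skin-mounted actuators so the animal feels what the arm feels."""
    if event is not None:
        skin_actuators.append((event.force, event.texture))

# One pass around the loop: brain -> decoder -> arm -> skin.
skin = []
rates = [6.0, 7.5, 5.5]   # hypothetical firing rates for one time bin
event = robotic_arm_step(decode_intent(rates))
tactile_feedback(event, skin)
print(skin)  # [(0.8, 0.3)]
```

The point of closing the loop is exactly what Nicolelis hopes for: once the arm's sensations arrive at the skin in step with the animal's intentions, the brain can learn to treat the arm as part of the body.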
So far, there is no way to tap into the brain without dramatically invasive surgery, so human experimentation is unlikely. And there's an intriguing risk in the realm of brain-computer interfaces. What would happen if the process were reversed? The signals that are routed from the monkey's brain through the computer to control the robotic arm could be sent back to the monkey to control its behavior. Implants in humans would face strong opposition unless the possibility of this kind of mind control could be eliminated, which so far seems impossible to achieve.
Nicolelis is confident that a technological breakthrough will come, perhaps in the form of some kind of permanent intracranial implant, and that the ethical issues surrounding the technology will be resolved. It will probably be a long time before our brains merge with our computers. When that day does come, however, our bodies will still be ourselves, but they could well have more than just two arms and two legs.
© James Geary 2001. This is an edited excerpt from Geary's book The Body Electric, which will be published by Weidenfeld & Nicolson in the fall.