Most of us don't give movement a second thought. If we see an object we want to pick up, we pick it up. If we see something we want to move, we simply move it. But for people who have suffered injuries to their spinal cord, the chain of command between eyes, brain and limbs can be broken. They can see what they want to move, but their brain's instructions cannot reach their limbs.

John Donoghue, a neuroscientist at Brown University in Providence, Rhode Island, and his colleagues have devoted their time to understanding how the brain turns thoughts into actions. Working initially with monkeys, the team has studied the activity of brain cells during simple tasks such as using a computer mouse to move a cursor on a screen.

On page 164 of this issue, the researchers take their work one step further, showing how an implant, known as a brain–computer interface, can bridge the gap between brain and movement for a quadriplegic patient.

In their earlier studies, the team had shown that brain signals recorded when a monkey moved a computer mouse could be decoded and then used directly to control the cursor, eliminating the need for the monkey to move its hand. The big question was whether a similar set-up would work in people.

The first task was to adapt the sensor and decoder system for use in humans. “It required close to 30 people working for more than a year on all the necessary regulatory processing and technology development to get the system ready for this pilot study,” says Donoghue.

In June 2004, Donoghue and his team began working with a quadriplegic man who had suffered a spinal-cord injury three years earlier. They implanted an interface device into the patient's brain to act as a sensor of nerve-cell activity and, ultimately, as a transmitter of those signals to a computer.

They sought to obtain the appropriate brain signals by asking the man to follow a moving cursor with his eyes while imagining that his hand was actually causing that movement. As they had hoped, the sensor detected signals from brain cells that changed their activity as the patient imagined controlling the cursor.

By repeating the process several times, the researchers were able to pinpoint the pattern of neuronal activity associated with a particular movement of the cursor. They then translated that signal into computer language.
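The article does not describe the team's decoding algorithm, but the general idea — fitting a mapping from recorded firing rates to intended cursor movement during a calibration phase, then using that mapping to turn new neural activity into cursor commands — can be sketched with a toy linear decoder. Everything below (the simulated data, the least-squares fit, the variable names) is an illustrative assumption, not the actual BrainGate method.

```python
import numpy as np

# Toy sketch of linear neural decoding (illustrative only; not the
# team's actual algorithm). Assume firing rates were recorded from a
# few dozen neurons over many time steps while the patient imagined
# moving the cursor with a known 2-D velocity.
rng = np.random.default_rng(0)

n_neurons, n_samples = 40, 500                # "a few dozen neurons"
true_weights = rng.normal(size=(n_neurons, 2))  # hidden tuning of each cell

# Simulated calibration data: spike counts per time step, and the
# cursor velocity the patient was imagining at each step.
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=0.5, size=(n_samples, 2))

# "Translate the signal into computer language": fit a least-squares
# linear map from firing rates to intended cursor velocity.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decoding: a fresh burst of neural activity becomes a cursor command.
new_rates = rng.poisson(lam=5.0, size=n_neurons).astype(float)
decoded_velocity = new_rates @ weights        # estimated (vx, vy)
print(decoded_velocity.shape)                 # (2,)
```

In practice such decoders are calibrated on far noisier data and refined continuously, which is one reason the resulting control is slower and less accurate than natural movement.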

“We were monitoring signals from a few dozen neurons rather than the few million that are usually active during hand movement,” says Donoghue. But despite the small sample size, the experiment worked. Once the brain signals had been decoded, the patient could think about moving the cursor and the brain–computer interface translated this into an instruction that moved the cursor — albeit without the same accuracy or speed as an actual movement. The patient was also able to use the same process to open and close a prosthetic hand and to exert some control over a robotic arm.

Donoghue and his team are encouraged by the results of their pilot study and have now enrolled more patients to test the idea further and to refine their device.