Lies, damned lies, and technology: the accuracy of the polygraph lie detector has been attacked. Lawrence Farwell thinks measuring the burst of brain activity that occurs when a person recognizes something could provide a reliable alternative. Others fear the technique works better in the lab than in the field. Credit: BRAINWAVESCIENCE

When talk-show hosts and death-row inmates come knocking on his door, neuroscientist Lawrence Farwell welcomes both with open arms. Farwell is the inventor of a brain fingerprinting machine, and since he began consulting in high-profile murder cases four years ago, he has found no shortage of publicity for his technique. In March alone, Farwell was featured in a PBS special and appeared on ABC's Good Morning America and CNN. “They come to us,” says his public-affairs manager. “All I do is sit here and field phone calls.”

Farwell's true quest is for credibility, not publicity. Brain fingerprinting is far from being an accepted forensic tool, despite claims on Farwell's website that it was used to trap a serial killer into admitting his guilt and to help free a man jailed for 25 years for a murder he says he did not commit.

But while the media rush to book Farwell, some scientists say his claims are overblown. Farwell says brain fingerprinting can determine with near certainty whether specific information is stored in a person's brain. The critics say the technique, like most others used to sort truth from lies, suffers from a lack of peer-reviewed evidence that it works. “The necessary research has never been done,” says Emanuel Donchin, a psychophysiologist at the University of South Florida in Tampa and Farwell's former graduate adviser.

Yet these unproven technologies are becoming increasingly attractive to US law-enforcement and security agencies. The traditional tool of lie detection, the polygraph, has come under withering criticism in the past decade. With growing national security concerns, the hunt for alternatives has intensified. Laboratory tools — from infrared sensors to eye trackers — are being converted into lie detectors to help catch criminals and spot terrorists before they strike.

Sweaty palms

“We are no longer just looking for specific acts of deception,” says Andrew Ryan, director of research at the Department of Defense Polygraph Institute (DoDPI) in Fort Jackson, South Carolina. “We are interested in the overall credibility of people. It's a much broader field.”

The credibility of the polygraph has long been questioned. Conceived in 1915 by psychologist William Marston, who also invented the comic-strip superhero Wonder Woman, the polygraph measures changes in perspiration, breathing, pulse and blood pressure as the subject answers a series of yes-or-no questions. When we lie, the theory goes, changes in these parameters betray us.

Certainly the stress of lying in an interrogation can cause physiological changes, from sweaty palms to a rapid heart rate. But then so do other high-pressure situations, such as being suspected of a crime or being hooked to a machine and asked a lot of questions. As a damning 2003 report on polygraphy by the National Academy of Sciences concluded, there is little scientific evidence that polygraphers can identify dishonest responses with any certainty [1].

In response to the academy report and to employees' concerns, the US nuclear weapons labs reduced the number of routine polygraph tests from 20,000 to 4,500 in September 2003. Other agencies still use the polygraph for security screening and in law enforcement, but rarely as evidence in trials. A 1998 US Supreme Court ruling upheld state restrictions on the use of the polygraph in courtroom proceedings, and today most states ban it altogether.

It is this gap that Farwell hopes brain fingerprinting will fill. The technique does not measure deception, merely whether information is stored in the brain. Farwell's subjects wear standard electrodes for recording brain activity, and are shown a series of words on a computer screen, some of which refer to details of a crime that only the police and someone present at the scene would know. If the murder weapon was a gun, the suspect might see the words Luger, Remington and Uzi. If the suspect knows the murder weapon was a Remington, seeing the word should evoke a ‘brainwave’ that the electrodes can detect.
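In outline, the test amounts to interleaving one crime-relevant 'probe' word among equally plausible irrelevant alternatives and recording the brain's response to each. A minimal sketch of such a presentation loop, in Python (the item names, counts and structure here are our illustration, not Farwell's actual protocol):

```python
import random

# Hypothetical crime-relevant 'probe' plus equally plausible 'irrelevant'
# items; an innocent subject has no reason to treat one differently.
PROBE = "Remington"
IRRELEVANTS = ["Luger", "Uzi", "Beretta", "Glock"]

def build_stimulus_sequence(n_repeats=30):
    """Interleave the probe and irrelevant items in random order."""
    sequence = ([PROBE] + IRRELEVANTS) * n_repeats
    random.shuffle(sequence)
    return sequence

for word in build_stimulus_sequence(n_repeats=2):
    print(word)  # in a real test, each word is flashed while EEG is recorded
```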

It sounds far-fetched, but the brainwaves Farwell relies on are well documented. Known as event-related potentials, or ERPs, these signals reflect the electrical activity of myriad neurons responding to a stimulus, such as a word or an image. ERPs have been studied for decades, but the cognitive processes underlying them are still unclear.

One bump in the ERP trace, called a P300 because it occurs about 300 milliseconds after an event, generally occurs when the stimulus holds special significance for the subject. In 1991, Farwell and Donchin showed that in a controlled laboratory setting the P300 could be used to reveal hidden knowledge of a mock crime conducted just prior to the test [2]. Later, Farwell developed a related measure called a MERMER, which he went on to patent [3]. Four years ago, he set up Brain Fingerprinting Laboratories, a company now based in Seattle, to commercialize the technology for both civilian and national-security uses.
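The analysis behind such a test reduces to averaging the EEG epochs recorded after each stimulus and comparing the probe response with the irrelevant ones. A rough sketch under simple assumptions (a 250 Hz sampling rate, a conventional 300-600 ms measurement window, and simulated data; none of this is Farwell's patented method):

```python
import numpy as np

FS = 250  # sampling rate in Hz; one epoch = 1 s of EEG after stimulus onset

def mean_p300_amplitude(epochs):
    """Average epochs (trials x samples) and take the mean voltage
    in a 300-600 ms post-stimulus window, where the P300 peaks."""
    erp = epochs.mean(axis=0)
    window = slice(int(0.3 * FS), int(0.6 * FS))
    return erp[window].mean()

# Toy data: probe epochs carry a simulated positive deflection; irrelevants are flat noise.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
bump = 5e-6 * np.exp(-((t - 0.4) ** 2) / 0.005)  # ~5 microvolt bump near 400 ms
probe_epochs = rng.normal(0, 2e-6, (40, FS)) + bump
irrelevant_epochs = rng.normal(0, 2e-6, (160, FS))

diff = mean_p300_amplitude(probe_epochs) - mean_p300_amplitude(irrelevant_epochs)
print(f"probe-minus-irrelevant amplitude: {diff * 1e6:.2f} microvolts")
# A reliably larger probe response is read as recognition of the crime detail.
```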

Brainwaves

But critics charge that the technology is not ready for use in the real world. Peter Rosenfeld, a psychophysiologist who studies ERPs at Northwestern University in Evanston, Illinois, points out that in the only published field test of ERPs for detecting deception, the machine did little better than guessing [4]. Many factors could affect the outcome. Because the P300 measures recognition, whether the subject remembers the event is crucial. How does memory of a crime change over time, for instance, or in response to drugs or stress? Does a psychopath even have a normal P300 response? Such factors need to be properly tested, Rosenfeld says. “It will take a substantial amount of research before this is ready.”

And what about false positives? Say the suspect is innocent, but collects Remingtons. It doesn't matter, claims Farwell, because the test probes memory of several crime-related items, not just one. William Iacono, a psychologist at the University of Minnesota in Minneapolis, agrees that it is hard to imagine how getting six out of six hits, for instance, could be a false positive: “How do you explain that just by chance?”
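Iacono's intuition is easy to put in rough numbers. If each item offers, say, five equally plausible alternatives and an innocent subject's responses are independent, chance alone almost never produces six hits in a row (the five-alternative figure is ours, for illustration):

```python
# Probability that pure chance picks the crime-relevant alternative
# on every one of six independent items, assuming five alternatives each.
alternatives = 5
items = 6
p_chance = (1 / alternatives) ** items
print(p_chance)  # 6.4e-05, i.e. about 1 in 15,625
```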

Farwell maintains that it is better to use the test and let a judge or jury decide how much weight to give it. “I don't agree that you have to remove every possibility of a false positive or a false negative,” he says. “Court cases get decided every day on 60:40 probabilities. What we bring to the judge and jury is highly accurate information.”

Hot flushes

The P300 response is supposedly beyond the subject's control, but there may be ways to manipulate it. In a newly published study, supported by the DoDPI, Rosenfeld's group exposed several students to a mock crime and then gave them a P300 test. Some students were instructed to imagine their professor slapping them in the face during each stimulus. As a result, two-thirds of them were able to produce P300 waves even for stimuli that should have elicited no special response [5]. This approach wouldn't clear a suspect who used it, but it would render the test useless.

The security agencies desperately need new tools to help them identify terrorists. Can brain fingerprinting help? Farwell says he has been able to tell FBI agents from civilians by checking for P300s elicited by terms learned during FBI training. So in theory, a P300 test could pick out a terrorist by finding records in the brain of something only a terrorist would know — phrases from an al-Qaeda training manual, for instance. The trick is finding information that no innocent person would know.

One approach the DoDPI is pursuing is an eye tracker, which, like brain fingerprinting, checks for concealed knowledge. The theory is that the eyes involuntarily spend less time scanning familiar objects than unfamiliar ones, a difference measurable in milliseconds. So the eyes of a terrorist will linger less over a photo of the desert camp where he trained than over a photo of a survivalist camp in Nebraska. A prototype will be ready for field testing in the next few months, Ryan says.
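The decision rule behind such a tracker is simple to state: total up how long the gaze dwells on each image and flag a suspiciously short dwell on the supposedly unfamiliar one. A toy sketch with invented fixation data and an arbitrary cutoff (nothing here reflects the DoDPI prototype):

```python
def total_dwell_ms(fixations):
    """Sum the durations (ms) of all fixations landing on one image."""
    return sum(fixations)

# Hypothetical fixation durations recorded while a subject scans two photos.
dwell_target = total_dwell_ms([180, 120, 90])         # photo of the training camp
dwell_control = total_dwell_ms([310, 260, 240, 150])  # comparable control photo

# Familiar scenes tend to draw shorter scanning; the ratio and cutoff
# here are invented for illustration, not a validated criterion.
if dwell_target < 0.6 * dwell_control:
    print("dwell pattern consistent with prior familiarity")
else:
    print("no familiarity signal")
```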

Trial by fire? Stress-induced changes in facial blood flow might betray the guilty.

Another favoured technique is an infrared scanner that looks for subtle changes in capillary blood flow to the face, a sign of stress. In preliminary work published in Nature two years ago [6], researchers from the DoDPI and Honeywell Laboratories in Minneapolis asked eight subjects to stab a mannequin, rob it of $20, and then deny it. Twelve controls had no knowledge of the ‘crime’.

An infrared-wielding interrogator spotted 75% of the lying muggers, and pegged only 10% of the honest controls as liars. Similar results were obtained with a traditional polygraph test. Development has continued apace, and a prototype scanner resembling a photo booth will enter its first field test this summer, Ryan says, possibly at an embassy.

The infrared scanner is easier to use than the polygraph and so the potential for its widespread use — and abuse — is higher. Without the air-conditioned booth, which maintains a constant temperature but is not essential, the scanner could be adapted for mass screening. Ryan emphasizes that there are no plans to test the scanners in airports or other US entry points. But the potential to monitor people's faces as they answer questions — “Did you pack that bag yourself?” — makes such a tool very attractive.

High stakes

Moving this technology too quickly from the lab to the field alarms experts such as Stephen Fienberg, a statistician at Carnegie Mellon University in Pittsburgh and head of the academy polygraph panel. In his experience, far too little testing is done on lie detectors. Even the Nature paper overstates the evidence, Fienberg argues, because it omitted ten of the original research subjects without explanation.

Lead author Ioannis Pavlidis, now a computer scientist at the University of Houston, confirms that the subjects were excluded, but says there were technical reasons: eating lunch just before the test, for instance, made the recordings hard to interpret. His team wrote a paper explaining this, he says, but the Nature format did not allow enough space.

Even so, Fienberg and other experts believe researchers should emphasize new technologies' limitations, because people tend to trust machines, sometimes more than their own judgement. “If people believe these devices work, then spies or terrorists who pass through them are deemed to be OK,” Fienberg says. The risk of false incrimination or false security must be taken seriously.
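Fienberg's worry can be quantified with Bayes' rule. Take the thermal-imaging figures at face value (75% of liars flagged, 10% of honest subjects wrongly flagged) and apply them to a screening population in which, say, one person in a thousand is actually lying; the prevalence is our invention, for illustration:

```python
# Positive predictive value of a screen with the reported hit and
# false-alarm rates, at an assumed (illustrative) prevalence of liars.
sensitivity = 0.75   # fraction of liars correctly flagged
false_alarm = 0.10   # fraction of honest subjects wrongly flagged
prevalence = 0.001   # 1 in 1,000 screened people is actually lying

true_pos = sensitivity * prevalence
false_pos = false_alarm * (1 - prevalence)
ppv = true_pos / (true_pos + false_pos)
print(f"chance a flagged person is actually lying: {ppv:.1%}")
# ~0.7%: at low base rates, almost everyone the machine flags is innocent.
```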

Another key problem with most lie-detection research is a failure to acknowledge the difference between taking part in a laboratory experiment and trying to get away with murder. Paul Ekman, a behavioural psychologist at the University of California, San Francisco, says the stakes involved make an enormous difference to a person's behaviour when lying. But the stakes are low in most experiments.

Nearly everyone agrees that more work is needed: the question is by whom. As the National Academy of Sciences report concluded, government agencies seeking to deploy lie detectors are not likely to be thorough or unbiased, nor are companies out to make a profit. “There needs to be a separation between the mission agencies who use the tools, and the people who do the core research,” Fienberg says.

Ekman is one of the few doing basic research into deception. Supported by the National Institutes of Health, he has spent 30 years looking for behavioural cues that give liars away. He has found that a subtle pause in speaking or a shift in posture may indicate lying. There is no Pinocchio's nose, he says — “but in demeanour there are signs that you are not getting a full account.”

Ekman simulates high stakes for his subjects by promising $100 or more to those who successfully deceive an interviewer and a punishment of a few hours in a dark room with sudden loud sounds if they are caught lying. Under such conditions, he has reported that trained observers can spot lies about 80% of the time [7].

Mind reading: functional MRI scans reveal brain regions that light up only during a lie. Credit: D. LANGLEBEN

Ekman's latest project asks whether even stronger motivation can mask deception. With funding from the Defense Advanced Research Projects Agency, he is testing his approach on extremists in the animal-rights and anti-abortion movements. “We are not interested in the moderate ones,” he says. Like terrorists, such people may feel justified in breaking the law and lying about their activities, thus making their lies harder to detect.

While most studies measure deception indirectly, some scientists are starting to look for signs of deceit in the brain. One group, led by Daniel Langleben, a psychiatrist at the University of Pennsylvania, is using functional magnetic resonance imaging (fMRI) to look for brain regions activated by lying. Subjects received envelopes containing a playing card — the five of clubs — and $20. With their heads inside an fMRI scanner, they were told they could keep the money if they managed to deceive a computer about what card they had. Each time they lied about having the five of clubs, two brain regions lit up, but not when they said truthfully that another card was not theirs [8].
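The analysis behind such a result is, at heart, a condition contrast: for each voxel, compare the signal on lie trials with truth trials and keep the voxels where the difference is statistically reliable. A bare-bones sketch with simulated data (real fMRI pipelines add haemodynamic modelling, motion correction and multiple-comparison control):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_voxels = 1000

# Simulated per-trial voxel responses; a handful of voxels respond more on lies.
truth = rng.normal(0.0, 1.0, (40, n_voxels))
lie = rng.normal(0.0, 1.0, (40, n_voxels))
lie[:, :5] += 1.5  # inject a 'deception' effect into the first five voxels

# Voxelwise two-sample t-test: lie trials versus truth trials.
t_vals, p_vals = stats.ttest_ind(lie, truth, axis=0)
active = np.flatnonzero(p_vals < 0.001)
print("voxels 'lighting up' for lies:", active)
```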

Human intuition

The work is extremely preliminary, Langleben cautions. Further experiments are under way to rule out alternative explanations. At this stage, fMRI is a research tool, not a way to spot liars.

Meanwhile, the Department of Homeland Security is consulting researchers on several approaches. Among them are ERPs, fMRI and voice-stress analysis, according to Gary Strong, who is in charge of behavioural studies at the department's Science and Technology division. Such tools may ultimately aid airport security and border control agents, he says. “The urgency is certainly there,” Ryan agrees. “All of these technologies were in development before 9/11, but now we have an added mission.”

Technology may be at the forefront of the push for new ways to detect deception, but a machine has yet to be developed that makes the human interviewer redundant. Ekman and his colleagues have developed a week-long training course in analytical interviewing which has seen a surge in interest among governmental counterintelligence groups. He has also begun to train embassy officials in the Foreign Service.

New methods will certainly come into use, both in the courts and in the ports. Whether justice and security will gain depends on whether there is sufficient evidence for their reliability. Without proper research, the question remains: who's fooling whom?