The biggest problems, however, may be ethical and constitutional. For now, improved lie detection is likely to enjoy broad public support. But what about when it reaches more surreptitiously into our lives? Biophysicist Britton Chance of the University of Pennsylvania has explored ways to use infrared light projected from a distance to penetrate the skull, looking for signs of stress similar to the ones fMRIs detect. Both that technique and remote periorbital thermography could be used undetectably in airport lines to spot high-stress passengers. Whether that stress is caused by the bomb you're concealing or the fact that you're running late can't be known until you're pulled out of line, searched and interrogated.
Several groups have raised questions about the new technologies. The American Civil Liberties Union filed Freedom of Information requests in June, seeking to learn more about the lie-detection research the government is conducting and whether the techniques are already being used in the field. This fall a leading, but as yet undisclosed, science journal will publish a paper it solicited from Stanford's Greely and other legal experts and scientists exploring the ethics of lie detection. The authors are not expected to smile unreservedly on the science or on the way they believe it may already be in use, perhaps, according to some reports, in Iraq. Frank has helped train people in facial analysis, but he will say only that some of them have been sent to work in "regions of interest."
Private companies like No Lie MRI face legal hurdles too. So young a technology has almost no chance of clearing the admissibility bar in criminal cases, which limits its value to potential customers in law enforcement. And the Employee Polygraph Protection Act of 1988, which restricts the circumstances under which current or prospective employers may use existing lie-detection technology, will probably apply to fMRIs as well.
For now, the new lie-detection techniques are likely to remain in the same ambiguous ethical holding area as so many other privacy issues in the twitchy post-9/11 years. We'll give up a lot to keep our cities, airplanes and children safe. But it's hard to say in the abstract when "a lot" becomes "too much." We can only hope that we'll recognize it when it happens.