Neuroscientists usually scan people's brains to look for tumors or aneurysms or to localize the extent of physical trauma. But in a series of experiments performed at New York University a few years ago, scientists went looking for racism. When they showed subjects pictures of unfamiliar white and black faces and scanned their brains with functional MRI machines, they could see heightened activity in the amygdala, a part of the brain associated with emotional arousal. Moreover, the brain activity matched up with psychological tests designed to measure unconscious racism. "This technology is probably not ready for prime time yet," says University of Pennsylvania neuroscientist Martha Farah, but she can foresee a day when police academies, for example, might scan prospective cadets to weed out racists. "If we could, in fact, define racism," Farah says, "this would be a potentially useful tool--but with very serious issues of privacy and informed consent."
Welcome to the exploding new field of neuroethics, the study of the ethical and philosophical dilemmas provoked by advances in brain science. It's only since a seminal conference in 2002 that the field has even existed; shortly thereafter, Penn and Stanford founded the first academic centers for neuroethics in the country. Last year a multidisciplinary group--including philosophers, lawyers and psychologists--created the Neuroethics Society to explore the issues in a formal way.
Just in time. As brain science becomes increasingly sophisticated, the moral and legal quandaries it poses threaten to proliferate into every part of our lives. And as the racism experiment makes clear, brain imaging has already started to do so. Even in their current state, brain scans may be able to reveal, without our consent, hidden things about who we are and what we think and feel. "I don't have a problem with looking into your brain," says Alan Leshner, former director of the National Institute on Drug Abuse and current head of the American Association for the Advancement of Science. "But I'm not so sure I want you looking into mine."
These technologies may become an intimate part of our lives sooner than we think. "It's not so futuristic," says Stanford neuropsychologist Judy Illes, "to imagine an employer able to test for who is a good team player, who's a leader or a follower." Before such scans are used, neuroethicists warn, we must understand what they can and cannot do. A device that might be helpful in personnel testing, for example, might not be rigorous enough to be used in a criminal trial, where the standard of proof is higher. That's currently the case with the polygraph. But Farah is afraid that because of the high-tech aura of brain scans, people may put more faith in them than is warranted.
Perhaps even more critical is the question of who should be allowed to peek into our brains. Employers? Schools? The government? The answers are far from clear. Employers, for example, already give psychological tests to job applicants, and schools test 3- and 4-year-olds to anticipate reading problems. Brain scans may actually give better results. But brain scans are also much more powerful and far more invasive, and the law is murky on whether they can be performed without our consent. We may feel instinctively that we have a right to brain privacy, but feelings have no legal standing.