
Neuroscience, Mental Privacy, and the Law

III. MIND READING WITH NEUROIMAGING: WHAT WE CAN (AND CANNOT) DO

Having established in Part II a working definition of neuroimaging
mind reading, the Article now briefly discusses several recent legal
applications of such technology. Part III reviews: (A) fMRI-based lie
detection; (B) fMRI-based memory detection; (C) EEG-based memory
detection; and (D) fMRI-based decoding and reconstruction of visual
stimuli.

A. Lie Detection with fMRI (127)

Neurons, the cells of greatest interest in the brain and nervous
system, need oxygen to live. This oxygen is supplied to them via blood
flow. fMRI is premised on the logic that tracking relative blood flow to
different parts of the brain will reveal relative oxygen uptake, and
thus show which neurons are more active (at a given moment in time).
(128) Changes in blood oxygen levels in the brain at different moments
during a given experimental task allow for inferences about
brain-activation patterns.
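
To make that inferential logic concrete, the following sketch compares
the average BOLD signal from a single hypothetical region of interest
during task blocks and rest blocks. The numbers, block lengths, and
effect size are invented for illustration; they are not drawn from any
actual study.

    import numpy as np
    from scipy import stats

    # Hypothetical BOLD signal (arbitrary units) from one region of
    # interest, one value per scanner volume. Real values would come from
    # an fMRI scanner, not a random-number generator.
    rng = np.random.default_rng(0)
    rest_signal = rng.normal(loc=100.0, scale=1.0, size=60)  # baseline blocks
    task_signal = rng.normal(loc=101.5, scale=1.0, size=60)  # ~1.5% increase

    # The core inference: is the signal reliably higher during the task?
    t_stat, p_value = stats.ttest_ind(task_signal, rest_signal)
    pct = 100 * (task_signal.mean() - rest_signal.mean()) / rest_signal.mean()
    print(f"BOLD change: {pct:.2f}% (t = {t_stat:.1f}, p = {p_value:.2g})")

A statistically reliable difference of this kind is what underwrites the
claim that the region was "more active" during the task.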

Different protocols have been used in fMRI lie detection, most of
which rely on a paradigm known as the “Concealed Information
Test” (CIT) (also known as the “Guilty Knowledge Test”
(GKT)). (129) This paradigm is different from the Control Question Test
typically used by professional polygraphers. (130)

fMRI lie detection evidence has been proffered in several U.S.
cases, has been the topic of much neuroscience research, and has drawn
the attention of many commentators. (131) There are a large number of
conceptual and technical problems with this approach. Conceptually, one
major challenge with neuroimaging lie detection is defining a
“lie.” (132) In practice, neuroscience lie detection has
utilized an “instructed lie” experimental paradigm, (133) in
which subjects are told to lie under certain conditions in the
experiment. Critics point out that this may limit the inferences we can
make about “lying,” because an instructed lie in the lab may
not involve the same brain activity as a high-stakes lie in real life
outside the lab. (134) Additionally, technical issues include general
concerns about using fMRI techniques to study higher-order cognitive
functions. (135)

Of particular note here is the “reverse inference”
fallacy. The reverse inference fallacy occurs when one reasons that,
because a particular part of the brain is more active during a certain
cognitive state, a person must be in that cognitive state whenever that
brain area is more active; the inference does not necessarily follow.
(136) The reverse
inference fallacy is acute in the lie detection case, as “it is not
lying per se that is being decoded from these brain areas but rather the
cognitive and emotional processes that are associated with lying.”
(137)
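
The problem can be restated in probabilistic terms. The short
calculation below uses invented numbers purely for illustration: even if
a region is usually active when a person lies, activity in that region
may still be weak evidence of lying, because the same region is engaged
by many other cognitive and emotional processes.

    # Invented, illustrative probabilities for a region engaged by lying
    # but also by unrelated processes (conflict monitoring, working
    # memory, and so on).
    p_active_given_lying = 0.90      # P(region active | lying)
    p_active_given_truth = 0.40      # P(region active | not lying)
    p_lying = 0.10                   # assumed prior probability of lying

    # Bayes' rule: P(lying | region active)
    p_active = (p_active_given_lying * p_lying
                + p_active_given_truth * (1 - p_lying))
    p_lying_given_active = (p_active_given_lying * p_lying) / p_active
    print(f"P(lying | region active) = {p_lying_given_active:.2f}")  # 0.20

On these assumptions, observing the activation raises the probability of
lying only from ten percent to twenty percent, which is the sense in
which the forward finding does not license the reverse inference.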

Despite these limitations, two for-profit fMRI-based lie detection
companies are now in operation, (138) and both have proffered evidence
in criminal trials on behalf of defendants. (139) So far, the evidence
has been ruled inadmissible under both the Daubert standard in federal
court (140) and the Frye standard in state court. (141) However, the
judge overseeing the evidentiary hearing in the federal case suggested
that such evidence may one day become admissible:

   [I]n the future, should fMRI-based lie detection undergo further
   testing, development, and peer review, improve upon standards
   controlling the technique's operation, and gain acceptance by the
   scientific community for use in the real world, this methodology
   may be found to be admissible even if the error rate is not able to
   be quantified in a real world setting. (142)

For purposes of the Fourth Amendment and Fifth Amendment analysis
in Part IV, it is important to note that all of these experimental
paradigms involve researcher-subject interaction such as requesting a
response to a visual stimulus or question. (143) Although fMRI may be
used in what is known as “resting state” analyses (in which
the subject just lies in the scanner), such resting-state approaches
have not been employed in the lie detection context. (144)

B. Memory Detection with fMRI

Scientists have also made intriguing progress in detecting
memories. Neuroscientists Jesse Rissman and Anthony Wagner were able to
use fMRI, combined with an advanced data analysis methodology, (145) to
identify with great accuracy the subjective memory states of subjects,
such as whether the subject thought he had seen a particular face
before. (146) Subjects were initially shown a battery of faces, and
then, while in the scanner, were shown both previously seen and new
faces. (147) The researchers could tell with great accuracy whether a
subject remembered seeing a particular face. (148) Further, “neural
signatures associated with subjective memory states were sufficiently
consistent across individuals to allow one participant’s mnemonic
experiences to be decoded using a classifier trained exclusively on
brain data from other participants.” (149)
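
The following is a rough sketch, in Python with scikit-learn, of what a
cross-participant decoding analysis of this general kind looks like. The
data here are random placeholders, and the logistic-regression
classifier and leave-one-subject-out scheme are assumptions for
illustration rather than a description of Rissman and Wagner's actual
pipeline.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    # Placeholder data: one row of voxel features per trial, one label per
    # trial (1 = "I remember this face", 0 = "this face is new"), and an
    # id indicating which participant each trial came from.
    rng = np.random.default_rng(0)
    n_subjects, trials_per_subject, n_voxels = 16, 100, 500
    X = rng.normal(size=(n_subjects * trials_per_subject, n_voxels))
    y = rng.integers(0, 2, size=n_subjects * trials_per_subject)
    subject_ids = np.repeat(np.arange(n_subjects), trials_per_subject)

    # Leave-one-subject-out: the classifier is trained exclusively on
    # brain data from the other participants, then tested on the held-out
    # person.
    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, X, y, groups=subject_ids, cv=LeaveOneGroupOut())

    # With random placeholder data the accuracy hovers near chance (0.50);
    # with real data, above-chance accuracy is the evidence of shared
    # neural signatures across individuals.
    print(f"Mean cross-participant accuracy: {scores.mean():.2f}")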

At the same time, the researchers were much more limited in their
ability to determine from brain signals alone whether a subject had
actually seen the face before–the subject’s objective memory
state. The distinction between subjective and objective memory states,
as the authors noted, has very important legal implications. (150) The
law generally is interested in objective memory states, such as whether
a witness actually saw the alleged criminal.

As with fMRI-based lie detection, current memory-detection techniques
with fMRI require subject-researcher interaction (such as pressing a
button to indicate when a face is remembered).

C. Memory Recognition with EEG

Distinct from the fMRI-based approaches just described are
memory-recognition approaches using electroencephalography (EEG). These
techniques are not lie detection per se, though they are typically used
to improve assessment of an individual’s veracity. For instance, if
a defendant’s alibi is that he was never at the scene of the crime,
an EEG memory-recognition test could theoretically help the fact finder
or investigator’s assessment of the defendant’s credibility.
(151)

As discussed earlier, EEG is a method of measuring the electrical
activity produced by the brain. (152) Electrodes are placed on the
subject’s scalp, and electrical activity is recorded. (153) As with
fMRI lie detection studies, EEG memory-recognition paradigms use a
version of the Concealed Information Test. (154) The logic is that the
brain will react differently to a stimulus (such as a photo of a
particular aspect of a crime scene) if that person recognizes the
stimulus. (155)

A measurement of electrical activity called the P300 wave is of
particular note. (156) “The P300 is a special ERP
[event-related potential] component that results whenever a meaningful
piece of information is rarely presented among a random series of more
frequently presented, non-meaningful stimuli often of the same category
as the meaningful stimulus.” (157) The theory is that if a series
of objects are shown to a subject, the brain will automatically respond
in a different way to items that have been seen before and are thus
recognizable. Starting in the 1980s, research confirmed this to be the
case. “The P300 would not represent a lie per se but only a
recognition of a familiar item of information, the verbal denial of
which would then imply deception.” (158)

In a recognition task with EEG, subjects are exposed to three types
of stimuli: probes (the stimuli that only the guilty party would know);
irrelevant stimuli (the stimuli that have nothing to do with the crime
scene); and target stimuli (the stimuli that are related to the crime
scene but that everyone knows). (159) The legal system has seen a
particular version of this approach–the “brain
fingerprinting” approach developed by scientist Lawrence Farwell.
(160) Farwell presented his evidence in two cases, (161) but his
approach has not been admitted into evidence since, and has been heavily
criticized. (162)
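
As a schematic illustration only, the sketch below shows the basic
arithmetic of a P300-based concealed information analysis: average the
EEG epochs for each stimulus type and compare mean amplitude in a
typical P300 window. The sampling rate, electrode, window, and epoch
arrays are assumptions; actual protocols, including Farwell's, add
further stimulus types, statistical tests, and countermeasure checks.

    import numpy as np

    # Placeholder EEG epochs from a midline parietal electrode (e.g., Pz),
    # sampled at 500 Hz, each epoch spanning 0-800 ms after stimulus onset.
    sfreq = 500
    times = np.arange(0, 0.8, 1 / sfreq)
    rng = np.random.default_rng(0)
    probe_epochs = rng.normal(size=(40, times.size))        # crime-relevant items
    irrelevant_epochs = rng.normal(size=(120, times.size))  # unrelated items

    # Average across trials, then take the mean amplitude in a typical
    # P300 window (300-600 ms after stimulus onset).
    window = (times >= 0.3) & (times <= 0.6)
    probe_p300 = probe_epochs.mean(axis=0)[window].mean()
    irrelevant_p300 = irrelevant_epochs.mean(axis=0)[window].mean()

    # Working hypothesis: a knowledgeable subject shows a larger P300 to
    # probes than to irrelevants; an unknowledgeable subject does not.
    print(f"Probe: {probe_p300:.2f}, Irrelevant: {irrelevant_p300:.2f} (arbitrary units)")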

There are many scientific challenges to the brain fingerprinting
approach. As some critics have pointed out:

   [T]here is no simple one-to-one relationship between the P300 and
   memory. Even though information stored in memory may very well
   cause some events to be identified as distinct and therefore elicit
   a P300, reducing the P300 to a simple "Aha!" response driven by
   'recognition of the relevant information contained in the probes as
   significant in the context of the crime' is quite at variance with
   what is known about the P300. (163)

Moreover, “laboratory research on brain fingerprinting
published in peer-reviewed journals amounts to a single study containing
20 participants.” (164)

Setting aside these scientific shortcomings, and thus questions of
admissibility on the merits, two features of the brain fingerprinting
approach are particularly relevant to the legal analysis in the next
Part. Like the fMRI studies just reviewed, every
brain fingerprinting study conducted to date requires substantial
subject-researcher interaction. Here, the researcher instructs the
subject to press a button to indicate that an image is recognized. As
Farwell writes, “A subject neither lies nor tells the truth during
a brain fingerprinting test. He simply observes the stimuli and pushes
the buttons as instructed.” (165)

Farwell’s claim is that “[b]rain [f]ingerprinting testing
has nothing to do with lie detection. Rather, it detects information
stored in the brain by measuring brain responses.” (166) The
critical word is responses, as the testing relies on the
researcher’s questions and the subject’s response. Even if the
subject were not required to respond by pushing a button, it is
difficult to see how the protocol could work without requiring the
subject to look at the screen in front of him or her. Farwell is clear
that the protocol must “[r]equire an overt behavioral task that
requires the subject to recognize and process every stimulus,
specifically including the probe stimuli.” (167)

In addition to response during the testing itself, the brain
fingerprinting procedure requires subject cooperation before the test.
Before the EEG is run, “the subject is interviewed to find out what
he knows about the crime from any non-incriminating source such as news
reports or prior interrogations.” (168) In short, then, brain
fingerprinting is machine-assisted neuroimaging mind reading that
requires as a precondition the subject’s mental cooperation.

D. Decoding of Visual Stimuli with fMRI

Perhaps the most dazzling developments in mind reading with
neuroimaging have come from research labs decoding and reconstructing
visual stimuli. Over a decade ago, researchers knew enough about the way
the brain functions to determine, on the basis of brain activity alone,
whether subjects were thinking about a famous face or a place. (169)
Researchers can also tell, solely on the basis of brain activity, which
of two researcher-selected verbs subjects are viewing. (170) These
findings may at first seem to be precisely the type of method that
should properly invoke great concerns about mental privacy. This
interpretation is, however, unwarranted.

fMRI, as discussed earlier, “detects changes in hemodynamic
(literally ‘blood movement’) properties of the brain as a
subject engages in specific mental tasks.” (171) Oxygen binds to a
protein called hemoglobin in the blood, becoming oxyhemoglobin, and when
that oxygen is used by neurons in the brain it releases from the
hemoglobin. (172) Without the oxygen attached, the oxyhemoglobin becomes
deoxyhemoglobin. fMRI takes advantage of the “fact that
oxyhemoglobin is not magnetic but deoxyhemoglobin is magnetic. This
difference is the basis of fMRI because the scanner can detect regions
with relatively more oxyhemoglobin (as a consequence of demands from
neural activity). This signal is called the blood-oxygenation
level-dependent (BOLD) signal.” (173)

The BOLD response is divided into small three-dimensional areas
roughly the size of a pea, about three cubic centimeters. These small
cubes are called voxels (which stands for volumetric pixels) and contain
between five hundred thousand and one million neurons. (174)
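
Concretely, an fMRI run can be thought of as a four-dimensional array:
three spatial dimensions of voxels plus time. The dimensions and values
in this sketch are invented for illustration.

    import numpy as np

    # A tiny, invented fMRI run: 64 x 64 x 30 voxels, 200 time points.
    # Each entry is the BOLD signal for one voxel at one moment in time.
    rng = np.random.default_rng(0)
    bold = rng.normal(loc=100.0, scale=1.0, size=(64, 64, 30, 200))

    # One voxel's time course: how one small block of tissue (containing
    # hundreds of thousands of neurons) waxes and wanes over the run.
    voxel_timeseries = bold[32, 40, 15, :]
    print(voxel_timeseries.shape)  # (200,)

    # The whole-brain pattern at a single moment in time; spatial patterns
    # like this are what decoding analyses operate on.
    volume_at_t = bold[:, :, :, 50]
    print(volume_at_t.shape)  # (64, 64, 30)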

When researchers investigate the neural correlates of some activity
carried out in the scanner, they use an “encoding” approach.
They are looking to see how the brain encodes the presented stimuli.
(175) By contrast, in a “decoding” approach, researchers start
with the measured brain activity and then attempt to predict some
information external to the brain, such as the visual stimulus that
might have generated the observed brain activation pattern. (176)
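
The difference between the two approaches can be summarized in a few
lines. In this sketch the data, the ridge-regression encoder, and the
logistic-regression decoder are all illustrative assumptions; the point
is only the direction of the prediction.

    import numpy as np
    from sklearn.linear_model import LogisticRegression, Ridge

    rng = np.random.default_rng(0)
    n_trials, n_features, n_voxels = 200, 20, 300
    stim_features = rng.normal(size=(n_trials, n_features))  # features of each stimulus
    voxel_patterns = rng.normal(size=(n_trials, n_voxels))   # measured BOLD patterns
    stim_labels = rng.integers(0, 2, size=n_trials)          # which of two stimuli was shown

    # Encoding: stimulus features -> predicted voxel responses
    # (how does the brain represent the presented stimuli?).
    encoder = Ridge(alpha=1.0).fit(stim_features, voxel_patterns)

    # Decoding: voxel pattern -> predicted stimulus
    # (what was the person looking at, given the brain activity?).
    decoder = LogisticRegression(max_iter=1000).fit(voxel_patterns, stim_labels)
    print(decoder.predict(voxel_patterns[:5]))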

Theoretically all cognitive states–from “seeing red” to
“feeling love for your mom”–are encoded by neurons, whose
activity can be mapped using voxel encoding. (177) Decoding, in
contrast, is “a model that uses voxel activity to predict sensory,
cognitive, or motor information.” (178) One type of decoding is
reconstruction, in which “patterns of activity are used to produce
a replica of the stimulus or task.” (179) A study using this type
of decoding, from the lab of Berkeley neuroscientist Jack Gallant, made
headlines in 2011 when the researchers reconstructed images of film
clips that subjects viewed while in an fMRI scanner. (180)
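
Reconstruction-style decoding can be loosely sketched as a regression
from voxel patterns back into a stimulus-feature space, followed by a
search over a library of candidate images for the closest match. This is
a cartoon of the general identification-based strategy, not the Gallant
lab's actual pipeline; every array and parameter below is an assumption
for illustration.

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    n_train, n_voxels, n_img_features = 300, 400, 50

    # Training set: voxel patterns evoked by known images, plus a feature
    # description (e.g., low-level visual features) of each of those images.
    train_voxels = rng.normal(size=(n_train, n_voxels))
    train_features = rng.normal(size=(n_train, n_img_features))

    # Learn a mapping from brain activity back into image-feature space.
    decoder = Ridge(alpha=1.0).fit(train_voxels, train_features)

    # "Reconstruct" by identification: decode features from a new brain
    # pattern, then pick the closest image in a large candidate library.
    library_features = rng.normal(size=(10_000, n_img_features))
    new_pattern = rng.normal(size=(1, n_voxels))
    decoded = decoder.predict(new_pattern)               # shape (1, n_img_features)
    best = np.argmin(np.linalg.norm(library_features - decoded, axis=1))
    print(f"Best-matching library image index: {best}")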

It is not hard to imagine potential legal applications of such
technology. However, Gallant’s lab cautions that “[t]he
potential use of this technology in the legal system is
questionable,” because “[a]ny brain-reading device that aims
to decode stored memories will inevitably be limited not only by the
technology itself, but also by the quality of the stored information.
After all, an accurate read-out of a faulty memory only provides
misleading information.” (181) Thus, “any future application
of this technology in the legal system will have to be approached with
extreme caution.” (182)

How should this type of research be categorized in the mind reading
and brain reading typology? Neuroscientists Frank Tong and Michael
Pratte provide us with two useful illustrations:

[(1)] A participant is brought into a neuroimaging lab and asked to
lie back comfortably on a padded bed table, which is slowly glided into
a brain scanner. The participant watches a brightly colored display as
it provides a virtual tour of every painting in the Musée d’Orsay.
All the while, noninvasive measures of that person’s brain activity
are discretely taken, and the arrays of numbers are quickly transferred
to the memory banks of a high-speed digital computer. After hours of
brain scanning and computer analysis, the real scientific test begins. A
randomly drawn painting is shown again to the observer. The computer
analyzes the incoming patterns of brain activity from the
participant’s visual cortex and makes the following prediction with
99% confidence: She is looking at painting #1023, Cézanne’s Still
Life with Apples and Oranges. The experimenter turns to look at the
computer screen, and indeed, the participant is looking at a plateful of
pastel-colored red and yellow apples, and ripe oranges stacked in a
porcelain bowl, all carefully arranged in the thick folds of a tousled
white tablecloth. Another randomly drawn picture is shown, and the
computer correctly predicts Landscape with Green Trees by Maurice Denis.

[(2)] [In a separate experiment, the lab volunteer] is shown two
paintings in quick succession (Bedroom in Arles, The White Horse) and
then is asked to pick one and hold that image in mind for several
seconds. She imagines a horse standing in a shallow river, head bent low
as if looking at its own reflection in the slowly flowing stream. The
computer quickly scans the matrix of numbers streaming in. Although
brain activity levels are substantially weaker as she gazes steadily at
the blank screen, compared to moments ago, a pattern begins to emerge
from her visual cortex. The computer announces, with 85% confidence,
that the participant is imagining the second painting, The White Horse.
(183)

Both of these scenarios, as the researchers go on to describe at
length, are “more fact than fiction,” (184) because
neuroscience research has been able to accomplish both, and will only
get better over time. (185) Consequently, they argue, “mental
privacy could face enormous new challenges, in both legal settings and
beyond, as there has been no precedent for being able to look into the
mind of another human being.” (186)

Both of the above examples should be categorized as
machine-assisted mind reading with neuroimaging. But Tong and Pratte
apply a different standard, labeling the first scenario as brain reading
and not mind reading because “the experimenter does not need a mind
reading device to achieve this performance. The same result could be
achieved by simply looking over the participant’s shoulder….
” (187) Tong and Pratte view the second scenario, however, as mind
reading because “information that is fundamentally private and
subjective is being decoded from the person’s brain; the only
alternative would be to ask the participant directly about what she is
thinking and to hope for an honest reply.” (188)

Their categorization is thus based on the relative value-added of
the machine-produced brain information. It is true that the brain data
from the second example is more valuable than that from the first
example. But whether of high-value or low-value, this is (a) brain data
(b) generated by neuroimaging methods and (c) being used to infer a
mental state. Thus, as discussed in Part IV, they should be considered
the same for purposes of Fourth and Fifth Amendment analyses.

The “mind reading” label is appropriate here because the
reconstruction is an inference about the mental state of “what the
individual is perceiving visually.” But this view is not shared by
neuroscientist Jack Gallant, who, when interviewed about his own
experiments, argued that “[w]e’re not doing mind-reading here.
We’re not really peering into your brain and reconstructing
pictures in your head. We’re reading your brain activity and using
that brain activity to reconstruct what you saw. And those are two very,
very different things.” (189)

Gallant is correct that collecting the brain data (and then using
it to reconstruct an image) is not the same as truly reconstructing
pictures in one’s head. But it seems plausible, and probably very
likely, that the reconstruction is being generated to provide meaningful
information about the subject’s mental experience–that is, what
the picture in the subject’s head was. Put another way, there is an
inferential chain that connects the (observable) reconstruction with the
(unobservable) actual mental experience of the visual stimuli. So long
as what is being observed is intended to tell us something about the
unobserved, it ought to be categorized as mind reading.

IV. PROTECTING MENTAL PRIVACY: THE FOURTH AND FIFTH AMENDMENTS

The preceding Parts of this Article defined mind reading and brain
reading, and reviewed some of the emerging technologies. Part IV now
considers the Fourth Amendment and Fifth Amendment constitutional
protections available as a privilege against admissibility of
involuntary government use of such techniques. This Part argues that, at
least with the technology that presently exists and is likely to develop
in the near future, both privileges should be readily available.

A. “Scholars Scorecard” on Mental Privacy

Although the mind reading techniques just reviewed are relatively
new to the scientific scene, a number of law professors, law students,
and other scholars are already on record with their predictions about
Fourth and Fifth Amendment protections.

Debates about the rationales underlying these two privileges are
vigorous in legal scholarship. (190) This Article does not weigh the
relative merits of the theoretical approaches, but rather examines
whether, as a practical matter, commentators come to similar conclusions
about this evidence. Table 2 summarizes these results in a
“Scholars Scorecard.” (191) Although there are many doctrinal
paths taken, and some notable exceptions, scholars typically find that
the Fourth Amendment and the Fifth Amendment both provide protections
against mind reading techniques with neuroimaging.

Some scholars see the potential for protection under these
Amendments only if the jurisprudence is reconceptualized. Law professor
Nita Farahany has most recently and most persuasively put forth such an
argument. Professor Farahany suggests that, at present, “[m]ental
privacy is not sacrosanct under either the Fourth or Fifth Amendment,
which provide procedural safeguards but not substantive ones to
adequately protect mental privacy.” (192) Professor Farahany
suggests innovative ways to shore up these safeguards. (193)

But such innovations may not be required. If the analysis is
restricted to technology that is either presently available or likely
available in the near future, and if mind reading is defined as in
previous Parts, then both the Fourth and the Fifth Amendment
constitutional questions are easily resolved even using the conventional
approach. (194)

In an early and influential article on the right to privacy, Samuel
Warren and Louis Brandeis observed that “the common law secures to
each individual the right of determining, ordinarily, to what extent his
thoughts, sentiments, and emotions shall be communicated to
others.” (195) Animated by this spirit, both Amendments can serve
as adequate protection in their defined spheres against
government-compelled mind reading with neuroimaging.

B. Fourth Amendment

The Fourth Amendment states, in relevant part, “The right of
the people to be secure in their persons, houses, papers, and effects,
against unreasonable searches and seizures, shall not be violated, and
no Warrants shall issue, but upon probable cause….” (233) To
decide whether Fourth Amendment protection should be afforded to
evidence from a warrantless search, the Supreme Court applies the
two-step “reasonable expectation of privacy” test articulated
by Justice Harlan in his concurrence in Katz v. United States (234) and
first adopted by the Court majority in Smith v. Maryland. (235) This
test seeks to determine whether the party invoking the Fourth
Amendment’s protection can claim to have had a
“‘reasonable … expectation of privacy’ that [was]
invaded by government action.” (236) Under the first step of the
test, the Court asks “whether the individual, by his conduct, has
‘exhibited an actual (subjective) expectation of privacy.’”
(237) Given that the party had such a subjective expectation, the Court
then asks “whether … the individual’s expectation, viewed
objectively, is ‘justifiable’ under the circumstances.”
(238) In applying the second prong of this test, the Court typically
balances the importance of the individual privacy interests affected
against the government’s interest in effective investigation and
prosecution. (239)

Legal scholar Michael Pardo is correct when he argues that
“[a]nalysis under the Fourth Amendment of compelled neuroscience
tests is fairly straightforward.” (240) If one has a reasonable
expectation of privacy in one’s blood and urine, surely one has a
reasonable expectation of privacy in one’s brain cells. (241)

It is true that some of the methods, such as EEG, are non-invasive.
But as legal scholar Amanda Pustilnik suggests:

   Kyllo's thermal imaging of heat from the home makes an excellent
   analogy with EEG-detection of brain waves that emanate from the
   mind. Electrical brain waves, like thermal signatures from an
   occupied home, are automatically and continuously produced as long
   as a person is alive and a home is not abandoned.... [T]he need
   for decoding does not make the raw information itself unprotected
   by the Fourth Amendment. (242)

The Fourth Amendment may allow such a search if undertaken pursuant
to a valid warrant. (243) Thus, a neuroimaging mind reading test
“could be compelled if the government has probable cause and a
warrant, or a recognized exception to these requirements.” (244)
One exception is a grand jury investigation, (245) where the court will
decide whether to allow the search through a balancing test. (246) In
cases involving mind reading evidence, the court will need to weigh
“the state’s need for the information” against “the
witness’s–and society’s–interest in the mental and
neurological privacy of the common citizen.” (247) Because the
court will consider the value of the information to the state, it should
assess the value added by the neuroscience data to the state’s
investigation. This requires a close examination of the methods being
used and the inferential chain that will connect the brain data to the
mental state conclusions. Compared to the blood test at issue in
Schmerber, (248) which clearly provided relevant data about blood
alcohol level in the individual, neuroimaging data may not provide
sufficiently relevant information about the mental state in question.
(249) This would be a fact-specific inquiry, but surely would be
informed by the likely unsettled nature of the neuroscience. (250)

One alternative, proposed by legal scholar Nita Farahany, draws on
intellectual property law to inform Fourth Amendment analysis. (251)
Arguing that “[s]ecrecy is a far more important privacy interest
than seclusion in the information age,” (252) Farahany suggests
that “in the not-too-distant future, the government might have the
ability to imperceptibly and noninvasively obtain information directly
from a suspect’s brain.” (253) If such a future comes to pass,
this Article’s Fourth Amendment analysis will need revision. But
such a future is not inevitable, short of the advent of a new, currently
unforeseen technology. Current fMRI and EEG techniques are nowhere near
as imperceptible as intercepting radio signals or tapping into
electrical wires. With fMRI, the subject must lie very still in a noisy
environment, with his head inside a small enclosure. An MRI machine can
reach noise levels of 130 decibels (by comparison, a jackhammer is 95
decibels and sandblasting is estimated to be 125 decibels). (254) EEG
protocols require the careful placement of electrodes on one’s
scalp to record electrical activity. (255) Both EEG and fMRI thus remain
very different experiences than the far more unobtrusive experience of
government hacking into your wireless Internet network while you surf
the Web.
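
Because the decibel scale is logarithmic, the gap between these figures
is larger than it may appear. A short calculation, using only the
numbers quoted above, makes the point:

    # Decibels are logarithmic: each 10 dB step is a tenfold change in
    # sound intensity.
    mri_db, jackhammer_db = 130, 95
    intensity_ratio = 10 ** ((mri_db - jackhammer_db) / 10)
    print(f"~{intensity_ratio:,.0f} times the sound intensity of a jackhammer")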

C. Fifth Amendment

The Fifth Amendment states, in relevant part: “No person …
shall be compelled in any criminal case to be a witness against
himself.” (256) Fifth Amendment jurisprudence is complex. (257) In
analyzing questions that have been raised under the Fifth Amendment, the
Supreme Court has considered issues such as:

   (1) who is protected by the privilege; (2) what constitutes
   "compulsion"; (3) what type of compelled evidence is subject to the
   protection of the privilege; (4) what uses of that evidence are
   barred in a criminal case; (5) when and how may a protected person
   exercise the privilege; and (6) what governmentally imposed burdens
   to impair the exercise of the privilege as [sic] to be
   unconstitutional. (258)

Slicing through this complexity, analysis of neuroimaging mind
reading has returned, in virtually every instance, to a question of
characterization: Is the neuroimaging evidence to be considered
physical, like blood or breath, or testimonial, like speech or a written
diary entry? (259) If it is physical, so the logic goes, then it is not
privileged. If it is testimonial, it is. (260)

Many scholars, most recently Professors Farahany and Pustilnik,
have rightly criticized this dichotomy. (261) Dov Fox has also laid out
an alternative path forward, arguing that:

   The physical/testimonial distinction underlying self-incrimination
   doctrine is unlikely to protect a criminal suspect from the
   compelled use of fMRI or EEG. Yet this key distinction presupposes
   a flawed conception of mind/body dualism. Brain imaging techniques
   that deprive individuals of control over their thoughts violate the
   "spirit and history of the Fifth Amendment." (262)

There is good reason to review and revise this doctrine, but even
without scholars’ suggested innovations, the distinction between
the physical and the testimonial, though not perfect, can be readily
applied to new neuroimaging technologies. In the typology developed
here, physical evidence can be thought of as brain reading with
neuroimaging, and testimonial evidence as mind reading with
neuroimaging.

The touchstone in previous scholarly analyses is often Schmerber v.
California, (263) and with good reason. In Schmerber, the defendant was
driving while intoxicated, and after crashing his car, was admitted to
the hospital. (264) In the hospital, and at the instruction of a police
officer, a doctor took a sample of the defendant’s blood. (265) The
blood was analyzed and found to contain a sufficient level of alcohol to
suggest intoxication, and the analytic report was introduced at trial.
(266) The defendant argued that this was a violation of his Fifth
Amendment privilege, but the Court disagreed and laid out its
influential distinction between “‘communications’ or
‘testimony’” and “‘real or physical’”
evidence. (267) The Court noted that the Fifth Amendment “offers no
protection against compulsion to submit to fingerprinting,
photographing, or measurements, to write or speak for identification, to
appear in court, to stand, to assume a stance, to walk, or to make a
particular gesture.” (268) As one state court observed, “No
volition–that is, no act of willing–on the part of the mind of the
defendant is required” for these types of evidence. (269) Rather
“[t]he physical facts speak for themselves; no fears, no hopes, no
will of the prisoner to falsify or to exaggerate could produce or create
a resemblance of her finger prints or change them in one line, and
therefore there is no danger of error being committed or untruth
told.” (270)

One way to apply this rule is to argue that all brain data is
physical (and not at all testimonial). As Hank Greely and Anthony Wagner
remind us, “An fMRI scan is nothing more than a computer record of
radio waves emitted by molecules in the brain. It does not seem like
‘testimony.’” (271) An EEG might also be described as
just a computer record of electrical waves.

But recall that the methods by which the neuroimaging data is
collected require subject response to questions. This places such
techniques squarely in the testimonial category. (272) Moreover, even if
the subject is not required to answer questions as part of the research
task, if the recorded brain data is subsequently used to make an
inference about a mental state, then that use should be seen as
rendering the data testimonial, because it elicits information about a
mental state from (fMRI or EEG measures of) physical brain tissue. If
the brain data remain silent (that is, they are not analyzed), they
communicate nothing about the subject’s mental state (and are
similarly irrelevant for introduction as evidence in later criminal
proceedings). But when the intent is to analyze the data, and to make an
inference about the subject’s mind, it crosses the line into mind
reading (and thus testimonial evidence).

Professor Pustilnik suggests an additional way in which the case
law might be understood by pointing to the comparatively recent case of
Pennsylvania v. Muniz, (273) in which the Supreme Court seemed to
recognize an interest in mental privacy itself. (274) In Muniz, a
defendant was asked a series of questions in custody without receiving a
Miranda warning. (275) The purpose of these questions was to gauge the
defendant’s sobriety. (276) The Commonwealth argued in its brief
that there was no Fifth Amendment privilege because “the inference
concerns ‘the physiological functioning of [Muniz’s]
brain’” and the brain is real, physical evidence. (277) The
Court responded that “the question is not whether a suspect’s
‘impaired mental faculties’ can fairly be characterized as an
aspect of his physiology, but rather whether Muniz’s response to
the sixth birthday question that gave rise to the inference of such an
impairment was testimonial in nature.” (278) The Court found that
“the privilege is [properly] asserted to spare the accused … from
having to share his thoughts and beliefs with the Government.”
(279) Professor Pustilnik argues that Muniz suggests “a distinction
between two sets of physical signs: (1) the nonprivileged set of
physical signs that does not reveal evoked mental contents and (2) the
privileged set of physical signs that does.” (280) This distinction
maps nicely onto the distinction this Article makes between brain
reading (which should not be privileged) and mind reading (which should
be privileged).

When courts have contemplated mind reading devices, they have suggested
that they, too, would find a way to offer these constitutional protections to
defendants. For instance, consider this court’s musing in dicta:

   [S]hould lie detector and computer technology advance to permit an
   analysis of brainwave function and physiological effects to reflect
   thought, a machine might be developed to read the mind....
   [B]ecause the existence of one's thoughts, one's cognitive process,
   is a "foregone conclusion," the contents of any thoughts one
   voluntarily creates could be used against one in a criminal
   proceeding under [a strict interpretation of] the Fisher-Doe
   analysis. (281)

This, the court suggested, would be a problematic interpretation,
as it would make Fifth Amendment protections redundant with those
already provided by the Fifth and Fourteenth Amendments’ Due
Process Clauses. (282) “If the Fifth Amendment is to stand for our
constitutional preference for an accusatorial system,” the court
argued, “it must protect the divulgence of the contents of
one’s mind, one’s thought processes, when those testimonial
divulgences–be they oral or written communications–would
self-incriminate.” (283)

Not all agree that existing doctrine is sufficient. Professor
Farahany, for example, argues that “[n]either ‘physical’
nor ‘testimonial’ accurately describes neurological evidence.
Neuroscience involves noninvasive testing of the physical brain to gain
evidence that has physical form. Just as nodding the head can
communicate a response, so too can neurological changes in the
brain.” (284) Because “[m]ental privacy is not sacrosanct
under either the Fourth or Fifth Amendment, which provide procedural
safeguards but not substantive ones to adequately protect mental
privacy,” (285) Farahany suggests that mental privacy may be at
risk in the future:

   A society interested in robust cognitive freedom would likely wish
   to protect its citizens from unwarranted detection of automatic,
   memorialized, and uttered evidence in the brain. That current
   self-incrimination doctrine is unlikely to do so should give us
   pause. Private thoughts, private memories, and undisclosed ideas in
   the mind help to define our sense of autonomy and inviolate
   personality. A sphere of private rumination is essential to our
   fundamental concepts of freedom of thought, freedom of expression,
   freedom of will and individual autonomy. Whether or not we preserve
   that sphere may come to define us as a society as emerging
   neuroscience begins to take hold. And yet none of our constitutional
   doctrines currently contemplate or afford adequate
   protection against such intrusions. (286)

Professor Farahany’s logic has appeal. If we care primarily
about seclusion, then our analysis should focus on the physical
intrusiveness of the search, and as neuroscience methods allow for
increasingly less intrusive searches, they might increasingly be
allowed. (287) Less intrusive brain searches may well lead to more brain
data in the hands of the government. But making use of this brain data
to reliably infer mental states will require a significantly expanded
neuroscientific knowledge base about the brain-mind relationship. It is
not clear that such expansion will occur in the near future.

V. ADDITIONAL THREATS TO MENTAL PRIVACY

There may be only limited threats to mental privacy from
government-compelled neuroimaging mind reading techniques in the
criminal law arenas governed by the Fourth and Fifth Amendments. This
does not mean, however, that there are no real threats to mental
privacy; it simply means they are likely to appear in other places. This
Part briefly discusses several areas of concern. (288)

A. Competency and Parole

Brain-based methods to assess competency may pose a significant
threat to individuals’ mental privacy. (289) Although defendants
typically raise the competence question, it can also be raised by the
state. (290) The government may file a motion to determine the
defendant’s competency. (291) As a result, “the court may
order that a psychiatric or psychological examination of the defendant
be conducted, and that a psychiatric or psychological report be filed
with the court.” (292) The standard for whether a defendant has
competency to proceed is “whether he has sufficient present ability
to consult with his lawyer with a reasonable degree of rational
understanding and whether he has a rational as well as factual
understanding of the proceedings against him.” (293) Courts focus
specifically on the defendant’s competency to stand trial, not
competence more generally. (294) Although it is not the norm, modern
neuroimaging techniques are now supplementing competency evaluations in
some cases. (295)

The uncertainty of brain-mind-behavior linkages becomes important
in this context. Because the cases concern the defendant’s
demonstrated behavioral abilities to stand trial, “the existence of
anatomical abnormalities [in a defendant’s brain] is irrelevant in
itself, provided that the court determines that the defendant displays
sufficient cognitive capabilities to fulfill the Dusky
[requirements].” (296) As a policy matter, the question is: Does
adding the brain evidence improve our assessment of competence? (297)
But as a matter of mental privacy protection, the question is: What if
the defendant refuses to cooperate with the exam? The Supreme Court has
held that “[a] criminal defendant, who neither initiates a
psychiatric evaluation nor attempts to introduce any psychiatric
evidence, may not be compelled to respond to a psychiatrist if his
statements can be used against him at a capital sentencing
proceeding.” (298) If a defendant does not cooperate, the American
Bar Association’s (ABA) guidelines suggest that the defendant can
be observed for a limited period of time. (299) Perhaps someday this
will include observation via a brain monitoring device.

Turning from the start of proceedings to the end, a mental health
evaluation may be required as a precondition for parole. (300) At
present, such mental health regulations do not explicitly require
neuroimaging evidence as part of the mental health exam, nor do they
explicitly exclude neuroimaging evidence. This might change, and one can
imagine a future parole office where individuals, in addition to
undergoing a drug test, go through a battery of neuroimaging tests.
(301)

B. Police Investigation, Employee Screening, and National Defense

Although the polygraph is inadmissible in most courts, it is used
regularly during police work. (302) And even though the polygraph is
outlawed as a screening device for most private employers through the
Employee Polygraph Protection Act of 1988, (303) there are exceptions.
(304) In short, the polygraph is used extensively despite significant
legal protections against the use of polygraph results in criminal
proceedings and employee hiring. The same may become true of
neuroimaging mind reading if it is thought to aid in criminal
investigation and employee screening.

Indeed, there is evidence that such methods may be useful in
terrorist interrogations, (305) and a U.S. senator asked the Government
Accountability Office (GAO) to report on the value of brain
fingerprinting for national security. (306) Whether such mind reading
with neuroimaging is employed in practice will depend on how sensitive
and precise the technology proves to be and on how resistant it is to
countermeasures. (307)

C. Future Developments

Discussions of brain-based mind reading with neuroimaging
necessarily involve predictions about the future. These predictions, as
Murphy and Greely emphasize, are difficult to make. (308) But these
predictions about the future have ramifications for the present because
they focus our limited resources and attention on particular potential
threats to mental privacy. Although it is important to be prepared, it
is also important to stay grounded in what is likely to develop.

What follows is a brief, and tentative, prediction about the future
use of neuroimaging data to make accurate assessments of higher-order
cognition in legally relevant ways.

When considering the striking visual reconstructions from fMRI
data, one should find solace in the observation that “[t]he visual
cortex is relatively easy to read compared to other parts of the brain
that work together and influence our private thoughts.” (309) Thus,
impressive results about visual reconstruction may not be soon followed
by reconstructions of memories and higher-level cognition. (310)

That said, there are important developments to note. First, the
mobility of brain reading technology is improving rapidly. (311) Second,
functional near-infrared imaging (fNIRI) is now being explored as an
additional way to monitor brain function. (312) This approach
“capitalizes on the absorption and scattering properties of
near-infrared light to provide information about brain activity.”
(313) There is significant value in such a technology: “For the
psychiatric researcher, these additional strengths can bring otherwise
previously unthinkable projects into the realm of possibility.”
(314) A significant restriction is that it can only monitor the cortex.
(315) But for some applications, that may not be an obstacle. (316)
Third, researchers are improving their experimental techniques. A
scientific review in 2008 asked the question: “Brain Imaging and
Brain Privacy: A Realistic Concern?” (317) After reviewing sixteen
published fMRI studies relevant to the question of distinguishing
individual differences in psychological traits, the authors concluded
that “a modest degree of brainotyping capability already exists.
The potential use of functional brain imaging to gain knowledge of
someone’s psychological traits is not science fiction, but rather a
realistic possibility, albeit limited in important ways.” (318)
Such a conclusion would seem to suggest that we are on the verge of a
mental privacy crisis, especially when combined with even a modest
belief in scientific progress. But is this so? Is a mental privacy panic
justified?

Neuroscientist Emily Murphy and legal scholar Hank Greely have
argued that “both the excitement about and the fear of the
consequences of mind reading are too extreme.” (319) It is true
that, despite major technical barriers preventing detailed mind reading,
the law might still be interested in using “good enough”
science. (320) As such, we need continued and careful monitoring of
developments in neurolaw and mental privacy. But we need not fall into
full-scale panic. Putting an MRI scanner in the police station will not
trample on our mental freedoms because the complexity of the mind-brain
relationship will prevent the government from using the brain data to
reliably read individual minds.

Just how complex is the brain? It has been called “perhaps the
most complex entity known to science.” (321) Nobel laureate Roger
Sperry has commented that “the centermost processes of the brain
with which consciousness is presumably associated are simply not
understood. They are so far beyond our comprehension that no one I know
of has been able to imagine their nature.” (322) MIT neuroscientist
Sebastian Seung encourages us to think about a map of the brain (which
he calls the “connectome”) as similar to the “Where We
Fly” maps used by airlines: “What you need to imagine is that
every city is a neuron, and every flight between cities is a
connection,” except that “in our brains we would need to start
with a hundred billion cities and thousands of flights per city.”
(323) Understanding this map, and even the relevant subparts of it,
will take an extremely long time. Even with rapid advances in
neuroscience, it seems unlikely that in this generation or the next we
will uncover enough about the brain to do the sort of mind reading
imagined by Philip K. Dick. (324)

This is not to say that we will not learn much that has clinical
applications; indeed, we already have. And this is not to say that
neuroscience will not change law and policy; again, it already has. This
is to say that as amazing as neuroscience is, it remains science and not
science fiction. (325)

CONCLUSION

This Article has argued that the legal system is readily equipped
to provide citizens with adequate protection against
government-compelled or coerced mind reading with neuroimaging. The law
has seen, and protected citizens from, previous analogs, and the
technology itself is unlikely to be as dangerous as some prognosticators
believe. We should certainly be concerned about the government tracking
our minds, but we should be most concerned about government carrying out
that tracking by observing and inferring mental states from our
behavior, not our brains.

FRANCIS X. SHEN, Associate Professor, University of Minnesota Law
School; Executive Director of Education and Outreach, MacArthur
Foundation Research Network on Law and Neuroscience. For helpful
research assistance, the author thanks John Frost, Martha Kramer, Alissa
Mitchell, and Jason Reed. For helpful comments, the author thanks Adam
Steiner. The author notes that this work is Ad Maiorem Dei Gloriam.

(1.) Long Beach City Emps. Ass’n. v. City of Long Beach, 719
P.2d 660, 663 (Cal. 1986).

(2.) JAMIE WARD, THE STUDENT’S GUIDE TO COGNITIVE NEUROSCIENCE
49 (2d ed. 2010).

(3.) Nikos K. Logothetis, What we can do and what we cannot do with
fMRI, 453 NATURE 869, 869 (2008).

(4.) See infra Part I.A.

(5.) See infra Part II.A.

(6.) I use the term “machine aided neuroimaging”
throughout to distinguish this technology from other forms of mind
reading, including mind reading that may use technology other than
neuroimaging. See infra Part II.B. The Article focuses primarily on fMRI
and EEG methods, but employs a broad conceptualization of
“neuroimaging” to include fMRI, EEG, QEEG, MEG, PET, SPECT,
and other related techniques. To improve readability, the Article at
times shortens the phrase “machine-aided neuroimaging” to
simply “neuroimaging.”

(7.) Numb3rs (CBS television series Jan. 23, 2005 to Mar. 12,
2010).

(8.) SALT (Columbia Pictures 2010).

(9.) MINORITY REPORT (20th Century Fox 2002).

(10.) Minority Report, IMDB, http://www.imdb.com/title/tt0181689/
(last visited Feb. 6, 2013).

(11.) See MINORITY REPORT, supra note 9. The failure of this
technology gives rise to the plot tensions in the movie.

(12.) INCEPTION (Warner Bros. Pictures 2010).

(13.) Mara Boundy, Note, The Government Can Read Your Mind: Can the
Constitution Stop It?, 63 HASTINGS L.J. 1627, 1628 (2012).

(14.) See infra Part I.

(15.) It should be emphasized that excellent scholars have
carefully examined the topic free from any such panic elements. See,
e.g., I KNOW WHAT YOU’RE THINKING: BRAIN IMAGING AND MENTAL PRIVACY
(Sarah D. Richmond et al. eds., 2012); see also Nita A. Farahany,
Incriminating Thoughts, 64 STAN. L. REV. 351, 406 (2012) ]hereinafter
Farahany, Incriminating Thoughts]; Nita A. Farahany, Searching Secrets,
160 U. PA. L. REV. 1239 (2012) [hereinafter Farahany, Searching
Secrets].

(16.) Borrowed from DOUGLAS ADAMS, THE HITCHHIKER’S GUIDE TO
THE GALAXY 50 (Random House 1997) (1979).

(17.) See Emily R. Murphy & Henry T Greely, What Will Be the
Limits of Neuroscience-Based Mind Reading in the Law?, in OXFORD
HANDBOOK OF NEUROETHICS 642 (Judy Illes & Barbara J. Sahakian eds.,
2011) (“The law reads minds all the time, though not through
technical means.”). More precisely, we all become mind readers as
we develop cognitively, unless we suffer a developmental challenge such
as autism. MICHAEL S. GAZZANIGA & TODD HEATHERTON, PSYCHOLOGICAL
SCIENCE 452-53 (2d ed. 2006). Whether this is a unique human trait is
the topic of much debate. See MICHAEL S. GAZZANIGA, HUMAN: THE SCIENCE
BEHIND WHAT MAKES US UNIQUE 49-54 (2008).

(18.) Sarah Richmond, Introduction to I KNOW WHAT YOU’RE
THINKING, supra note 15, at 1, 3.

(19.) GAZZANIGA, supra note 17, at 49.

(20.) See generally Francis X. Shen et al., Sorting Guilty Minds,
86 N.Y.U.L. REV. 1306 (2011).

(21.) Sharon Begley, Mind Reading Is Now Possible: A Computer Can
Tell with 78 Percent Accuracy When Someone Is Thinking About a Hammer
and Not Pliers, NEWSWEEK, Jan. 12, 2008, at 22, available at
http://www.thedailybeast.com/
newsweek/2008/01/12/mind-reading-is-now-possible.print.html.

(22.) Stephen J. Morse, Avoiding Irrational NeuroLaw Exuberance: A
Plea for Neuromodesty, 62 MERCER L. REV. 837 (2011).

(23.) “[W]e are aware of the mental states of our fellow human
beings on the basis of what they do and say” and the origins of
these “traditional forms of mind-reading … predate the beginnings
of recorded history.” Tim Baynes, How to Read Minds, in I KNOW WHAT
YOU’RE THINKING, supra note 15, at 41, 41.

(24.) See generally OTHER MINDS: HOW HUMANS BRIDGE THE DIVIDE
BETWEEN SELF AND OTHERS (Bertram F. Malle & Sara D. Hodges eds.,
2005) (providing a number of different disciplinary perspectives on how
we do mind reading). As scholars have pointed out, and as everyday
experience confirms, “Whether we are sizing someone up or seducing
him or her, assigning blame or extending our trust, we are very nearly
always performing the ordinary magic of mindreading.” Daniel R.
Ames, Everyday Solutions to the Problem of Other Minds: Which Tools are
Useful When?, in id., at 158, 158. Defined in this way, we can see that
airline employees are mind reading when they assess your answer to the
question, “Did you pack your own bags?”; that public school
officials are mind reading when they assess a tardy student’s
answer to the question, “Why were you late to school?”; and
that a highway patrolman is mind reading when he asks a driver,
“Have you been drinking tonight?” In the latter two examples,
the government may require additional information to further assess the
mental state. For instance, a doctor’s note will help the principal
assess whether or not the student is lying about the reason for being
late, and the odor of alcohol (or lack thereof) will help a patrolman
assess whether the driver is being honest. In all of these instances, a
government official is trying to assess the mental state of a citizen,
and thus is engaging in “mind reading” in the sense the term
is used in this Article.

(25.) For a lengthier and more comprehensive introduction to
neurolaw, see, for example, A PRIMER ON CRIMINAL LAW AND NEUROSCIENCE
(Stephen J. Morse & Adina L. Roskies eds., forthcoming May 2013); 13
CURRENT LEGAL ISSUES, LAW AND NEUROSCIENCE (Michael Freeman et al. eds.,
2011); LAW, MIND AND BRAIN (Michael Freeman & Oliver R. Goodenough
eds., 2009); NEUROIMAGING IN FORENSIC PSYCHIATRY: FROM THE CLINIC TO THE
COURTROOM (Joseph R. Simpson ed., 2012); Henry T. Greely & Anthony
D. Wagner, Reference Guide on Neuroscience, in FED. JUDICIAL CTR. ET
AL., REFERENCE MANUAL ON SCIENTIFIC EVIDENCE (3d ed. 2011); Owen D.
Jones & Francis X. Shen, Law and Neuroscience in the United States,
in INTERNATIONAL NEUROLAW: A COMPARATIVE ANALYSIS 349 (Tade Spranger
ed., 2012); Teneille Brown & Emily Murphy, Through A Scanner Darkly:
Functional Neuroimaging as Evidence of a Criminal Defendant’s Past
Mental States, 62 STAN. L. REV. 1119 (2010); Stacey A. Tovino,
Functional Neuroimaging and the Law: Trends and Directions for Future
Scholarship, 7 AM. J. BIOETHICS 44 (2007).

(26.) See Jones & Shen, supra note 25, at 374 (citing Nita A.
Farahany, An Empirical Study of Brains and Genes in U.S. Criminal Law
(2011) (unpublished manuscript) (on file with Vanderbilt University Law
School)).

(27.) See, e.g., Francis X. Shen, Neurolegislation & Juvenile
Justice, 46 LOY. L.A.L. REV. (forthcoming 2013).

(28.) See Francis X. Shen, The Law and Neuroscience Bibliography:
Navigating the Emerging Field of Neurolaw, 38 INT’L J. LEGAL INFO.
352 (2010).

(29.) OWEN D. JONES, JEFFREY D. SCHALL, & FRANCIS X. SHEN, LAW
AND NEUROSCIENCE (forthcoming 2013).

(30.) Id.

(31.) See, e.g., Education and Outreach, MACARTHUR FOUND. RESEARCH
NETWORK ON LAW & NEUROSCIENCE, http://www.lawneuro.org/outreach.php
(last visited Jan. 27, 2013).

(32.) See, e.g., LAW AND NEUROSCIENCE BLOG,
http://lawneuro.org/blog/(last visited Jan. 27, 2013).

(33.) See Amy Wolf, Landmark law and neuroscience network expands
at Vanderbilt, VANDERBILT UNIV. (Aug. 24, 2011),
http://news.vanderbilt.edu/2011/08/grant-willexpand-law-neuroscience-network/. See generally MACARTHUR FOUND. RESEARCH NETWORK ON LAW &
NEUROSCIENCE, www.lawneuro.org (last visited Jan. 27, 2013).

(34.) See, e.g., Joshua Greene & Jonathan Cohen, For the Law,
Neuroscience Changes Nothing and Everything, 359 PHIL. TRANSACTIONS
ROYAL SOC’Y: BIOLOGICAL SCI. 1775 (2004).

(35.) See, e.g., Michael S. Pardo & Dennis Patterson,
Philosophical Foundations of Law and Neuroscience, 2010 U. ILL. L. REV.
1211 (2010); Adam J. Kolber, Paper Presented at Rutgers School of
Law-Camden Law and Neuroscience Conference: Will There Be a Neurolaw
Revolution? (Sept. 7-8, 2012), available at
http://lawandphil.rutgers.edu/sites/lawandphil.rutgers.edu/files/kolber.pdf.

(36.) See, e.g., Eyal Aharoni et al., Can Neurological Evidence
Help Courts Assess Criminal Responsibility? Lessons from Law and
Neuroscience, 1124 ANNALS N.Y. ACAD. SCI. 145 (2008); Shelley Batts,
Brain Lesions and Their Implications in Criminal Responsibility, 27
BEHAV. SCI. & L. 261 (2009); Theodore Y. Blumoff, The
Neuropsychology of Justifications and Excuses: Some Problematic Cases of
Self-Defense, Duress, and Provocation, 50 JURIMETRICS J. 391 (2010);
Nita A. Farahany & James E. Coleman, Jr., Genetics, Neuroscience,
and Criminal Responsibility, in THE IMPACT OF BEHAVIORAL SCIENCES ON
CRIMINAL LAW 183 (Nita A. Farahany ed., 2009); David Eagleman, The Brain
on Trial, ATLANTIC, July-Aug. 2011, at 112; DEBORAH W. DENNO, CHANGING
LAW’S MIND: HOW NEUROSCIENCE CAN HELP US PUNISH CRIMINALS MORE
FAIRLY AND EFFECTIVELY (forthcoming n.d.).

(37.) See CONSCIOUS WILL AND RESPONSIBILITY: A TRIBUTE TO BENJAMIN
LIBET (Walter Sinnott-Armstrong & Lynn Nadel eds., 2010).

(38.) See THE OXFORD HANDBOOK OF NEUROETHICS (Judy Illes &
Barbara J. Sahakian eds., 2011).

(39.) See Henry T. Greely, Prediction, Litigation, Privacy, and
Property: Some Possible Legal and Social Implications of Advances in
Neuroscience, in NEUROSCIENCE AND THE LAW: BRAIN, MIND, AND THE SCALES
OF JUSTICE 114 (Brent Garland ed., 2004); Adam J. Kolber, The
Experiential Future of the Law, 60 EMORY L.J. 585 (2011).

(40.) Robert P. Granacher, Jr., Traumatic Brain Injury, in
NEUROIMAGING IN FORENSIC PSYCHIATRY, supra note 25, at 44.

(41.) Susan E. Rushing, Daniel A. Pryma, & Daniel D. Langleben,
PET and SPECT, in NEUROIMAGING IN FORENSIC PSYCHIATRY, supra note 25, at
3, 20-21.

(42.) Kolber, supra note 35, at 16.

(43.) Id. at 16-28.

(44.) See 34 AM. JUR. 3D Proof of Facts [section] 1 (2012); id.
[section] 363; Donald J. Nolan & Tressa A. Pankovits, High-Tech
Proof in Brain Injury Cases, TRIAL, June 2005, at 26 (2005).

(45.) W.M. Moldoff, Annotation, Admissibility in Civil Action of
Electroencephalogram, Electrocardiogram, or Other Record Made by
Instrument Used in Medical Test, or of Report Based upon Such Test, 66
A.L.R.2D 536 (1959).

(46.) 8 AM. JUR. 3D Proof of Facts 145 [section] 1 (1990)
(“The escalating use and development of CT since the 1970s has made
it a well-established technique.”).

(47.) Nathan J. Kolla & Jonathan D. Brodie, Application of
Neuroimaging in Relationship to Competence to Stand Trial and Insanity,
in NEUROIMAGING IN FORENSIC PSYCHIATRY, supra note 25, at 147, 147-48.

(48.) Judith G. Edersheim, Rebecca Weintraub Brendel, & Bruce
H. Price, Neuroimaging, Diminished Capacity and Mitigation, in
NEUROIMAGING IN FORENSIC PSYCHIATRY, supra note 25 at 163-64 (Joseph R.
Simpson ed., 2012).

(49.) See, e.g., Sexton v. State, 997 So. 2d 1073, 1082-85 (Fla.
2008) (counsel’s decision to rely on brain scan evidence to prove
statutory mitigation was reasonable in case involving defendant with
“history of bizarre sexual and criminal behavior”).

(50.) 20 C.F.R. ch. III, pt. 404, subpt. P, app. 1 [section] 12.02
(2012).

(51.) 3 SOC. SEC. LAW & PRAC. [section] 42:147 n.1 (“In
some cases, the origin of the dysfunction is readily identified with
diagnostic tools such as computed tomography (CAT) scanning of the
brain, magnetic resonance imaging (MRI) of the brain, or
electroencephalography (EEG) which reveals the electrical brain wave
patterns.”).

(52.) See, e.g., Kliber v. Soc. Sec. Admin., 794 F. Supp. 2d 1025,
1030, 1034 (D. Minn. 2011).

(53.) See, e.g., Jones & Shen, supra note 25, at 354.

(54.) Id. at 305.

(55.) Susan M. Wolf, Eva B. Stensvad, & Grace Deason, How Is
Neuroscience Used in Criminal Court? Analysis of Judicial Decisions
1994-2009 32, tbl.2 (Dec. 29, 2010) (unpublished manuscript) (on file
with author).

(56.) Adam J. Kolber, Pain Detection and the Privacy of Subjective
Experience, 33 AM. J.L. & MED. 433, 454-55 (2007).

(57.) See Jones & Shen, supra note 25, at 350-51; see also
JONES ET AL., supra note 29.

(58.) An examination of this overlooked history is useful for at
least two reasons. First, it shows us the several ways by which law may
reconcile its need for immediate decisionmaking with the uncertainty
inherent in probabilistic neuroscience data. Second, this history allows
us to trace how instances of new brain discoveries became codified law
just a few decades later.

(59.) S.J.M. Smith, EEG in the Diagnosis, Classification, and
Management of Patients with Epilepsy, 76 J. NEUROLOGY NEUROSURGERY &
PSYCHIATRY ii2, ii2 (2005).

(60.) See id.

(61.) See, e.g., W. Grey Walter, Electro-Encephalography in the
Study of Epilepsy, 85 BRIT. J. PSYCHOL. 932, 933 (1939).

(62.) Epilepsy, PUBMED HEALTH,
http://www.ncbi.nlm.nih.gov/pubmedhealth/PMH0001714/ (last visited Jan.
31, 2013).

(63.) See Irwin N. Perr, Epilepsy and the Law, 128 J. NERVOUS &
MENTAL DISEASE 262, 265 (1959) (“The electroencephalogram has
become an increasingly important tool in evaluating and understanding
epilepsy. It has also become quite useful in court cases for the same
reasons [sic].”).

(64.) See, e.g., id. (further noting that a factor in EEG’s
popularity “is that electroencephalography is supposedly an
objective procedure, something which will be proof of something, and the
lawyer–often with a penchant for oversimplification-is prone to look
upon the EEG as a definitive authority. This has led to situations where
the EEG has been grossly misused and subsequently maligned.”).

(65.) Id. (internal quotation marks omitted). The law used to treat
epileptics much differently than it does today. See Kathryn Kramer,
Shifting and Seizing: A Call to Reform Ohio’s Outdated Restrictions
on Drivers with Epilepsy, 22 J.L. & HEALTH 343, 351-52 (2008)
(“Until the 1950s, individuals with epilepsy were legally denied
the right to marry, the right to drive a car, and the right to obtain
employment. Some were even subjected to involuntary sterilization to
preclude reproduction. It was not until 1982 that the last state
repealed its law precluding individuals with epilepsy from
marrying.”) (footnotes omitted).

(66.) Irwin N. Perr, Epilepsy and the Law, 7 CLEV.-MARSHALL L. REV.
280, 287 (1958).

(67.) Id.

(68.) Pub. L. No. 96-265, 94 Stat. 441 (1980).

(69.) See id. pmbl. & tits. II-III, [section][section] 201-311;
Chronology, SOC. SEC. ADMIN., http://www.ssa.gov/history/1980.html (last
visited Jan. 31, 2013).

(70.) Federal Old Age, Survivors, and Disability Insurance
Benefits; Supplemental Security Income for the Aged, Blind, and
Disabled, 45 Fed. Reg. 55,566, 55,608 (Aug. 20, 1980) (codified at 20
C.F.R. pts. 404, 416 (2012)).

(71.) Id. (emphasis added).

(72.) See, e.g., Deuter v. Schweiker, 568 F. Supp. 1414, 1417 (N.D.
Ill. 1983).

(73.) Bradley v. Bowen, 660 F. Supp. 276, 280 (W.D. Ark. 1987).

(74.) Id. at 281.

(75.) See Technical Revisions to Medical Criteria for
Determinations of Disability, 65 Fed. Reg. 6,929 (Feb. 11, 2000) (to be
codified at 20 C.F.R. pts. 404, 416 (2012)) (“We … propose to
remove the requirement for electroencephalogram (EEG) evidence to
support the existence of epilepsy throughout the neurological listings
with the exception of cases involving nonconvulsive epilepsy in
children. This is the only category of epilepsy in which an EEG is the
definitive diagnostic tool; in all other situations of epilepsy, it is
rare for an EEG to confirm the presence of a seizure disorder.”).

(76.) See Technical Revisions to Medical Criteria for
Determinations of Disability, 67 Fed. Reg. 20,018 (Apr. 24, 2002)
(codified at 20 C.F.R. pt. 404 (2012)).

(77.) Id. at 20,019 (emphasis added).

(78.) See, e.g., Salerno v. Astrue, No. 10 C 2582, 2011 WL 6318716,
at *10 (N.D. Ill. Dec. 16, 2011) (“In sum, given the unknown
etiology of Plaintiff’s seizure activity, the lack of MRI and CT
abnormalities is not unexpected. If some of Plaintiff’s seizures
were not epileptic in nature, the MRI and CT tests would be
normal.”); Rebrook v. Astrue, No. 1:09CV50, 2010 WL 2233672, at *18
(N.D.W. Va. May 14, 2010) (“[T]here is absolutely no requirement or
even mention of positive EEG’s, CT scans or MRI’s in the
revised listings.”), adopted by No. 1:09CV50, 2010 WL 2292668
(N.D.W. Va. June 3, 2010). More generally, courts have emphasized that
an administrative law judge may not substitute his judgment for that of
a trained physician, as would occur in the scenario where such a judge
barred a disability claim for epilepsy because of a negative EEG
finding, notwithstanding a physician’s diagnosis that the claimant
had the condition. See, e.g., Rohan v. Chater, 98 F.3d 966, 970 (7th
Cir. 1996) (“[A]s this Court has counseled on many occasions, ALJs
must not succumb to the temptation to play doctor and make their own
independent medical findings.”). Epilepsy advocacy groups commonly
remind epileptics that a normal EEG does not rule out the condition.
See, e.g., What if It’s Normal?, EPILEPSY THERAPY PROJECT,
http://www.epilepsy.com/EPILEPSY/ EEG_NORMAL (last visited Jan. 31,
2013).

(79.) State v. Allen, 241 P.3d 1045, 1064 (Mont. 2010) (footnote
omitted).

(80.) Steven Poole, Your Brain on Pseudoscience: The Rise of
Popular Neurobollocks, NEW STATESMAN, Sept. 6, 2012,
http://www.newstatesman.com/print/188850.

(81.) Others have noted this overreaction as well. See, e.g.,
Daniel D. Langleben, Dan F.X. Willard, & Jane C. Moriarty, Brain
Imaging of Deception, in NEUROIMAGING IN FORENSIC PSYCHIATRY, supra note
25, at 217, 227 (“Overreactions about the potential moral concerns
over fMRI lie detection stem in part from misrepresentations in the lay
and popular press, which have described the technology more as a
‘mind-reading’ technique than a method of discrimination
between two rather simple behaviors…. Though mind-reading with fMRI is
no longer completely in the realm of science-fiction, it is
significantly more complex and less developed than fMRI-based lie
detection….”) (citations omitted).

(82.) Boundy, supra note 13, at 1643.

(83.) Matthew B. Holloway, One Image, One Thousand Incriminating
Words: Images of Brain Activity and the Privilege against
Self-Incrimination, 27 TEMP. J. SCI. TECH. & ENVTL. L. 141, 143
(2008).

(84.) William Federspiel, 1984 Arrives: Thought(Crime), Technology,
and the Constitution, 16 WM. & MARY BILL RTS. J. 865, 900 (2008).

(85.) Jay Stanley, High-Tech “Mind Readers” Are Latest
Effort to Detect Lies, ACLU (Aug. 29, 2012, 11:41 AM),
http://www.aclu.org/blog/technology-and-liberty/hightech-mind-readers-are-latest-effort-detect-lies.

(86.) CHRISTIANS AGAINST MENTAL SLAVERY, http://www.slavery.org.uk/
(last visited Jan. 31, 2013).

(87.) Celia Gorman, The Mind-Reading Machine: Veritas Scientific is
developing an EEG helmet that may invade the privacy of the mind, IEEE
SPECTRUM, July 2012,
http://spectrum.ieee.org/biomedical/diagnostics/the-mind
reading-machine.

(88.) Id. (internal quotation marks omitted).

(89.) Id. (internal quotation marks omitted).

(90.) Id. (internal quotation marks omitted).

(91.) See How Technology May Soon “Read” Your Mind, CBS
NEWS, http://www.cbsnews.com/stories/1998/07/08/60minutes/main4694713.shtml?tag=cbsnewsSidebarArea.0 (last visited Jan. 31, 2013).

(92.) Id.

(93.) Id.

(94.) Id.

(95.) Id. (internal quotation marks omitted).

(96.) Id. (internal quotation marks omitted).

(97.) Id. (internal quotation marks omitted).

(98.) User comments on the CBS website included: “Very
invasive technology that destroys what is left of the 4th
amendment” and “the legal system is behind the times when it
comes to advancements in technology.” Comments to Mind Reading, CBS
NEWS, http://www.cbsnews.com/8601-500251_162-5119805.html?assetTypeId=58 (last visited Feb. 6, 2013).

(99.) But as Stoller and Wolpe suggest:

   [O]ur everyday conception of humanity still reflects dualistic
   notions of body and non-physical mind or soul. When we say things
   like "my brain," we implicate a metaphysical being exerting
   influence over the workings of the brain, which we consider to be
   the organ of the mind and consciousness, but not synonymous with
   them. Even neuroscientists and their studies often "seem to leave
   room for the homunculus, the little ghost in the machine, which
   does all the directing of brain traffic."

Sarah E. Stoller & Paul Root Wolpe, Emerging Neurotechnologies
For Lie Detection and the Fifth Amendment, 33 AM. J.L. & MED. 359,
369 (2007) (footnote omitted) (quoting BRENT GARLAND, NEUROSCIENCE AND
THE LAW: BRAIN, MIND, AND THE SCALES OF JUSTICE 66 (2004)).

(100.) Pardo & Patterson, supra note 35, at 1218.

(101.) The question of the mind-brain (and behavior) relationship
is so complex, and has been discussed in such great quantity, that I can
only begin to scratch the surface in this Article. For more detailed
discussion, see generally TORIN ALTER & ROBERT J. HOWELL,
CONSCIOUSNESS AND THE MIND-BODY PROBLEM: A READER (2011); WILLIAM G.
LYCAN, MIND AND COGNITION: AN ANTHOLOGY (2d ed. 1999); MATERIALISM AND
THE MIND-BODY PROBLEM (David M. Rosenthal ed., 2d ed. 2000); Howard
Robinson, Dualism, in THE BLACKWELL GUIDE TO PHILOSOPHY OF MIND 85
(Stephen P. Stich and Ted A. Warfield eds., 2003).

(102.) Although the common label is “mind-body” problem,
in fact it boils down to a “mind-brain” problem. WARD, supra
note 2, at 4.

(103.) See D.M. ARMSTRONG, A MATERIALIST THEORY OF THE MIND 6
(Taylor & Francis rev. ed. 2001) (1968).

(104.) See id. at 10.

(105.) There are many types of reductionist approaches. See NANCEY
MURPHY & WARREN S. BROWN, DID MY NEURONS MAKE ME DO IT?:
PHILOSOPHICAL AND NEUROBIOLOGICAL PERSPECTIVES ON MORAL RESPONSIBILITY
AND FREE WILL 47-48 (2007) (distinguishing five different types of
reductionism).

(106.) STEVEN PINKER, HOW THE MIND WORKS 24 (1997).

(107.) Id. at 25. Just because thinking is computation (or
“information processing”) “does not mean that the
computer is a good metaphor for the mind.” Id. at 23.

(108.) READ MONTAGUE, YOUR BRAIN IS (ALMOST) PERFECT: HOW WE MAKE
DECISIONS 8 (2007). One need not adopt CTOM to see a distinction between
mind and brain. For instance, the “general position accepted by
most if not all neuropsychologists” is one of “emergent
materialism.” J. GRAHAM BEAUMONT, INTRODUCTION TO NEUROPSYCHOLOGY
7-8 (2d ed. 2008). This view accepts the materialist account of the mind
(that is, that the mind is physically instantiated in the brain), but
rejects the claim that the mind can be reduced to a set of physical
states. See id. Rather, adherents to this view prefer the notion of
“emergent properties.” Id. Whether it is emergent materialism,
computational theory of mind, or some other flavor of mind-brain
interaction, the basic point holds: The brain enables the mind (and for
this reason altering the brain, as with drugs, can alter the mind), but
the brain is not equal to the mind. Beaumont likens this to the
sweetness of an apple: “There is nothing in the chemistry or
physical structure of the apple that possesses sweetness. It is the
whole object, in interaction with the eater, that produces the quality
of sweetness.” Id. at 7-8.

(109.) This use of specialized diagnostic tools is not new in
neuropsychology. In 1953, it was already observed in the Texas Law
Review that “[t]he clinical psychologist is often called upon to
infer brain damage from behavior and performances on tests which
evaluate cognitive function.” David B. Vinson, The Use and
Limitations of Clinical Psychology in Studying Alleged Mental Effects of
Head Injury, 31 TEX. L. REV. 820, 820 (1953).

(110.) Bennet I. Omalu et al., Chronic Traumatic Encephalopathy
(CTE) in a National Football League Player: Case Report and Emerging
Medicolegal Practice Questions, 6 J. FORENSIC NURSING 40 (2010).

(111.) Benjamin Holley, It’s All in Your Head:
Neurotechnological Lie Detection and the Fourth and Fifth Amendments, 28
DEV. MENTAL HEALTH L. 1, 20 (2009).

(112.) See, e.g., Owen D. Jones et al., Brain Imaging for Legal
Thinkers: A Guide for the Perplexed, 2009 STAN. TECH. L. REV. 5,
[paragraph][paragraph] 40-41 (2009) (“fMRI brain imaging enables
inferences about the mind, built on inferences about neural activity,
built on the detection of physiological functions believed to be
reliably associated with brain activity.”).

(113.) Stoller & Wolpe, supra note 99, at 372.

(114.) Russell A. Poldrack, The Future of fMRI in Cognitive
Neuroscience, 62 NEUROIMAGE 1216, 1216 (2012).

(115.) HANDBOOK OF CRIME CORRELATES 216 (Lee Ellis et al. eds.,
2009) (quoting J.H. Margerison et al., Electroencephalography, in PETER
H. VENABLES & IRENE MARTIN, A MANUAL OF PSYCHOPHYSIOLOGICAL METHODS
353 (1967)) (internal quotation marks omitted).

(116.) See also Francis X. Shen, Law and Neuroscience:
Possibilities for Prosecutors, 33 CDAA PROSECUTOR’S BRIEF 17, 20-21
(2011), available at http://ssrn.com/abstract=2078639 (noting that
“[t]here is much room for speculation” regarding the proper
conclusions to be drawn from PET scan results, and arguing that
“[t]here has never been, and never will be, a neuroscientific test
that has immediate legal implications free from interpretation.”).

(117.) BEAUMONT, supra note 108, at 4.

(118.) As Beaumont argues:

   Descriptions of brain organization can only be relatively distant
   inferences from the human performance that is actually observed.
   The real states of the brain are not observed. Behavioral measures
   are taken, and by a line of reasoning that is based on background
   information about either the general arrangement of the brain (in
   the case of experimental neuropsychology) or about the gross
   changes in the brain of a particular type of patient (in the case
   of clinical neuropsychology), conclusions are drawn about what the
   correlation must be between brain states and behavior.

Id. at 6-7. Beaumont does note that “[t]he one exception to
this general rule is in electrophysiological studies and studies of
cerebral blood flow and metabolism through advanced scanning techniques,
where actual brain states can be observed, albeit rather crudely, in
‘real time’ alongside the human performance being
measured.” Id. at 7. Although Beaumont states that “[t]his
makes these studies of special importance in neuropsychology,” he
maintains that, “in general, neuropsychological study proceeds only
by inference.” Id.

(119.) As the National Institute of Mental Health makes clear in
its brochure to educate medical consumers:

   No scientific studies to date have shown that a brain scan by
   itself can be used for diagnosing a mental illness or to learn
   about a person's risk for disease.... Brain scans are not usually
   the first test a doctor would do to diagnose changes in mood and
   behavior. Other medical tests a doctor may use include behavioral
   and cognitive tests or a medical interview.

NAT’L INST. OF MENTAL HEALTH, U.S. DEP’T OF HEALTH &
HUMAN SERVS., NEUROIMAGING AND MENTAL ILLNESS: A WINDOW INTO THE BRAIN 3
(2010), available at
http://www.nimh.nih.gov/health/publications/neuroimaging-and-mentalillness-a-window-into-the-brain/neuroimaging-faq.pdf.

(120.) Steven E. Hyman, Can neuroscience be integrated into the
DSM-V?, 8 NATURE REVIEWS NEUROSCIENCE 725, 725 (2007). To be sure,
“[P]rogress in neurogenetics, neuroimaging and other areas of
neuroscience is beginning to yield significant insights into mental
disorders.” Id. at 727.

(121.) Joseph R. Simpson, Introduction to NEUROIMAGING IN FORENSIC
PSYCHIATRY, supra note 25, at xv, xvii.

(122.) Melissa Lamar et al., Dementia, in NEUROIMAGING IN FORENSIC
PSYCHIATRY, supra note 25, at 67, 69.

(123.) Nathaniel E. Anderson & Kent A. Kiehl, The psychopath
magnetized: insights from brain imaging, 16 TRENDS COGNITIVE SCI. 52,
54-56 (2012); Andrea L. Glenn & Adrian Raine, Psychopathy and
instrumental aggression: Evolutionary, neurobiological, and legal
perspectives, 32 INT’L J.L. & PSYCHIATRY 253, 255-57 (2009);
Kent Kiehl, Can neuroscience identify psychopaths?, in A JUDGE’S
GUIDE TO NEUROSCIENCE: A CONCISE INTRODUCTION 47, 49 (Andrew S.
Mansfield ed., 2010), available at
http://www.sagecenter.ucsb.edu/sites/staging.sagecenter.ucsb.edu/files/file-and-multimedia/A_Judges_Guide_to_Neuroscience%5Bsample%5D.pdf.

(124.) Jazmin Camchong & Angus W. MacDonald III, Imaging
Psychoses: Diagnosis and Prediction of Violence, in NEUROIMAGING IN
FORENSIC PSYCHIATRY, supra note 25, at 113, 115.

(125.) Ronald S. Duman & George K. Aghajanian, Synaptic
Dysfunction in Depression: Potential Therapeutic Targets, 338 SCI. 68,
68-69 (2012); L. Wang et al., A systematic review of resting-state
functional-MRI studies in major depression, 142 J. AFFECTIVE DISORDERS
6, 7 (2012).

(126.) Allen J. Frances & Thomas Widiger, Psychiatric
Diagnosis: Lessons from the DSM-IV Past and Cautions for the DSM-V
Future, 8 ANNU. REV. CLIN. PSYCH. 109, 112 (2012); see also Hyman, supra
note 120, at 725.

(127.) This discussion draws, in part, on Francis X. Shen &
Owen D. Jones, Brain Scans as Evidence: Truths, Proofs, Lies, and
Lessons, 62 MERCER L. REV. 861, 862, 865 (2011).

(128.) See id. at 865.

(129.) Id. See generally William G. Iacono, The Forensic
Application of “Brain Fingerprinting”: Why Scientists Should
Encourage the Use of P300 Memory Detection Methods, 8 AM. J. BIOETHICS
30, 30-32 (2008); Shen & Jones, supra note 127, at 865; Anthony
Wagner, Can neuroscience identify lies?, in A JUDGE’S GUIDE TO
NEUROSCIENCE, supra note 123, at 13, 16-18.

(130.) See William G. Iacono, Detection of Deception, in HANDBOOK
OF PSYCHOPHYSIOLOGY 688, 688-90 (John T. Cacioppo et al. eds., 3d ed.
2007).

(131.) For lengthier treatment, see, for example, Charles
Adelsheim, Functional Magnetic Resonance Detection of Deception: Great
as Fundamental Research, Inadequate as Substantive Evidence, 62 MERCER
L. REV. 885 (2011); Archie Alexander, Functional Magnetic Resonance
Imaging Lie Detection: Is a “Brainstorm” Heading Toward the
“Gatekeeper”?, 7 HOUS. J. HEALTH L. & POL’Y 1 (2007);
Giorgio Ganis & Julian Paul Keenan, The cognitive neuroscience of
deception, 4 SOC. NEUROSCIENCE 465 (2009); Henry T. Greely & Judy
Illes, Neuroscience-Based Lie Detection: The Urgent Need for Regulation,
33 AM. J.L. & MED. 377 (2007); Jones & Shen, supra note 25; John
B. Meixner, Liar, Liar, Jury’s the Trier? The Future of
Neuroscience-based Credibility Assessment in the Court, 106 Nw. U. L.
REV. 1451 (2012); Jane Campbell Moriarty, Visions of Deception:
Neuroimages and the Search for Truth, 42 AKRON L. REV. 739 (2009);
Frederick Schauer, Can Bad Science Be Good Evidence? Neuroscience, Lie
Detection, and Beyond, 95 CORNELL L. REV. 1191 (2010).

(132.) “[W]hat constitutes ‘deception’ or a
‘lie’ is a conceptual not an empirical question, and … the
criteria are behavioral not neurological. Certain brain states may be
causally necessary for deception, but they are not a sufficient
condition for deception.” Pardo & Patterson, supra note 35, at
1230.

(133.) See, e.g., Schauer, supra note 131, at 1201. But see Joshua
D. Greene & Joseph M. Paxton, Patterns of neural activity associated
with honest and dishonest moral decisions, 106 PROC. NAT’L ACAD.
SCI. 12506, 12506 (2009) (describing study involving genuine
dishonesty).

(134.) Nancy Kanwisher, The Use of fMRI in Lie Detection: What Has
Been Shown and What Has Not, in USING IMAGING TO IDENTIFY DECEIT:
SCIENTIFIC AND ETHICAL QUESTIONS 7, 12 (2009), available at
http://www.amacad.org/pdfs/deceit.pdf.

(135.) See generally Logothetis, supra note 3.

(136.) See Frank Tong & Michael S. Pratte, Decoding Patterns of
Human Brain Activity, 63 ANN. REV. PSYCHOL. 483, 497 (2012) (“As an
example, it is well established that the human amygdala responds more
strongly to fear-related stimuli than to neutral stimuli, but it does
not logically follow that if the amygdala is more active in a given
situation that the person is necessarily experiencing fear. If the
amygdala’s response varies along other dimensions as well, such as
the emotional intensity, ambiguity, or predictive value of a stimulus,
then it will be difficult to make strong inferences from the level of
amygdala activity alone.”) (citations omitted).
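
To make the quoted point concrete, the following is a minimal
numerical sketch (in Python, with purely hypothetical illustrative
numbers) of why a strong “forward” association between a cognitive
state and regional activity does not license a strong “reverse”
inference from activity back to the state:

   # Minimal sketch of the reverse-inference problem, using Bayes' rule
   # with hypothetical illustrative numbers (not empirical estimates).
   p_active_given_fear = 0.90      # assume the region is usually more active during fear
   p_active_given_not_fear = 0.40  # but it is also often active in other states
   p_fear = 0.10                   # assumed base rate of fear in the tested situation

   p_active = (p_active_given_fear * p_fear
               + p_active_given_not_fear * (1 - p_fear))
   p_fear_given_active = p_active_given_fear * p_fear / p_active

   print(round(p_fear_given_active, 2))  # ~0.2: elevated activity alone is weak evidence of fear

Even though the region is assumed to respond during fear ninety
percent of the time, the probability of fear given elevated activity
is only about twenty percent under these assumptions, because the
region is also often active for other reasons and fear is assumed to
be relatively rare in the tested situation.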

(137.) Id. at 502 (citation omitted).

(138.) Those companies are Cephos Corporation and No Lie MRI, Inc.
See Greely & Illes, supra note 131, at 390-95.

(139.) See United States v. Semrau, No. 07-10074 M1/P., 2010 WL
6845092, at *3 (W.D. Tenn. 2010) (describing expert testimony of Dr.
Steven Laken, president and CEO of Cephos Corporation); Wilson v.
Corestaff Servs., L.P., 900 N.Y.S.2d 639, 640 (Sup. Ct. 2010) (same).

(140.) Semrau, 2010 WL 6845092, at *14.

(141.) fMRI lie detection evidence from Cephos was not admitted
under the Daubert standard in Semrau. fMRI lie detection evidence from
No Lie MRI was not admitted under the Frye standard in Smith v. State,
32 A.3d 59 (Md. 2011) (discussion of fMRI and Frye standard found in
trial court opinion, on file with author).

(142.) Semrau, 2010 WL 6845092, at *12 n.18.

(143.) See Marcus Raichle, What is an fMRI?, in A JUDGE’S
GUIDE TO NEUROSCIENCE, supra note 123, at 5, 8 (describing procedures
used in fMRI experiments).

(144.) And likely never will, at least for purposes of detecting
deception or honesty on a particular question. Resting-state studies are
becoming more common, and offer us “many interesting observations
of the way in which spontaneous connectivity patterns alter under
different conditions.” David M. Cole et al., Advances and pitfalls
in the analysis and interpretation of resting-state FMRI data, 4
FRONTIERS SYS. NEUROSCIENCE, no. 8, 2010, at 12. But without
corresponding non-resting state studies to complement the resting-state
analysis, “the concrete meaning of these inherent processes …
remains elusive.” Id. Resting-state activity on its own is
“something of an interpretative minefield.” Id. Given all of
the challenges of using task-based fMRI to assess lying on particular
questions, and the reliance of resting-state analysis on such task-based
studies, it seems very unlikely that resting-state analysis provides an
answer to the brain-based lie detection challenge.

(145.) The methodology they used is termed multi-voxel pattern
analysis (MVPA). See Rissman et al., infra note 146, at 9849.
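
For readers unfamiliar with the technique, the following is a minimal
illustrative sketch (in Python, using the scikit-learn library and
synthetic data) of the general MVPA idea of training a classifier on
distributed voxel patterns; it is not a description of the Rissman et
al. pipeline:

   # Illustrative MVPA-style decoding on synthetic "voxel" data: train a
   # linear classifier on multi-voxel activity patterns and test whether
   # it can distinguish two (hypothetical) memory states.
   import numpy as np
   from sklearn.linear_model import LogisticRegression
   from sklearn.model_selection import cross_val_score

   rng = np.random.default_rng(0)
   n_trials, n_voxels = 200, 500
   labels = rng.integers(0, 2, n_trials)                    # 0 = "new" item, 1 = "old" item
   signal = np.outer(labels, rng.normal(0, 0.5, n_voxels))  # weak class-dependent pattern
   patterns = signal + rng.normal(0, 1.0, (n_trials, n_voxels))  # plus trial-by-trial noise

   clf = LogisticRegression(max_iter=1000)
   scores = cross_val_score(clf, patterns, labels, cv=5)    # accuracy on held-out trials
   print("mean cross-validated accuracy:", scores.mean())

The point of the cross-validation step is that decoding accuracy is
evaluated on trials the classifier has never seen, which is what gives
above-chance accuracy its evidentiary interest.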

(146.) Jesse Rissman et al., Detecting individual memories through
the neural decoding of memory states and past experience, 107 PROC.
NAT’L ACAD. SCI. 9849, 9849 (2010).

(147.) Id.

(148.) Id. at 9849, 9852.

(149.) Id. at 9852.

(150.) Id. at 9853.

(151.) See Lawrence A. Farwell, Brain fingerprinting: a
comprehensive tutorial review of detection of concealed information with
event-related brain potentials, 6 COGNITIVE NEURODYNAMICS 115, 115
(2012).

(152.) See JAMIE WARD, THE STUDENT’S GUIDE TO COGNITIVE
NEUROSCIENCE 37-39 (2d ed. 2010).

(153.) Id.

(154.) See Farwell, supra note 151, at 127.

(155.) See id.

(156.) The P300 wave is so named because it is a positive peak in
the event-related potential appearing roughly 300 milliseconds after
the stimulus. The P300 approach has also been used, with
off-the-shelf EEG-based gaming equipment, in attempts to
“hack” subjects’ brains by identifying their regional home
location, month of birth, and the first digit of their bank PIN. Sara
Grossman, UC Berkeley researchers investigate ‘Brain Hacking’:
Gaming interface that records brain waves could present future
security threat, THE DAILY CALIFORNIAN, Oct. 14, 2012,
http://www.dailycal.org/2012/10/14/brainhacking-possible-security-threat-of-the-future-say-uc-berkeley-researchers/.
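
The underlying signal-averaging logic can be sketched as follows (in
Python, with synthetic data and assumed parameter values); actual P300
protocols involve considerably more preprocessing and artifact
rejection:

   # Minimal sketch of the event-related potential logic behind P300
   # detection: epoch a synthetic EEG channel around stimulus onsets,
   # average across trials, and look for a positive peak near 300 ms.
   import numpy as np

   fs = 250                                  # sampling rate in Hz (assumed)
   t = np.arange(-0.2, 0.8, 1 / fs)          # epoch window: -200 ms to +800 ms
   n_trials = 60
   rng = np.random.default_rng(1)

   # Synthetic trials: noise plus a positive deflection centered near 300 ms
   p300 = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
   trials = rng.normal(0, 10e-6, (n_trials, t.size)) + p300

   erp = trials.mean(axis=0)                 # averaging suppresses trial-to-trial noise
   window = (t >= 0.25) & (t <= 0.45)        # search window around 300 ms
   peak_latency = t[window][np.argmax(erp[window])]
   print("peak latency (s):", round(peak_latency, 3))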

(157.) J. Peter Rosenfeld, P300 in detecting concealed information,
in MEMORY DETECTION: THEORY AND APPLICATION OF THE CONCEALED INFORMATION
TEST 63, 64 (Bruno Verschuere et al. eds., 2011) (emphasis omitted).

(158.) Id. at 65. There are good reasons for using the P300
technique in certain circumstances. See Iacono, supra note 130, at 688
(“Lie detection techniques have been developed to detect two types
of liars: criminals and untrustworthy employees.”).

(159.) See, e.g., J. Peter Rosenfeld et al., Deception awareness
improves P300-based deception detection in concealed information tests,
86 INT’L J. PSYCHOPHYSIOLOGY 114, 115 (2012).

(160.) See generally Farwell, supra note 151.

(161.) Slaughter v. State, 108 P.3d 1052, 1054 (Okla. Crim. App.
2005); Harrington v. State, 659 N.W.2d 509, 516 (Iowa 2003). In the
Slaughter case, Farwell testified in an affidavit that the defendant did
not possess the knowledge that he would expect the perpetrator to have.
John G. New, If You Could Read My Mind: Implications of Neurological
Evidence for Twenty-First Century Criminal Jurisprudence, 29 J. LEGAL
MED. 179, 190 (2008). But “although Farwell indicated in his
affidavit that a ‘comprehensive report’ of his analysis would
be presented to the court detailing the method of analysis and the
results obtained, no such report was submitted in the course of that
hearing or a subsequent hearing.” Id.

(162.) See, e.g., J. Peter Rosenfeld, ‘Brain
Fingerprinting’: A Critical Analysis, 4 SCI. REV. MENTAL HEALTH
PRAC. 20, 34 (2005); J. Peter Rosenfeld et al., Simple, effective
countermeasures to P300-based tests of detection of concealed
information, 41 PSYCHOPHYSIOLOGY 205, 205 (2004). In 2012, Farwell
published a lengthy summary of the brain fingerprinting technique.
Farwell, supra note 151. But this publication was also heavily
criticized by fellow scholars in the field, including a former Farwell
coauthor. See, e.g., Ewout H. Meijer et al., A comment on Farwell
(2012): brain fingerprinting: a comprehensive tutorial review of
detection of concealed information with event-related brain potentials,
COGNITIVE NEURODYNAMICS, Aug. 14, 2012, at 4 (2012) (“[I]f Dr.
Farwell is, as he claims to be, a ‘brain fingerprinting
scientist’ he should feel obligated to retract the article.”).
There are other types of ERP methods that are well received in the
scholarly community and could be legally useful in ways that brain
fingerprinting might not. Id. at 3 (noting that “many researchers
… share a positive view towards the use of ERPs for the detection of
concealed information.” (citation omitted)). For instance, John
Meixner and Peter Rosenfeld have shown in the lab that a P300-based
Concealed Information Test can help detect, with no false positives,
guilty subjects in a mock terrorism paradigm. John B. Meixner & J.
Peter Rosenfeld, A mock terrorism application of the P300-based
concealed information test, 48 PSYCHOPHYSIOLOGY 149, 153 (2010). Such
research might one day have applications in counterterrorism efforts.
See id.

(163.) Meijer et al., supra note 162, at 2 (citation omitted).

(164.) Id.

(165.) Farwell, supra note 151, at 127.

(166.) GOVERNMENT WORKS, BRAINWAVE SCIENCE: EXECUTIVE SUMMARY 4
(n.d.), available at
http://www.governmentworks.com/bws/brochure/BrainFingerprintingExecutiveSummary.pdf.

(167.) Farwell, supra note 151, at 130.

(168.) Id. at 124. Farwell writes specifically that this pre-test
investigative phase is not a science.

   The investigative phase of preparing the brain fingerprinting test
   discovers the salient features of the crime that are used as probe
   stimuli. It depends on the skill and judgment of the criminal
   investigator. This is not a scientific process.

      The scientific phase of brain fingerprinting testing begins
   after the investigation has identified appropriate probes.

Id. at 133.

(169.) See K.M. O’Craven & N. Kanwisher, Mental Imagery of
Faces and Places Activates Corresponding Stimulus-Specific Brain
Regions, 12 J. COGNITIVE NEUROSCIENCE 1013, 1017 (2000).

(170.) Tom M. Mitchell et al., Predicting Human Brain Activity
Associated with the Meanings of Nouns, 320 SCI. 1191, 1194 (2008).

(171.) Francis X. Shen & Owen D. Jones, Brain Scans as
Evidence: Truths, Proofs, Lies, and Lessons, 62 MERCER L. REV. 861, 865
(2011). For more detail, see Owen D. Jones, Joshua W. Buckholtz, Jeffrey
D. Schall & Rene Marois, Brain Imaging for Legal Thinkers: A Guide
for the Perplexed, 2009 STAN. TECH. L. REV. 5 (2009).

(172.) JONES, SCHALL & SHEN, supra note 29, at 10.

(173.) Id.

(174.) See id.

(175.) Thomas Naselaris et al., Encoding and Decoding in fMRI, 56
NEUROIMAGE 400, 401 (2011) (“Voxel-based encoding models predict
activity in single voxels that is evoked by different sensory, cognitive
or task conditions.”).
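
As a rough illustration of what “predicting activity in single
voxels” involves, the following sketch (in Python, with synthetic
data and hypothetical stimulus features) fits a separate linear model
for each simulated voxel; published encoding models are far more
elaborate:

   # Minimal sketch of a voxel-wise encoding model: fit a linear model
   # per voxel that predicts its response from stimulus features.
   import numpy as np

   rng = np.random.default_rng(2)
   n_stimuli, n_features, n_voxels = 120, 20, 50
   features = rng.normal(size=(n_stimuli, n_features))      # hypothetical stimulus features
   true_weights = rng.normal(size=(n_features, n_voxels))   # each voxel has its own tuning
   responses = features @ true_weights + rng.normal(0, 0.5, (n_stimuli, n_voxels))

   # Ordinary least squares; each column of `weights` is one voxel's model
   weights, *_ = np.linalg.lstsq(features, responses, rcond=None)

   predicted = features @ weights
   r = [np.corrcoef(predicted[:, v], responses[:, v])[0, 1] for v in range(n_voxels)]
   print("median prediction correlation:", round(float(np.median(r)), 2))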

(176.) See id.

(177.) See Daisy Yuhas, What’s a Voxel and What Can It Tell
Us? A Primer on fMRI, SCI. AM., June 21, 2012,
http://blogs.scientificamerican.com/observations/2012/
06/21/whats-a-voxel-and-what-can-it-tell-us-a-primer-on-fmri/.

(178.) Naselaris, supra note 175.

(179.) Id.

(180.) See, e.g., Getting a glimpse into the “movies in our
minds,” CBS NEWS, Sept. 23, 2011,
http://www.cbsnews.com/2100-205_162-20110768.html. For the original
study, see Shinji Nishimoto et al., Reconstructing Visual Experiences
from Brain Activity Evoked by Natural Movies, 21 CURRENT BIOLOGY 1641
(2011).

(181.) Reconstructing visual experiences from brain activity evoked
by natural movies, GALLANT LAB AT UC BERKELEY,
https://sites.google.com/site/gallantlabucb/
publications/nishimoto-et-al-2011 (last visited Jan. 31, 2013) (emphasis
added).

(182.) Id. One additional note of caution is that even if
researchers become sufficiently accurate in decoding activity in the
visual cortex of the brain, the decoding strategy will be unique to the
visual cortex and not akin to an all-purpose decoder ring to be used for
decoding all of our thoughts. This is partly because of the unique
structure of the visual cortex, see Se-Bum Paik & Dario L. Ringach,
Link between orientation and retinotopic maps in primary visual cortex,
109 PROC. NAT’L ACAD. SCI. 7091, 7091 (2012) (describing structure
of visual cortex), and partly because we have relatively exceptional
knowledge of how vision translates to cortical activity, see, e.g.,
Peter R. Huttenlocher, Morphometric Study of Human Cerebral Cortex
Development, 28 NEUROPSYCHOLOGIA 517, 517 (1990) (already noting in 1990
that “[e]xtensive data are now available for specific cortical
areas, especially for primary visual cortex”). The primary visual
cortex, known as “Area V1,” is a particularly well-studied
structure. See, e.g., Tianyi Yan et al., Correlated Size Variations
Measured in Human Visual Cortex V1/V2/V3 with Functional MRI, in BRAIN
INFORMATICS 36, 37 (N. Zhong et al. eds., 2009) (“Area V1 is the
human visual cortical area with the most well-defined anatomical
boundaries, agreed on by virtually all previous studies, both historical
and more recent.”) (citations omitted). Studies have shown that
Area V1 has a fairly simple structure: “To a good approximation,
each two-dimensional … location in the visual field is represented at
a single physical location within V1." Id. In contrast, neural
representation of legally relevant mental states such as intention,
memory, and emotion is not nearly as well understood. Other areas of the
cortex, such as the prefrontal cortex and the ventromedial prefrontal
cortex, are structured very differently and encode information more
abstractly. See Adam P. Steiner & A. David Redish, The road not
taken: neural correlates of decision making in orbitofrontal cortex, 6
FRONTIERS NEUROSCIENCE, no. 13, 2012, at 12. In sum, then, the progress
made in decoding signals in the visual cortex is important and
informative, but this does not put us on the verge of highly effective,
legally relevant neuroimaging mind reading, because decoding the visual
cortex does not readily (or speedily) lead to decoding the multitude of
other brain structures that enable the mind.

(183.) Tong & Pratte, supra note 136, at 484.

(184.) Id. at 485.

(185.) Researchers are working around the world on these types of
questions. See, e.g., Training Computers to Understand the Human Brain,
SCIENCEDAILY, Oct. 5, 2012,
http://www.sciencedaily.com/releases/2012/10/121005134328.htm.

(186.) Tong & Pratte, supra note 136, at 502.

(187.) Id. at 485.

(188.) Id.

(189.) It’s Not Mind-Reading, but Scientists Exploring How
Brains Perceive the World (PBS NewsHour television broadcast Jan. 2,
2012) (transcript and video available at
http://www.pbs.org/newshour/bb/science/jan-june12/neuroscience_01-02.html).

(190.) See, e.g., Michael S. Pardo, Neuroscience Evidence, Legal
Culture, and Criminal Procedure, 33 AM. J. CRIM. L. 301, 333 (2005-2006)
(“[T]he hypothetical neuroscience examples serve an important
analytical purpose in testing theoretical accounts of the
privilege.”).

(191.) I have not fully reported on the nuances of each
author’s position. Rather, I have attempted to gauge, based on a
reading of the complete piece, how a particular scholar would come out
(in general) on the protections against involuntary,
government-initiated neuroimaging mind reading or lie detection. I
include lie detection in the Table, even though I do not deem it mind
reading, because so many of the authors have so labeled it.

(192.) Farahany, Incriminating Thoughts, supra note 15.

(193.) Id. at 408.

(194.) Although doctrinally distinct, the two Amendments’
protections may be interrelated. For example, Michael Pardo argues that
“the self-incrimination privilege applies to a subset of events
within the universe of potential Fourth Amendment events.” Michael
S. Pardo, Disentangling the Fourth Amendment and the Self-Incrimination
Clause, 90 IOWA L. REV. 1857, 1860 (2005). In contrast, John New
suggests that “if evidence of mental activity is considered
testimonial, the strictures of the Fourth Amendment are inapplicable
because searches, even of bodily evidence such as hair or blood, are
searches for physical evidence,” and thus the Fifth Amendment
protections against self-incrimination are the “appropriate frame
of analytical reference.” New, supra note 161, at 197-98.

(195.) Samuel D. Warren & Louis D. Brandeis, The Right to
Privacy, 4 HARV. L. REV. 193, 198 (1890); see also Long Beach City
Employees Ass’n v. City of Long Beach, 719 P.2d 660, 663 (Cal. 1986)
(“If there is a quintessential zone of human privacy it is the
mind.”).

(196.) Jody C. Barillare, Comment, As Its Next Witness, the State
Calls … the Defendant: Brain Fingerprinting as “Testimonial”
Under the Fifth Amendment, 79 TEMP. L. REV. 971 (2006).

(197.) Id. at 974.

(198.) Boundy, supra note 13.

(199.) Id. at 1643.

(200.) Farahany, Incriminating Thoughts, supra note 15 (Fifth
Amendment); Farahany, Searching Secrets, supra note 15 (Fourth and Fifth
Amendments).

(201.) Farahany, Searching Secrets, supra note 15, at 1306. But see
id. at 1982 n.225 (suggesting that when biometric data would provide no
information other than suspect’s identity, data might not be
protected).

(202.) Farahany, Incriminating Thoughts, supra note 15, at 372,
374, 404.

(203.) Federspiel, supra note 84.

(204.) Id. at 874 n.56.

(205.) Id. at 892-94.

(206.) Dov Fox, The Right to Silence Protects Mental Control, in 13
CURRENT LEGAL ISSUES, supra note 25, at 335.

(207.) Id. at 335.

(208.) Holley, supra note 111.

(209.) Id. at 12-13.

(210.) Id. at 22.

(211.) Holloway, supra note 83.

(212.) Id. at 174-75.

(213.) Aaron J. Hurd, Note, Reaching Past Fingertips with Forensic
Neuroimaging–Non-“Testimonial” Evidence Exceeding the Fifth
Amendment’s Grasp, 58 LOY. L. REV. 213 (2012).

(214.) Id. at 217.

(215.) Murphy & Greely, supra note 17.

(216.) Id. at 650. Murphy and Greely also note a broader due
process claim that might be brought on the grounds that the neuroscience
techniques shock the conscience. See id. (citing Rochin v. California,
342 U.S. 165, 172-73 (1952)).

(217.) New, supra note 161.

(218.) Id. at 197-98. That said, “[i]t is questionable, [if]
by no means resolved, whether a test so intrusive as to mine human
thought or memory could ever be outweighed by a governmental interest in
obtaining evidence.” Id. at 197.

(219.) See id. at 194 (“It seems contradictory to both the
history and spirit of the Fifth Amendment, therefore, to permit the
state to execute an end run around an individual’s refusal to
communicate simply by extracting that information held within the brain
that the individual refuses to divulge.”).

(220.) Kristen M. Nugent, Neuroimaging and the Constitution, in
NEUROIMAGING IN FORENSIC PSYCHIATRY, supra note 25, at 275.

(221.) See id. at 295.

(222.) Id. at 292.

(223.) Pardo, supra note 190.

(224.) Id. at 325-26.

(225.) Id. at 331-32.

(226.) Amanda C. Pustilnik, Neurotechnologies at the Intersection
of Criminal Procedure and Constitutional Law, in THE CONSTITUTION AND
THE FUTURE OF THE CRIMINAL LAW (Song Richardson & John Parry eds.,
forthcoming 2013) (manuscript available at
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2143187).

(227.) Id. (manuscript at 2, 19-20).

(228.) Id. (manuscript at 10).

(229.) Stoller & Wolpe, supra note 99, at 359.

(230.) Id. at 371, 374.

(231.) Erich Taylor, Note, A New Wave of Police Interrogation?
“Brain Fingerprinting,” the Constitutional Privilege Against
Self-Incrimination, and Hearsay Jurisprudence, 2006 U. ILL. J.L. TECH.
& POL’Y 287 (2006).

(232.) Id. at 307-08.

(233.) U.S. CONST. amend. IV.

(234.) 389 U.S. 347, 360-61 (1967) (Harlan, J., concurring)
(“[A] person has a constitutionally protected reasonable
expectation of privacy…. [T]here is a twofold requirement, first that
a person have exhibited an actual (subjective) expectation of privacy
and, second, that the expectation be one that society is prepared to
recognize as ‘reasonable.'”). Katz remains the
“lodestar” of Fourth Amendment privacy jurisprudence. Smith v.
Maryland, 442 U.S. 735, 739 (1979).

(235.) See 442 U.S. at 739-40.

(236.) Id. at 740.

(237.) Id. (quoting Katz, 389 U.S. at 361).

(238.) Id. (citing Katz, 389 U.S. at 353 (majority opinion)).

(239.) See Orin S. Kerr, Four Models of Fourth Amendment
Protection, 60 STAN. L. REV. 503, 519 (2007) (“[T]he reasonable
expectation of privacy inquiry poses a policy question: should a
particular set of police practices be regulated by the warrant
requirement or should those practices remain unregulated by the Fourth
Amendment? If the consequences of leaving conduct unregulated are
particularly troublesome to civil liberties, then that conduct violates
a reasonable expectation of privacy. On the other hand, if the practical
consequences of regulating such conduct unnecessarily restrict
government investigations given the gain to civil liberties protection,
then any expectation of privacy is constitutionally unreasonable….
[I]t is widely agreed that something akin to the policy model helps
frame the basic goals of Fourth Amendment law and the reasonable
expectation of privacy test.”). In the context of brain-based mind
reading, legal scholar John New has posed the problem this way: “To
what extent might evidence obtained as a result of the measure of mental
activity be protected from search or seizure? On the other hand, to what
extent might the extraction of knowledge or memories be considered
reasonable?” New, supra note 161, at 195.

(240.) Pardo, supra note 190, at 325.

(241.) Id. at 325 n.160 (citing Skinner v. Ry. Labor
Executives’ Ass’n, 489 U.S. 602, 615-18 (1989) (urine test);
Schmerber v. California, 384 U.S. 757, 766-72 (1966) (blood test)).

(242.) Pustilnik, supra note 226, at 15-16.

(243.) See U.S. CONST. amend. IV. If a warrant is issued:

   The use of such a warrant might (or might not) be limited by the
   privilege against self-incrimination or by some constitutional
   privacy right, but, if such rights did not apply, would such
   warrants allow our brains to be searched? This is, in a way, the
   ultimate result of the revolution in neuroscience, which identifies
   our incorporeal 'mind' with our physical 'brain' and allows us to
   begin to draw inferences from the brain to the mind. If the brain
   is a physical thing or a place, it could be searchable, even if the
   goal in searching it is to find out something about the mind,
   something that, as a practical matter, had never itself been
   directly searchable.

Greely & Wagner, supra note 25, at 796.

(244.) Pardo, supra note 190, at 325-26.

(245.) Pardo, supra note 190, at 327-28. Pustilnik also considers
whether the state could obtain memory data from a non-suspect (since the
friend’s information would help the investigation). Pustilnik,
supra note 226, at 17-19.

(246.) See United States v. R. Enterprises, Inc., 498 U.S. 292,
297, 302 (1991) (rejecting probable cause requirement for issuance of
grand jury subpoena and directing trial court to instead balance
interests of subpoena recipient against government interests). But see
In re Grand Jury Proceedings (T.S.), 816 F. Supp. 1196, 1205-06 (W.D.
Ky. 1993) (rejecting government request for grand jury subpoena to
obtain blood sample from defendant, and requiring that government
instead establish probable cause in order to obtain a warrant).

(247.) Pustilnik, supra note 226 (manuscript at 19).

(248.) Schmerber v. California, 384 U.S. 757, 758 (1966); see infra
Part III.C.

(249.) See, e.g., Tong & Pratte, supra note 136, at 497
(discussing problems in “reverse inference” from fMRI data).

(250.) Cf. United States v. Semrau, No. 07-10074 M1/P., 2010 WL
6845092, at *13-*14 (W.D. Tenn. 2010) (excluding fMRI-based expert
testimony under the Daubert test because of widespread scientific doubt
about the probity of fMRI data).

(251.) See Farahany, Searching Secrets, supra note 15, at 1240.
Professor Farahany has also explored the Fifth Amendment in a companion
piece. See Farahany, Incriminating Thoughts, supra note 15, at 351.

(252.) Farahany, Searching Secrets, supra note 15, at 1270.

(253.) Id. at 1271.

(254.) John Chambers et al., Developments in active noise control
sound systems for magnetic resonance imaging, 68 APPLIED ACOUSTICS 281,
281 (2007). See also Decibel (Loudness) Comparison Chart, GALEN CAROL
AUDIO, http://www.gcaudio.com/ resources/howtos/loudness.html (last
visited Feb. 7, 2013).

(255.) See Sarah E. Stoller & Paul Wolpe, Emerging
Neurotechnologies for Lie Detection and the Fifth Amendment, 33 AM. J.L.
& MED. 359, 362 (2007) (“The subject is seated in front of a
computer screen and wears a headband with sensors that measure EEG
responses at several locations on the scalp.”).

(256.) U.S. CONST. amend. V.

(257.) One commentator, for example, has called the privilege
against self-incrimination an “unsolved riddle of vast proportions,
a Gordian knot in the middle of our Bill of Rights.” See Fox, supra
note 206, at 768 (quoting Akhil Reed Amar & Renee B. Lettow, Fifth
Amendment First Principles: The Self-Incrimination Clause, 93 MICH. L.
REV. 857, 857 (1995)) (internal quotation marks omitted). Even the
Supreme Court has noted that it is unclear “just what [the
privilege] is supposed to do or just whom it is intended to
protect.” Id. (quoting Murphy v. Waterfront Comm’n of N.Y.
Harbor, 378 U.S. 52, 56 n.5 (1964)) (internal quotation marks omitted).

(258.) 1 WAYNE R. LAFAVE ET AL., CRIMINAL PROCEDURE [section]
2.10(a) (3d ed. 2000) (Westlaw database update 2011).

(259.) See, e.g., Fox, supra note 206, at 779 (“Whether brain
fingerprinting is privileged by right-to-silence jurisprudence turns on
whether it counts as ‘testimonial’ evidence, which is
protected by the Fifth Amendment, or ‘physical’ evidence,
which is not.”); Holley, supra note 111, at 19 (“Ultimately,
determining whether NTLD evidence is admissible ‘physical’
evidence or inadmissible ‘testimonial’ evidence boils down to
which analogy is more apt: is NTLD more like speech or more like a blood
sample?”); New, supra note 161, at 193 (“An initial question
is whether results of brain activity measurement should be considered by
the legal system to be physical evidence or actual testimony by the
individual. The consequences and application of legal tests of
admissibility will depend upon which of the categories (if either)
‘mind activity’ is deemed to be.”); Pardo, supra note
190, at 321-22 (“On the one hand, the fMRI lie detector and the
‘brain fingerprinting’ technique share similarities with other
physical examinations such as blood tests, breathalyzer tests, and
fingerprint tests, which may be compelled under certain circumstances.
On the other hand, the neuroscience tests arguably are qualitatively
different in that they compel inductive evidence of mental events,
beliefs, thoughts, and propositional knowledge. How this tension is
resolved will depend on how both the evidence and the constitutional
protections are conceptualized.”); Pustilnik, supra note 226
(manuscript at 6) (“Whether and under what circumstances the state
could compel individuals to submit to neuroassays under existing Fourth
and Fifth Amendment jurisprudence depends on how neuroassays and the
brain products they detect are characterized…. Whether courts construe
their physical sample-like properties or their information product-like
properties to predominate would lead to different degrees of protection
under each regime.”).

(260.) I restrict my analysis here to whether the government could
compel this type of evidence, in the absence of a knowing and voluntary
agreement.

(261.) Farahany, Incriminating Thoughts, supra note 15, at 351;
Pustilnik, supra note 226 (manuscript at 1) (arguing that such a
distinction “stumbles on a conceptually limited distinction between
body and mind, physical and informational” and that “[s]uch a
distinction can no longer stand, as brain processes and emanations sit
at the juncture of these categories”).

(262.) Fox, supra note 206, at 801 (quoting Schmerber v.
California, 384 U.S. 757, 764 (1966)).

(263.) 384 U.S. at 757.

(264.) Id. at 758.

(265.) Id.

(266.) Id. at 759.

(267.) Id. at 764.

(268.) Id.

(269.) People v. Sallow, 165 N.Y. Supp. 915, 924 (1917).

(270.) Id.

(271.) Greely & Wagner, supra note 25, at 791.

(272.) Nugent makes this point as well: “Of course, the
distinction between physical and testimonial evidence may be moot with
respect to those neuroimaging techniques that require the subject to
verbally respond to an investigator’s questions.” Nugent,
supra note 220, at 293.

(273.) 496 U.S. 582 (1990).

(274.) See Pustilnik, supra note 226 (manuscript at 6-10).

(275.) Muniz, 496 U.S. at 585-86.

(276.) See id. at 586 (officers questioned drunk driving suspect as
to whether he recalled the date of his sixth birthday).

(277.) Id. at 593 (quoting Brief for Petitioner at 21) (alteration
in original).

(278.) Id. at 593-94.

(279.) Id. at 595 (quoting Doe v. United States, 487 U.S. 201, 213
(1988)) (internal quotation marks omitted).

(280.) Pustilnik, supra note 226 (manuscript at 9-10).

(281.) In re Grand Jury Subpoena Duces Tecum Dated May 9, 1990, 741
F. Supp. 1059, 1069 n.19 (S.D.N.Y. 1990).

(282.) Id. at 1069.

(283.) Id.

(284.) Farahany, Incriminating Thoughts, supra note 15, at 366.
Professor Farahany correctly suggests that static information such as a
structural scan (like a CT scan) would be allowed because, in comparable
situations, “the Court has held that the privilege against
self-incrimination does not protect an accused from compelled submission
to physical testing for identifying evidence.” Id. at 370. This
Article also categorizes static information scans as brain reading (not
mind reading) evidence, which does not receive Fifth Amendment
protection. Professor Farahany’s analysis also suggests the same
result for functional scanning, such as the PET scan, of automatic brain
functioning (for instance, to see if the defendant’s brain is
functioning normally a few weeks after a traumatic event). See id. Here
I would distinguish not between automatic and non-automatic, but between
brain reading and mind reading. A test case might be an automatic
emotional response that we know is likely to be elicited from a certain
stimulus, though this may be an argument over the semantics of the word
“automatic.”

(285.) Id. at 406.

(286.) Id. Farahany’s suggested replacement is a spectrum that
includes “identifying, automatic, memorialized, and uttered”
types of evidence, with the “privilege against self-incrimination
most naturally protect[ing] a defendant from being compelled to utter
new evidence by which his condemnation will be secured.” Id. at
400.

(287.) See Farahany, Searching Secrets, supra note 15, at 1282.

(288.) One additional area that I do not reach here is the use of
brain data in determinations of future dangerousness. Predicting
dangerousness is fraught with difficulty, and is an area where we should
tread carefully. One review of the research comes to the conclusion that
neuroimaging may be an extra tool, but not a replacement tool, for
clinical evaluation. See J.W. Looney, Neuroscience’s New Techniques
for Evaluating Future Dangerousness: Are We Returning to Lombroso’s
Biological Criminality?, 32 U. ARK. LITTLE ROCK L. REV. 301, 307-08, 310
(2010).

(289.) For one of the few recent treatments, see Michael L. Perlin,
“And I See Through Your Brain”: Access to Experts, Competency
to Consent, and the Impact of Antipsychotic Medications in Neuroimaging
Cases in the Criminal Trial Process, 2009 STAN. TECH. L. REV. 4 (2009),
available at http://stlr.stanford.edu/pdf/perlin-and-i-see.pdf. Perlin
also notes that the issue of competency is underexamined in the
neuroimaging literature. Id. [paragraph] 4.

(290.) Note, Requiring a Criminal Defendant to Submit to a
Government Psychiatric Examination: An Invasion of the Privilege Against
Self-Incrimination, 83 HARV. L. REV. 648, 648 n.1 (1970) (“In the
great majority of cases, the defendant will assert any incompetency or
insanity ‘defense.’ However the state will sometimes raise the
issue of the accused’s incompetency.”).

(291.) 18 U.S.C. [section] 4241(a) (2006).

(292.) Id. [section] 4241(b).

(293.) Dusky v. United States, 362 U.S. 402, 402 (1960) (per
curiam).

(294.) See, e.g., People v. Tolefree, 960 N.E.2d 27 (Ill. App. Ct.
2011).

(295.) Nugent, supra note 220, at 277 (noting that “increasing
deference given during competency hearings to reliable and relevant
neuroimaging evidence … is emerging in modern case law.”).

(296.) Id. at 278.

(297.) Nugent reaches the same conclusion when she argues that
“[g]iven the shortcomings of traditional psychological testing,
particularly with respect to defendants who are skilled at disguising
their true mental status, utilizing an array of methodologies will (at
least sometimes) be the preferable approach in competency
determinations.” Id.

(298.) Estelle v. Smith, 451 U.S. 454, 468 (1981).

(299.) AM. BAR ASS’N, CRIMINAL JUSTICE STANDARDS 7-4.6,
available at www.americanbar.org/publications/criminal_justice_section_archive/crimjust_standards_mentalhealth_blkold.html.

(300.) In Delaware, for instance,

   No person who has been convicted of and imprisoned for any class A
   felony, felony sex offense or any felony wherein death or assault
   to a victim occurred shall be released from incarceration by the
   Parole Board until the Parole Board has considered a mental health
   evaluation of such person. The Parole Board, in its discretion, may
   request mental health evaluations on persons convicted and
   imprisoned for any offense not enumerated [in the code].

DEL. CODE ANN. tit. 11, [section] 4353(a) (2012).

(301.) Cf. Nugent, supra note 220, at 297 (“If a criminal
defendant introduces neuroimaging evidence during trial … it is
reasonable to wonder whether that same neuroimaging evidence could be
either resurrected itself or used as impetus for additional neurological
testing before the defendant’s release….”).

(302.) Charles R. Honts & Mary V. Perry, Polygraph
Admissibility: Changes and Challenges, 16 L. & HUM. BEHAV. 357,
357-58, 362 (1992).

(303.) Pub. L. No. 100-347, 102 Stat. 646 (1988) (codified at 29
U.S.C. [section][section] 2001-09 (2006)).

(304.) See Yvonne Koontz Sening, Note, Heads or Tails: The Employee
Polygraph Protection Act, 39 CATH. U. L. REV. 235, 236 (1989) (noting
exemptions from the Act for government employers, national defense and
security employers, the FBI, and employers involved in the manufacture,
distribution, or dispensation of controlled substances).

(305.) See, e.g., Meixner & Rosenfeld, supra note 162, at 153.

(306.) Farwell, supra note 151, at 144.

(307.) See generally infra Part V.C. Many researchers are
skeptical. For example, Tong and Pratte suggest that, “[g]iven the
conceptual challenges of developing reliable fMRI lie detection and the
fact that people can use countermeasures to alter their patterns of
brain activity, [it is] doubtful that the technology will progress to
being truly reliable” in the foreseeable future. Tong & Pratte,
supra note 136, at 503.

(308.) Murphy and Greely predict that there are “good reasons
to expect neuroscience-based mind reading to hit major technical
barriers before it reaches impressive levels of detail.” Murphy
& Greely, supra note 17, at 636.

(309.) Yuhas, supra note 177, at 2.

(310.) See, e.g., Tong & Pratte, supra note 136, at 498-99.
“In studies of higher-level cognition, predefined regions of
interest usually are not available, and multiple distributed brain areas
might be involved in the cognitive task.” Id. at 499.

(311.) See, e.g., Amber Dance, Notion in Motion: Wireless Sensors
Monitor Brain Waves on the Fly, SCI. AM., Jan. 27, 2012,
http://www.scientificamerican.com/
article.cfm?id=wireless-brain-wave-monitor.

(312.) Sergio Fantini et al., Monitoring brain activity using
near-infrared light, 33 AM. LAB. 15, 15 (2001); Gary Strangman et al.,
Non-Invasive Neuroimaging Using Near-Infrared Light, 52 BIOLOGICAL
PSYCHIATRY 679, 679 (2002).

(313.) Strangman et al., supra note 312.

(314.) Id. at 680.

(315.) Mateo Calderon-Arnulphi, Ali Alaraj, & Konstantin V.
Slavin, Near Infrared Technology in Neuroscience: Past, Present and
Future, 31 NEUROL. RES. 605, 606-07 (2009) (“For typical absorption
and scattering values of the human head and source-to-detector distances
of 4 cm, a depth of the brain cortex of at least .3 cm is
monitored.”).

(316.) Infrared technology is also the backbone for some popular
breathalyzer tools.

   An infrared breath testing instrument takes a picture/spectra of
   the alcohol present on the individuals [sic] breath. An infrared
   "picture" of an organic compound can positively identify the
   compound to the exclusion of all other organic compounds. A
   fingerprint identifying a person is analogous to an infrared
   "picture" identifying an organic compound. Not only can infrared
   technology be used to identify ethanol, it can also be used to
   determine how much alcohol is present in the breath sample. This
   quantitative ability of infrared technology is what makes infrared
   such a valuable tool to law enforcement. Infrared breath testing
   instruments have been used by law enforcement since the early
   1980's.

STEPHEN L. JONES, DRUNK DRIVING DEFENSE, 50 MASS. PRACTICE SERIES:
DRUNK DRIVING DEFENSE app. B-1 [section] 6.2 (2012).

(317.) Martha J. Farah et al., Brain Imaging and Brain Privacy: A
Realistic Concern?, 21 J. COGNITIVE NEUROSCIENCE 119 (2008).

(318.) Id. at 124.

(319.) Murphy & Greely, supra note 17, at 636.

(320.) Id. at 649.

(321.) GAZZANIGA, supra note 17, at 9 (quoting T.M. Preuss, The
Discovery of Cerebral Diversity, in EVOLUTIONARY ANATOMY OF THE PRIMATE
CEREBRAL CORTEX 138, 138 (D. Falk & K. Gibson eds., 2001)) (internal
quotation marks omitted).

(322.) Id. at 246 (quoting DENIS BRIAN, GENIUS TALK: CONVERSATIONS
WITH NOBEL SCIENTISTS AND OTHER LUMINARIES 376 (1995)).

(323.) FiveBooks Interviews: Sebastian Seung on Identity and the
Mind, THE BROWSER (2012),
http://thebrowser.com/interviews/sebastian-seung-on-identity-and-mind.
See generally SEBASTIAN SEUNG, CONNECTOME: HOW THE BRAIN’S WIRING
MAKES US WHO WE ARE (2012).

(324.) See also Tong & Pratte, supra note 136, at 503 (noting
“major concerns” with the reliability of prospective lie
detection technology, and the need for “[m]uch more research”
on its validity).

(325.) As one commentator puts it, at present we do not “have
the faintest clue about the biggest mystery of all–how does a lump of
wet grey matter produce the conscious experience you are having right
now, reading this paragraph? How come the brain gives rise to the mind?
No one knows.” Poole, supra note 80.

Table 1. Distinguishing Mind Reading and Brain Reading
                         What type of conclusion is to be made?

How is the brain data    Conclusion about mental    Conclusion about brain
collected?               functioning (that is,      tissue itself (that is,
                         "mind reading").           "brain reading").

Non-machine-aided (for   (1) Traditional mind       (2) Traditional brain
example, direct          reading (for example,      reading (for example,
observation by a human). assessing honesty by       autopsy by visual
                         looking into an            observation alone to
                         individual's eyes).        determine bullet
                                                    trajectory).

Machine-aided but not    (3) Machine-aided mind     (4) Machine-aided brain
neuroimaging (for        reading (for example,      reading (for example,
example, computer-       using a computer to        microscopic tissue
assisted assessment of   administer a               examination to determine
cognitive functioning).  neuropsychological         if cause of death was
                         battery of questions).     lead poisoning).

Machine-aided            (5) Machine-aided          (6) Machine-aided
neuroimaging methods     neuroimaging mind reading  neuroimaging brain
(for example, human      (for example, fMRI lie     reading (for example, an
assisted by fMRI, EEG,   detection or EEG memory    MRI scan to identify the
and so on).              detection).                location of a tumor).

Table 2: Scholars Scorecard for Fourth and Fifth Amendment Protection
Against Involuntary Mind Reading with Neuroimaging by the Government

Scholars are presented alphabetically by last name of first author.

Scholar                 Fourth Amendment Result     Fifth Amendment Result

Jody Barillare          --                          Brain fingerprinting
(Comment) (196)                                     protected because it
                                                    elicits "testimonial
                                                    psychological
                                                    responses." (197)

Mara Boundy             --                          Protected because fMRI
(Note) (198)                                        reveals contents of the
                                                    mind. (199)

Nita Farahany (200)     Should be protected         Uncertain: not protected
                        because of a privacy        if static, structural
                        interest in secrecy of      scanning of automatic
                        one's own thoughts. (201)   functioning; protected if
                                                    compelled utterance. (202)

William Federspiel      Unresolved because of       Uncertain because it is
(Note) (203)            shifting conceptions of     not clear if the evidence
                        what is considered a        is physical or
                        reasonable search. (204)    testimonial. (205)

Dov Fox (206)           --                          Protected because
                                                    defendant has a right to
                                                    silence--to control his
                                                    thoughts. (207)

Benjamin Holley (208)   Generally protected         Not protected if the
                        because there is a          technology poses no undue
                        reasonable expectation of   risk of harm, because
                        privacy and the devices     neuroscience lie
                        are not in general public   detection evidence is
                        use, but allowed in         physical, not
                        places where warrant        testimonial. (210)
                        requirements are relaxed,
                        such as borders and
                        airports. (209)

Matthew Baptiste        --                          Protected because it is a
Holloway (211)                                      physical invasion of
                                                    individual autonomy and
                                                    is testimonial. (212)

Aaron Hurd              --                          Not protected because the
(Note) (213)                                        procedure does not
                                                    require a deliberate,
                                                    controlled response and
                                                    the resulting evidence is
                                                    therefore
                                                    nontestimonial. (214)

Emily Murphy and        --                          Likely protected because
Hank Greely (215)                                   it will be considered
                                                    testimonial. (216)

John New (217)          Uncertain because the       Protected because it is
                        balancing of an             testimonial evidence.
                        individual's privacy        (219)
                        versus the government's
                        interest will be fact
                        specific. (218)

Kristen Nugent (220)    Protected because           Protected because the
                        neuroimaging would be too   purpose of the exam is to
                        great an interference       draw inferences about the
                        with the accused's bodily   mind's contents. (222)
                        autonomy. (221)

Michael Pardo (223)     Generally protected         Protected when "the
                        because the neuroscience    government compels the
                        tests are searches and      tests in order to obtain
                        one has a reasonable        evidence of the
                        expectation of privacy      incriminating
                        regarding one's brain       informational content of
                        states. (224)               subjects' propositional
                                                    attitudes." (225)

Amanda Pustilnik (226)  Protected, because there    Brain evidence protected
                        is a reasonable             when it reveals mental
                        expectation of privacy      content or knowledge.
                        for brain activity. (227)   (228)

Sarah Stoller and       --                          Uncertain, as it depends
Paul Root Wolpe (229)                               on how the Supreme Court
                                                    carries out its Fifth
                                                    Amendment analysis. (230)

Erich Taylor            --                          Protected because it
(Note) (231)                                        closely resembles police
                                                    interrogation and is
                                                    therefore testimonial
                                                    evidence. (232)