One of the great tragedies — and travesties — in our court system is the use of “junk science.” Tort reformers love to squawk that “junk science” lends unwarranted credibility to plaintiffs’ claims ranging from asbestos-caused cancer to exploding tires. Concern about this led the United States Supreme Court to rule, in a trilogy of cases — Daubert v. Merrell Dow Pharmaceuticals, 509 U.S. 579 (1993); General Electric Co. v. Joiner, 522 U.S. 136 (1997); and Kumho Tire Co. v. Carmichael, 526 U.S. 137 (1999) — that federal judges must serve as “gatekeepers,” screening scientific and technical testimony to see whether it makes scientific sense.
The Supreme Court held that proffered scientific evidence must satisfy five criteria:
1. Empirical testing: the theory or technique must be falsifiable, refutable, and testable.
2. The theory or technique must have been subjected to peer review and publication.
3. The theory or technique must have a known or potential error rate that is acceptable.
4. There must be standards and controls concerning how the test or measurement is conducted.
5. The theory or technique should be generally accepted by a relevant scientific community.
The result of this line of cases, according to a study by the RAND Corporation, is that many more civil plaintiffs are having their cases thrown out of federal court because their scientific evidence is excluded under Daubert.
Virginia, however, does not follow Daubert. Virginia courts exclude scientific evidence only when it fails the “general reliability” test enunciated in Spencer v. Commonwealth, 240 Va. 78, 97, 393 S.E.2d 609, 621 (1990).
When scientific evidence is offered, the court must make a threshold finding of fact with respect to the reliability of the scientific method offered, unless it is of a kind so familiar and accepted as to require no foundation to establish the fundamental reliability of the system, such as fingerprint analysis; or unless it is so unreliable that the considerations requiring its exclusion have ripened into rules of law, such as “lie-detector” tests; or unless its admission is regulated by statute, such as blood-alcohol test results.
For whatever reason (let’s not get started on politics here), standards such as those set out in Daubert or Spencer result in the exclusion of scientific evidence in civil cases, but almost never in criminal cases. Yet the news is full of stories of convictions obtained based on prosecutorial “junk science.” The evidence is strong that the state of Texas executed an innocent man when it killed Cameron Todd Willingham, who was alleged to have set fire to his own house, killing his children. The evidence of arson came from an arson investigator who testified that the burn patterns and the crazed glass indicated that a liquid accelerant had been used, and who concluded that Willingham set the fire. While the case was on appeal, forensic scientists determined that the burn patterns and crazed glass offered as “proof” of arson could perfectly well have been caused by a fire that spread without an accelerant, and that there was in fact no evidence of arson. Texas executed Willingham anyway. See the Wikipedia article for the story, with sources.
In recent days, Michael Morton (also in Texas) was released from prison 25 years after he was convicted of killing his wife. According to the story in The Texas Tribune:
Undigested bits of mushrooms and tomatoes from Christine Morton’s last meal — a celebratory birthday dinner she had with her husband — were still in her stomach when the medical examiner performed his autopsy in 1986.
Those remnants, the prosecutor told the jury during Michael Morton’s trial, “scientifically proved” that Morton had beaten his wife to death.
Twenty-five years later, DNA science revealed that someone else had actually killed Christine Morton and that her husband’s murder conviction and more than two decades in prison were a tragic mistake. His exoneration based on DNA evidence is the 45th in Texas.
In 2009, the National Research Council of the National Academy of Sciences’ Committee on Identifying the Needs of the Forensic Sciences Community published Strengthening Forensic Science in the United States: A Path Forward. A summary of the 352-page report can be found here. The report described serious problems with basic forensic science, including fingerprint analysis, arson analysis, bullet fragment analysis and many other forms of forensic evidence commonly used to convict people beyond a reasonable doubt in this country. The report made 13 recommendations for improving the forensic science system — most important among them that a National Institute of Forensic Science be created, charged with researching the causes of experimental and testing error, including human observer bias and other sources of human error in forensic examinations.
No progress has been made on these proposals. And although civil plaintiffs have a terrible time getting new kinds of scientific evidence admitted, the government in criminal cases continues to be able to convict on the strength of shaky scientific evidence, confidently articulated by police witnesses who lack enough grounding in the science of what they are doing to recognize the limitations of their knowledge.