JUDY WOODRUFF: A major look at the FBI’s handling of forensic evidence is under way.
Yesterday, the Justice Department announced that it will conduct a review of thousands of criminal cases, dating back to 1985, where hair and fiber analysis led to convictions. The examinations were detailed in a front-page article in today’s Washington Post.
Included were the stories of two men wrongfully convicted on flawed hair analysis. They have since been exonerated by DNA testing, one just yesterday.
The Justice Department review comes after The Washington Post identified those two men and others as part of a report on forensic errors by FBI labs.
National reporter Spencer Hsu wrote today’s piece. And he joins us now from the Post newsroom.
And so this investigation of forensics mainly was triggered by flawed hair analysis; is that right?
SPENCER HSU, The Washington Post: That’s right, Judy.
The concern has been building for decades, really, that hair and other forensic disciplines have not had the scientific research to validate or underpin their approach. In the case of hair, skeptics have raised the point that it might be subjective: two different examiners might describe the same hair in different ways, and the same examiner might describe it differently at different times.
There was no agreement on how many characteristics had to be alike for a match to be declared. There were no population studies or statistics to answer the critical question of how often the hairs of two different people might appear to be the same, or how often a given number of characteristics might match.
To resolve these questions, the FBI has long said that a hair match only shows that you can exclude two people. My hair might not look like your hair. But it could only say that a hair from a crime scene looks like mine or it could have come from me or someone in a class of folks like me.
The problem has been that, in trials, FBI examiners, other experts and prosecutors have been tempted to fill in the blank for juries: how likely is it, what are the odds? And they have gone on to say things like two in 10 million or two in 10,000, invoking bogus statistics to try to give an air of certainty. They have overstated the probative value of a match without the statistical backing for it.
And they did this at a time when their own studies were showing problems: DNA testing in 2002 showed that declared hair matches were wrong 11 percent of the time.
They switched their standard in 1996 to begin backing up hair comparison matches with DNA testing. Yet, at the time, they were warned that agents were exaggerating their testimony. There was no regular checking up on what agents were saying in court, or on documenting their findings in a way that would make them testable, provable and repeatable, as the scientific method requires.
JUDY WOODRUFF: Well, let me ask you, with so much at stake, how could the standards have been as loose as they were?
SPENCER HSU: It’s a difficult question.
I mean, repeatedly over time, each technique has been brought in as an improvement on the last. Fingerprints started as a reform, I think in the 1920s, a way to apply solid quantitative physical science to the job of catching criminals. But it was not until DNA arose and was validated through scientific and medical practice that scientists had developed a statistical basis for saying how often someone's DNA would match at a given number of points.
When they compared the rigor of DNA to things like fingerprint science or even hair science, they found that the same type of research hadn't been done. So, at the time, we hear from agents and examiners, this was the best we had; we thought it was good. It was only in 2002 that they realized they might be wrong 11 percent of the time.
The caveat is that what we're finding in these cases, dating to the late 1970s and early '80s, is that even at that time the testimony went beyond what was scientifically acceptable. And I think that's what has finally prompted action. I say finally because defense lawyers, and the Innocence Project in particular, have been working on this for many years, trying to show the problem through cases and examples.
And it looks like the policy-makers now have acknowledged that there was something to their concerns.
JUDY WOODRUFF: Can you just briefly give us an example of someone who was wrongfully convicted?
SPENCER HSU: You know, we have a couple in the District.
We wrote about Kirk Odom yesterday. Prosecutors apologized to him and said he should be exonerated. He was convicted of a 1981 rape on the basis of FBI testimony that his hair was found in the victim's bedclothes. Six weeks after the crime, the victim picked his picture out of a photo array.
And he was included in the photo array because a police officer talking to him about an unrelated matter thought he looked like the composite sketch. It turned out DNA testing showed it wasn’t his hair. Further, more precise DNA testing showed that only one man could have left the stains recovered from the crime scene. That man wasn’t Kirk Odom.
It was a convicted sex offender, it turns out. The examiner's and the prosecutor's statements were overstated or turned out to be in error. And this was the third such case: three different men, implicated by three different FBI hair examiners, in the District alone, the nation's capital of just 600,000 people, three men who spent between 20 and 30 years each in prison on the basis of flawed evidence.
JUDY WOODRUFF: And very quickly, the type of crimes we’re talking about, murder, rape?
SPENCER HSU: Hairs tend to be taken from violent crimes, like murder and rape. The penalties involved tend to be longer-term, 20 or 30 years, with implications like lifelong parole or sex offender registry issues.
JUDY WOODRUFF: And of the thousands of cases that they are now going back and looking at again, is there some sense of how many mistakes, how many cases may be overturned — convictions, rather?
SPENCER HSU: You know, the last time there was a review like this, they looked at 7,000 cases handled by 13 agents, including one hair and fiber examiner who alone handled 3,000 cases.
The majority are not positive findings; maybe only a quarter, about 20 percent, result in convictions. Of those, for many people there was other evidence, or the individuals had committed other crimes. We found that there were 250 cases that were still questionable, where the forensic work was sent out for retesting.
One of our D.C. cases was a man who was never notified of this, and we found that many of those 250 people were never notified. We're looking at a universe of potentially scores, if not hundreds, of candidates. How many might turn out to be wrongfully convicted is anyone's guess, but we have found three in the District alone, with its 600,000 people. In a nation of 300 million, there are likely more.
JUDY WOODRUFF: Spencer Hsu, reporter, some fine reporting, Spencer, at The Washington Post. Thank you.
SPENCER HSU: Thank you.