Forensic Tools: What’s Reliable and What’s Not-So-Scientific

For years, American TV shows have featured crime scene investigators using forensic evidence to solve grim murders. Often, however, these fictional CSIs present unrealistic portrayals of the capabilities of forensic science.
The reality is that not all forensic evidence is backed by rigorous scientific research – meaning it doesn’t always point to the person who “did it.” A landmark 2009 report by the National Academy of Sciences (NAS) highlighted the tools that work – and those that fall short. Here’s a sampling of the basics:
DNA Analysis Is the Gold Standard
In 1984, a British geneticist named Alec Jeffreys stumbled upon one of our most important forensic tools: DNA fingerprinting. Since his “eureka moment,” the technique has been used successfully to identify the perpetrators of crimes, clarify paternity and exonerate the wrongly convicted.
Today, the testing and analysis of DNA is considered the most reliable of all the forensic tools. Unlike many of the other techniques developed to meet the needs of law enforcement, it underwent rigorous scientific experimentation and validation prior to its use in forensic science.

“Among the biggest problems that we uncovered in the report is the absence of the application of scientific methodology to determine whether or not the discipline was valid and reliable as was done with DNA,” says Harry T. Edwards, a U.S. federal judge who co-chaired the NAS committee that produced the 2009 report. “DNA is really the only discipline among the forensic disciplines that consistently produces results that you can rely on with a fair level of confidence, when you’re seeking to determine whether or not a piece of evidence is connected with a particular source.”
In fact, DNA has called into question the reliability of other forensic sciences, says Innocence Project co-founder Peter Neufeld.
“When we looked at all the cases of people who have been exonerated by DNA evidence, we found that in 60 percent of those cases, experts who testified for the prosecution produced either invalid evidence or the misapplication of science in their testimony.”
Fingerprints Can Lie
For more than a century, fingerprints, palm prints and sole prints have been used as identification tools by law enforcement. Collectively known as “friction ridge analysis,” this forensic method involves examiners comparing the details of an unknown print with a set or a database of known prints. These details include ridges, loops, whorls and other points of similarity.
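To make the comparison process concrete, here is a deliberately simplified sketch of minutiae-style matching. It is a toy model, not an actual forensic tool: every feature type, coordinate and tolerance below is invented for illustration and comes from no real fingerprint system or standard. Each print is reduced to a set of feature points (type, position, ridge orientation), and the comparison counts how many points in the unknown print agree with points in a known print.

```python
import math

# Toy sketch of minutiae-style comparison. All features, coordinates
# and tolerances are invented for illustration only.

def count_agreeing_minutiae(unknown, known, dist_tol=10.0, angle_tol=15.0):
    """Count minutiae in `unknown` that agree with some minutia in `known`.

    Two minutiae "agree" if they share a type (e.g. ridge ending vs.
    bifurcation), lie within `dist_tol` units of each other, and their
    ridge orientations differ by no more than `angle_tol` degrees.
    """
    agreed = 0
    for u_type, ux, uy, u_angle in unknown:
        for k_type, kx, ky, k_angle in known:
            if u_type != k_type:
                continue
            if math.hypot(ux - kx, uy - ky) > dist_tol:
                continue
            # Compare orientations on a circle (0 and 359 degrees are close).
            diff = abs(u_angle - k_angle) % 360
            if min(diff, 360 - diff) <= angle_tol:
                agreed += 1
                break  # each unknown minutia counts at most once
    return agreed

# A latent (crime-scene) print and a candidate print: (type, x, y, angle).
latent = [("ending", 12, 40, 30), ("bifurcation", 55, 80, 110)]
candidate = [("ending", 14, 42, 35), ("bifurcation", 57, 78, 100)]

print(count_agreeing_minutiae(latent, candidate))  # -> 2
```

Note what even this toy version leaves undefined: how many agreeing points are enough to declare an identification. That is a question the field has never answered with validated research.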
Criminologists and law enforcement officials long swore that fingerprint identification was infallible and that an examiner could determine that a print comes from a single, unambiguous source. If an examiner has a whole, perfect print, they argued, an identification can be made reliably.
But recent errors have fueled a debate about the reliability of fingerprint forensic evidence, the most prominent being the case of Oregon lawyer Brandon Mayfield. After the March 2004 terrorist bombings in Madrid that killed almost 200 people, a partial print found on a bag of detonators was sent to the FBI. An examiner determined that the print belonged to Mayfield, who was later detained. In total, four fingerprint examiners – including one hired by Mayfield’s defense team – declared that his print matched the partial from Spain.

Except there was a problem: the print wasn’t Mayfield’s at all. Spanish officials matched the partial print to an Algerian man named Daoud Ouhnane. Mayfield later sued the government, which settled for $2 million.
“I knew that our profession had taken some sort of a quantum leap because suddenly there were new rules involved,” veteran fingerprint examiner Ken Moses told FRONTLINE. Moses was one of the four people who incorrectly matched Mayfield to the latent print.
According to the National Academy of Sciences, no peer-reviewed scientific studies have ever been done to prove the basic assumption that every person’s fingerprints are unique. Recent studies have also shown that fingerprint examiners can be influenced by contextual bias when comparing prints.
“Prior to Mayfield, there were some people in the fingerprint community who really were saying that something like Mayfield could never happen,” says Jennifer Mnookin, a UCLA law professor who is leading a federally funded study of error rates in fingerprint comparisons. “And so part of the problem here really is about hubris or over-claiming. It’s about a field that didn’t seem to feel a need to recognize its limits.”
FBI fingerprint expert Melissa Gische told FRONTLINE that, as a result of cases like Mayfield’s, she would no longer testify to a zero error rate for fingerprints in court.
Sometimes Bite Marks Bite Back
One of the most controversial forensic techniques is bite-mark comparison. Bite marks can change over time and be distorted by factors like swelling and healing. And as with fingerprint analysis, the assumption behind bite-mark comparison – that every person’s dental characteristics are unique – has not been adequately studied or scientifically scrutinized.

Attorneys for the Innocence Project say the scarcity of research backing up bite-mark comparison has played a role in a number of wrongful convictions over the years.
“There have been a number of people who were convicted based on bite-mark testimony who were sent to death row or sent to prison for life,” says Neufeld, who represented Levon Brooks and Kennedy Brewer, both wrongly convicted based in large part on faulty bite-mark testimony. “And in each of those cases, a whole group of forensic odontologists, forensic dentists said they were absolutely certain that this was the guy and they were absolutely wrong.”
Firearms, Bullets and Ballistic Identification
When shots are fired in the commission of a crime, a forensic expert is often brought in to study the trajectory of the bullets; later, he or she may testify in court as to the direction from which a bullet came and the firearm used in the crime.

The National Academy of Sciences recognized the logic involved in trying to compare firearms-related marks, noting that “although they are subject to numerous sources of variability, firearms-related tool marks are not completely random and volatile; one can find similar marks on bullets and cartridge cases from the same gun.” But the NAS also observed that “the validity of the fundamental assumptions of uniqueness and reproducibility of firearms-related tool marks has not yet been fully demonstrated.”
The report added that “a significant amount of research would be needed to scientifically determine the degree to which firearms-related toolmarks are unique or even to quantitatively characterize the probability of uniqueness.”
One fundamental problem with firearms analysis is the lack of a precisely defined process, the NAS found. An examiner may offer an opinion that a specific tool or firearm was the source of a specific mark when “sufficient agreement” exists between two sets of marks, but there is no precise definition of what constitutes sufficient agreement. The NAS also found there have been no scientific studies to answer questions regarding variability, reliability, repeatability, or the number of correlations needed to achieve a given degree of confidence.
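A toy calculation makes the problem concrete. In the hypothetical sketch below, every number is invented for illustration: the striation positions, the matching tolerance and both cutoffs come from no real casework standard. It shows how the same similarity score can yield opposite conclusions when “sufficient agreement” has no agreed-upon definition.

```python
# Toy illustration of the "sufficient agreement" problem. All striation
# positions, the tolerance and both thresholds are invented for
# illustration; none come from any real casework standard.

def similarity(marks_a, marks_b, tol=0.5):
    """Fraction of striation positions in marks_a that line up
    (within `tol`) with some position in marks_b."""
    hits = sum(1 for a in marks_a if any(abs(a - b) <= tol for b in marks_b))
    return hits / len(marks_a)

evidence_bullet = [1.0, 2.2, 3.3, 5.1, 7.8, 8.8]    # marks on the recovered bullet
test_fired_bullet = [1.1, 2.3, 4.4, 5.0, 6.9, 8.7]  # marks from a test fire

score = similarity(evidence_bullet, test_fired_bullet)
print(f"similarity score: {score:.2f}")  # -> 0.67

# With no agreed-upon definition of "sufficient agreement," two examiners
# can reach opposite conclusions from the same evidence:
for threshold in (0.6, 0.8):
    verdict = "sufficient agreement" if score >= threshold else "inconclusive"
    print(f"examiner using threshold {threshold}: {verdict}")
```

The point is not that examiners literally score marks this way, but that without a defined and validated threshold, the conclusion depends on where an individual examiner happens to draw the line.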
Matching Hair Is Not as Simple as Splitting Hairs
For years, forensic hair examiners have testified that physical characteristics of hairs can be identified and used to establish the presence, or absence, of certain people at a crime scene.
The problem? No scientifically accepted statistics exist about the frequency with which particular characteristics of hair are distributed in the population, according to the NAS. And there appear to be no uniform standards for the number of matching features that must be present before an examiner can declare a match.
Recent studies reveal that microscopic hair analysis is not yet a precise science. One FBI study, cited in the NAS report, found that of 80 hair comparisons deemed matches through microscopic examination, nine, or about 11 percent, actually came from different sources when reexamined through DNA analysis.
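The arithmetic behind that figure is simple enough to spell out. The counts below come straight from the study as cited in the NAS report; the rough confidence interval is added here for illustration (a standard normal approximation) to show how imprecise a rate estimated from only 80 comparisons is.

```python
import math

# Figures from the FBI study cited in the NAS report: of 80 hair
# comparisons "matched" microscopically, 9 were contradicted by DNA.
total_comparisons = 80
contradicted_by_dna = 9

error_rate = contradicted_by_dna / total_comparisons
print(f"observed error rate: {error_rate:.1%}")  # -> about 11.2%

# Rough 95% confidence interval (normal approximation), added here to
# illustrate the uncertainty in a rate estimated from just 80 cases.
se = math.sqrt(error_rate * (1 - error_rate) / total_comparisons)
low, high = error_rate - 1.96 * se, error_rate + 1.96 * se
print(f"rough 95% CI: {low:.1%} to {high:.1%}")  # about 4.3% to 18.2%
```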
For more, read this Washington Post investigation on how problematic hair and fiber analysis may have led to wrongful convictions.
Where There’s Smoke There’s Not Necessarily Fire
When a fire occurs, fire investigators are called to the scene to determine its cause and whether the blaze may have been set deliberately.
But according to the National Academy of Sciences, much more research is needed on the natural variability of burn patterns and damage characteristics, and on how they are affected by the presence of various accelerants. In fact, as FRONTLINE reported in the 2010 film Death by Fire, many of the supposed telltale signs of arson – the remnants of accelerant pour patterns, for example – can actually be caused by natural phenomena during accidental fires.
“The fire investigation community largely consists of people who are firemen. They’re not scientists,” arson expert John Lentini told FRONTLINE. “Extinguishing a fire and investigating a fire involve two different skill sets and two different mindsets.”
Another scientific expert, Gerald Hurst, offered a startling “devil’s advocate” opinion about the state of arson testimony in the courtroom: “I could take almost any fire and — if I were so inclined — convince a jury that it was arson. It’s frighteningly simple, frighteningly easy.”
Lentini’s website Scientific Fire Analysis hosts his breakthrough publications on arson science, including “The Lime Street Fire: Another Perspective,” [PDF] which contests traditional understandings of burn patterns through a series of tests.
Watch this excerpt from Death by Fire for more on the emergence of new fire science — and how it called into question the guilt of Cameron Todd Willingham, who was executed in 2004 after being found guilty of the arson-murder of his three young children.
There Is Solid Science Behind Drug Testing
Closely related to forensic toxicology, the analysis of controlled substances involves identifying chemicals that have a legally recognized potential for abuse. These include “street drugs” such as heroin and ecstasy, and prescription drugs like oxycodone.
Drug testing is the most frequent forensic function performed by publicly funded crime laboratories. Toxicologists analyze biological samples for the presence of these substances and determine whether the amounts found are above a harmful level; the results are used to make inferences about an individual’s death, illness, or mental or physical impairment. Like DNA analysis, the analysis of controlled substances is a mature forensic science discipline with strong scientific underpinnings, developed along the lines of classical analytical chemistry.
Thanks to rigorous scientific testing, the NAS report found, the uncertainties and potential errors in the analysis of controlled substances are adequately understood.
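As a purely hypothetical sketch of what “understanding the uncertainties” can mean in practice, the snippet below makes a threshold decision that accounts for measurement uncertainty. The substance concentration, threshold and uncertainty figures are invented for illustration and reflect no real laboratory standard.

```python
# Hypothetical threshold decision accounting for measurement uncertainty.
# All numbers here are invented for illustration, not real lab values.
measured_ng_per_ml = 62.0     # reported concentration of the substance
uncertainty_ng_per_ml = 5.0   # expanded measurement uncertainty
harmful_threshold = 50.0      # hypothetical "harmful level"

# A cautious lab reports "above the harmful level" only when the entire
# uncertainty interval clears the threshold.
if measured_ng_per_ml - uncertainty_ng_per_ml > harmful_threshold:
    print("Above the harmful level, even allowing for measurement uncertainty.")
else:
    print("Cannot conclude the level is harmful; the interval straddles the threshold.")
```

This kind of quantified, defensible reasoning, grounded in classical analytical chemistry, is precisely what the NAS found lacking in the pattern-matching disciplines discussed above.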