Anna Fitzpatrick — July 23, 2013 @ 3:33 PM — Comments (2)
A review of more than 21,000 cases has revealed twenty-seven death penalty cases in which the FBI’s forensic experts may have exaggerated the scientific conclusions that could be drawn from a so-called “match” between a hair found at the scene of a crime and hair of the defendant. It is not known how many cases involve errors, how many led to wrongful convictions, or how many mistakes may now jeopardize valid convictions. Those questions will be explored as the review continues.
The discovery of more than two dozen capital cases suggests that the review could become a factor in the debate over the death penalty. Some opponents have long held that proof of the execution of an innocent person would solidify doubts about capital punishment. But if DNA or other testing confirms the convictions, it would strengthen supporters’ arguments that the system works.
At least three Florida men, including DNA exonerees Wilton Dedge and William Dillon, were convicted based on, among other things, testimony provided by John Preston, who claimed that his dogs could perform feats of forensic detection far beyond the abilities of other investigative dogs. Preston testified in each case that his dog picked up the scent of the defendant at the scene of the crime, testimony that all but sealed the defendants’ fates. By now, though, his claims have been thoroughly discredited by experts in the field of scent tracking, media reports, multiple state supreme courts, police training manuals, and law review articles. This raises the question of why Preston was ever considered reliable in the first place and why more was not done to re-review every case in which Preston and fraudulent dog handlers like him testified.
Advocates for defendants and the wrongly convicted called the FBI’s reexamination of possibly faulty forensic conclusions a watershed moment in police and prosecutorial agencies’ willingness to reopen old cases because of scientific errors uncovered by DNA testing. “We didn’t do this to be a model for anyone,” said FBI general counsel Andrew Weissmann. “When there’s a problem, you have to face it, and you have to figure out how to fix it, move forward and make sure it doesn’t happen again.” FBI Laboratory director Chris Hassell has said that the review will be used to improve lab training, testimony, audit systems, and research.
The review is a huge step toward improving the criminal justice system and the rigor of forensic science in the United States. Faulty forensic science is one of the leading causes of wrongful convictions, and a revised approach to forensics could help reduce the number of miscarriages of justice before they occur. Hopefully we’re not far off from reforms addressing other leading causes, such as eyewitness misidentification and snitch testimony.
Innocence Project of Florida, policy, Science, death penalty, DNA, FBI, forensics, John Preston
Susan — September 16, 2011 @ 4:06 PM — Comments (1)
Rick Perry, the Willingham case and the death penalty. Cameron Todd Willingham was convicted of murder and executed in Texas in February 2004. It was a grisly crime; Willingham’s three daughters died in a 1991 fire that investigators ruled arson. Rick Perry, the man who would be President, was the governor then as he is now. Although Perry has signed 234 death warrants (more than any other governor), this case stands out. Many believe Willingham should not have been convicted, let alone executed, because the conviction rested on flawed arson analysis – what some call “junk science.”
The Texas Forensic Science Commission was set to investigate the matter when Perry intervened, two days before the body was to hear expert testimony criticizing the handling of Willingham’s case. Not only did Perry replace Chairman Sam Bassett and three other members, but the Texas Attorney General has since limited the commission’s authority through a July ruling.
“At first, when I was replaced, I gave the governor the benefit of the doubt. But now that time has passed, I’ve seen this kind of endless drumbeat of strategies and actions to stop this investigation, and it’s been terribly disappointing,” said Bassett.
Perry and his campaign aides deny Bassett’s accusations, maintaining that Willingham was a “monster” who murdered his three children. Read more about this in Matt Smith and Ed Lavandera’s article at CNN Politics. Read more about Rick Perry and his embrace of the death penalty at The Crime Report (Bill Boyarsky).
The reconstituted Commission is set to meet next month to further examine the Willingham case and to begin a review of other cases that contain arson evidence. See Brandi Grissom’s article in The Texas Tribune. The Innocence Project asked commissioners to evaluate the actions of the Texas Fire Marshal’s Office. Stay tuned and remember – only in Texas!
The State of Georgia vs. Troy Davis. Troy Davis is due to be executed in Georgia on September 21. The case has garnered nationwide attention because of allegations that the case against Davis is, at best, seriously flawed. Not only have seven of nine eyewitnesses recanted, but no physical evidence ties the convicted man to the crime, although there is evidence of another perpetrator. Davis has served 20 years on death row for the 1989 murder of a Savannah police officer and continues to declare his innocence.
Three thousand religious leaders from all 50 states have asked the Georgia Board of Pardons and Paroles to halt the execution to investigate this case further. Learn how you can help at Forbes (E. D. Kain, Contributor). The Board will meet on Monday to decide Davis’ fate, and The Innocence Project encourages those interested to respond. We will keep you updated. Learn more by reading Emily Hauser’s piece in The Atlantic.
High Court halts another Texas execution.
MSNBC is reporting that the United States Supreme Court on September 15 halted the execution of Duane Buck by the State of Texas. There apparently is no question as to Buck’s guilt – he committed the double murder of his former girlfriend and her companion 16 years ago. He was arrested at the scene of the crime in an agitated state, and there were several reliable eyewitnesses, including his two children. The problem with this case is the validity of the sentence.
Buck’s attorneys allege that his case was “tainted by consideration of race” when a psychologist publicly testified that black criminals (Buck is black) were more likely to recommit violent acts in the future. The jury must consider the likelihood that the accused will be a continuing threat to society during its sentencing deliberations. We will keep you abreast of new developments.
judicial, justice, Science, Cameron Todd Willingham, death penalty, Duane Buck, eyewitness misidentification, forensics, Governor Rick Perry, junk science, justice, Sam Bassett, Supreme Court of the United States, Texas, Troy Davis
Lenore — November 17, 2009 @ 12:29 PM — Comments (1)
That was the title of the article from the Yale Daily News yesterday about the 1998 murder of a Yale senior, Suzanne Jovin.
On the night of Friday, December 4, Jovin had turned in her senior essay, volunteered at a Best Buddies event, and gone home to e-mail a friend before heading out to return the keys of a university car she had borrowed that night. While walking on campus, she encountered two classmates around 9:30 p.m. At 9:55, a 911 caller reported seeing a woman bleeding on a corner in a nice neighborhood a couple of miles from campus. The woman was Jovin, who had been stabbed 17 times in the head and neck.
Evidence included DNA scraped from under her fingernails, a Fresca bottle found at the scene containing a handprint, and a cigarette also found at the scene. Unfortunately, the DNA sample was contaminated by former lab technician Kiti Settachatgul, who was working at the forensic laboratory in charge of testing the evidence. Recent testing confirmed Settachatgul’s DNA in the fingernail scraping sample, making the results unusable in court. Now, a decade later, the investigation is moving on to test the other evidence found at the crime scene – particularly the print on the soda bottle.
From the original investigation, one of the prime suspects was James Van de Velde, Suzanne Jovin’s senior thesis advisor. Immediately after he was publicly named as a suspect, Van de Velde’s classes were canceled and he stopped teaching at Yale. The DNA testing found no evidence linking Van de Velde to the crime, effectively clearing him. Hopefully the other evidence can help to finally solve this case.
I think an important message from this case is the need for improvement within forensic laboratories. This case might well have been solved had the sample not been contaminated, and extensive measures should be taken to prevent such occurrences. All lab technicians should undergo specialized training for working within crime labs and should have background checks run to ensure their reliability. Care should also be taken that they have no personal involvement in the case, and that they don’t know what or whom they’re actually testing for. As outlined in our “Solving the Problem: Evidence Preservation” page, evidence can help to solve cases long after they’ve occurred. That’s why it’s so important to make sure those samples last – preventing contamination, storing them properly, and simply holding onto them at all – so that everything from a case is saved in case we need to test or retest it in the future.
justice, crime labs, forensic science, forensics
Lenore — July 21, 2009 @ 2:37 PM — Comments (1)
Lately we’ve been discussing a lot of junk science on this blog. Discrediting junk science seems to be a growing trend.
An opinion editorial on dallasnews.com says forensic investigations need to weed out the junk science procedures. It describes the case of Michael Blair, who spent 14 years on death row for murder due to unreliable hair analysis.
Hair-fiber analysis involves examining a hair under a microscope and comparing it to another (e.g., the hair of a suspect compared to a hair found at the crime scene). However, since the procedure’s inception there have been no set criteria for declaring an association, nor any proof that the method works. A New York Times article from 2001 shared information on a study of hair analysis. This statement grabbed my attention:
In the early 1970s, the federal government sponsored a proficiency-testing program for 240 laboratories. The labs did so poorly on hair analysis that flipping a coin would have saved a great deal of time, at no cost to accuracy.
Based on those results, you’d think the procedure would have been discontinued three decades ago. Still, it was in use in 2001, when the Journal of the American Medical Association published another study that came to the same conclusion. While the technique is not completely abandoned today, it has fallen out of favor, and even the FBI has rejected it, replacing it with hair DNA testing (which uses DNA extracted from the hair rather than visual comparison).
Microscopic hair analysis and Comparative Bullet Lead Analysis (CBLA) are both junk sciences that have been nearly obliterated, with dog scent identification on its way out. Along with the study on fingerprint analysis, the faults of forensics are being exposed and replaced with newer, more reliable methods. Hopefully in the near future we will have a forensic system that we can trust.
Science, forensic science, forensics, junk science
Lenore — July 20, 2009 @ 1:48 PM — Comments (5)
In today’s fingerprint world, scanners, not ink, collect prints, and gigantic automated databases — like the FBI’s Integrated Automated Fingerprint Identification System — spit out the closest matches in seconds. IAFIS has the prints of more than 55 million subjects in its Criminal Master File.
Still, television’s CSI this is not. Despite all that technology, it still falls to fallible human beings to step in, make the visual comparisons, and render the ultimate judgment calls on matches.
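A system like IAFIS can be thought of as a top-k similarity search: score every candidate record against the probe print, return the best-scoring candidates, and leave the final identification to a human examiner. Here is a minimal sketch in Python; the minutiae representation and scoring function are purely illustrative assumptions, far simpler than what a real AFIS matcher uses:

```python
from heapq import nlargest

def similarity(probe, candidate):
    """Toy score: fraction of shared minutiae points.
    (Hypothetical; real matchers use ridge flow, minutiae
    orientation, quality maps, and more.)"""
    shared = len(probe & candidate)
    return shared / max(len(probe), 1)

def top_matches(probe, database, k=3):
    """Return the k highest-scoring candidate records.
    A human examiner still makes the final call on each."""
    return nlargest(k, database.items(),
                    key=lambda item: similarity(probe, item[1]))

# Hypothetical minutiae sets keyed by subject ID.
db = {
    "subject_a": {(10, 4), (22, 7), (31, 9)},
    "subject_b": {(10, 4), (22, 7), (40, 2)},
    "subject_c": {(5, 5)},
}
probe = {(10, 4), (22, 7), (31, 9), (18, 3)}

for subject, _ in top_matches(probe, db, k=2):
    print(subject)   # best candidates, in descending score order
```

The point of the sketch is the division of labor: the database narrows millions of records to a handful of candidates in seconds, but the match itself is still a human judgment, which is where the bias problems discussed below come in.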
Grits for Breakfast published a post over the weekend pointing my attention to a story on the effect of unintentional bias on fingerprint analysis.
I had already considered fingerprint analysis a junk science. Human examination is the key method of matching fingerprints, and humans make errors, as previous cases and the article demonstrate.
An experiment by cognitive neuroscientist Itiel Dror and fingerprint examiner Dave Charlton gave five international fingerprint analysts the same fingerprints twice, with different information about the case each time. The test prints were slipped into their regular casework, so the examiners did not know when a test fingerprint would appear. Most of the test subjects changed their minds.
In Dror’s study with Charlton, the five experts were told they were seeing the erroneously matched fingerprints of Brandon Mayfield, the Oregon man once linked to 2004’s terrorist train bombings in Madrid. That he’d been wrongly accused based on fingerprint misidentification was widely known, thus suggesting what an expert determination would be.
What they were actually seeing, however, were fingerprints from other cases that they’d made determinations about years before.
Dror’s study confirms what we already knew – that humans are inconsistent and capable of making mistakes.
The article also discusses several cases in which fingerprint analysis led to wrongful accusation or conviction. In the case of Shirley McKie, four fingerprint experts claimed that a fingerprint found at a murder scene was hers; she was arrested but later acquitted when two other experts determined the print wasn’t hers. Brandon Mayfield spent two weeks in prison for the 2004 terrorist train bombings after FBI agents matched recovered fingerprints to him – a match later shown to be false.
We need to be careful of the investigation techniques we believe in. In this and many other procedures, we trust without question simply because they’ve been used for long periods of time. We always forget that mistakes can be made.
Science, fingerprints, forensic science, forensics, junk science
Ryan — April 30, 2009 @ 10:28 AM — Comments (0)
An article in the Houston Chronicle a few days ago told the story of Gary Alvin Richard, who was convicted of a rape and robbery in 1987. Richard has spent 22 years behind bars for what is now clearly a crime he did not commit. New blood-typing tests and recently unearthed prosecutorial misconduct (withheld evidence, in this case) solidify that conclusion.
Both sides are asking a judge to overturn his conviction.
A jury convicted Gary Alvin Richard in a 1987 attack on a nursing student in a trial based largely on blood-typing evidence from the Houston Police Department crime lab. But, prosecutors and the defense attorney agree, new tests completed Friday show that an [Houston Police Department] analyst misled jurors at Richard’s trial and failed to report evidence that may have helped him.
Based on the new tests, both sides will ask a judge next week to release Richard on bond while they sort out what happened in his case…
Richard’s case abounds with issues common to wrongful convictions. Among them:
- The victim identified him some seven months after the attack.
- HPD crime lab analysts came to conflicting conclusions about the evidence, but reported only the results favorable to the case.
- Physical evidence collected in what is known as a “rape kit” has been destroyed, a victim of poor evidence preservation practices, leaving nothing for DNA testing now.
Richard’s case is one of many that have come to light since the Houston Police Department initiated a review of past cases in October of 2007. That review was spurred “days after DNA evidence cleared Ronald Taylor of sexual assault in a case where HPD analysts performed faulty tests on body-fluid evidence.” Kudos to Houston for reviewing its past cases with a genuine desire for justice, but this episode also serves as a reminder of the importance of getting things right the first time.
Visit IPF’s Website by clicking here; sign up to volunteer by clicking here; contribute to our work by clicking here.
exoneration, crime labs, destruction of evidence, evidence, forensics, prosecutorial misconduct
Ryan — April 23, 2009 @ 10:57 AM — Comments (0)
Per the Innocence Project’s blog today, “A bill passed by the Texas Senate this week would provide an avenue for prisoners to challenge convictions based on discredited forensic science.” They reference a story in the Marshall News Messenger that begins,
Criminals who were sent to prison — or sentenced to death — based on discredited scientific evidence would be given a new way to challenge their convictions under a bill passed this morning by the Texas Senate.
In recent years, an increasing number of arson and gunshot convictions in Texas have triggered alarm as new technology proved earlier evidence wrong, and convictions were cast into doubt — including at least one case in which the prisoner was executed.
The measure by state Sen. John Whitmire, D-Houston, would allow discredited scientific evidence that figured in a criminal conviction to be considered by an appeals court in order to establish the innocence of a defendant…
Advancements in forensic testing — DNA, ballistics and arson — have led to new evidence being uncovered in several cases in Texas. Whitmire said that led him to file the bill, which clarifies how discredited scientific evidence can be used in court appeals.
In December 2008, we worked to overturn Jimmy Ates’ conviction, based largely on fraudulent FBI bullet lead analysis. Since then, the National Academy of Sciences has issued a scathing report, decrying the sorry state of forensic science labs around the country.
We know well how junk science can contribute to a wrongful conviction – indeed, the Innocence Project in New York says that junk science contributed to over half of the nation’s first 225 DNA exonerations. We applaud steps like these being taken in Texas, and hope that a new skepticism toward, and accountability for, forensic science will spread to other states and jurisdictions.
post-conviction, arson, bullet-lead analysis, CBLA, crime labs, death penalty, DNA, forensics, Jimmy Ates, junk science, legislation, National Academy of Sciences, wrongful conviction
Ryan — April 14, 2009 @ 10:02 AM — Comments (0)
The Innocence Project in New York recently released a report titled, “Investigating Forensic Problems in the United States: How the Government Can Strengthen Oversight through the Coverdell Grant Program.” From the executive summary:
In 2004, Congress established an oversight mechanism within the Paul Coverdell Forensic Science Improvement Grant Program, which provides federal funds to help improve the quality and efficiency of state and local crime labs and other forensic facilities.
[...] Nearly five years after Congress passed legislation to help ensure that forensic negligence or misconduct is properly investigated, extensive independent reviews show that the law is largely being ignored and, as a result, serious problems in crime labs and other forensic facilities have not been remedied. In short, the U.S. Department of Justice’s Office of Justice Programs (OJP), which is responsible for the program, has failed to make sure that even the law’s most basic requirements are followed.
Yesterday, the blog for TheHill.com paraphrased some of the results of this study and reiterated the Innocence Project’s call for increased oversight – or, rather, for the Obama administration to take fuller advantage of the grant program that Congress created five years ago. One particularly egregious fact they quote: only 13% of designated oversight entities meet the federal law’s forensic oversight requirements. If you were a defendant, would you want to take a roughly 1-in-8 chance that the forensic lab that processed the evidence in your trial was subject to proper oversight?
Finally, “Under new leadership, the Department of Justice can – and should – make sure crime lab problems are properly addressed, which will enhance the public safety and help prevent wrongful convictions.” Remember, working to correct problems in order to preclude wrongful convictions is cheaper than housing wrongfully incarcerated individuals.
No sooner had The Hill run this post than Grits for Breakfast published some presentations from the public meetings held by the National Academy of Sciences, meetings held to address the problems plaguing forensic science labs around the country. They link to this presentation in particular that calls for forensic tests “to be as blind as possible, for as long as possible,” and which contains the shocking graphic on common error rates linked above.
You’ll notice that firearms and fingerprints, while among the most reliable forensic testing methods, still yield erroneous conclusions around 1-5% of the time. Some toolmark and bitemark tests, meanwhile, are reliable less than half the time. The report also refers to several studies finding that when a scientist was provided with “context” for certain samples – context such as, “The suspect has already confessed; here’s his hair and a hair from the crime scene” – error rates were much higher. Those who conducted the psychological studies could induce false positives by giving false context, leading the forensic scientists to a conclusion before they reached one independently.
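The arithmetic behind that concern is worth making explicit: when a true match is rare, even a small false-positive rate means most reported matches are wrong. A minimal Bayes’-rule sketch in Python, using illustrative numbers of my own rather than figures from the studies above:

```python
def posterior_true_match(prior, sensitivity, false_positive_rate):
    """P(true match | examiner reports a match), by Bayes' rule.
    prior: probability the compared sample truly matches.
    sensitivity: P(reported match | true match).
    false_positive_rate: P(reported match | no true match)."""
    p_reported = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_reported

# Assumed, illustrative scenario: the true source is 1 of 1,000
# candidates, and the examiner is otherwise highly accurate.
prior = 1 / 1000
for fpr in (0.01, 0.05):   # the 1% and 5% error rates cited above
    p = posterior_true_match(prior, sensitivity=0.99,
                             false_positive_rate=fpr)
    print(f"FPR {fpr:.0%}: P(true match | reported match) = {p:.1%}")
```

Under these assumptions, a reported match is correct only about 9% of the time at a 1% false-positive rate, and about 2% of the time at 5% – which is why seemingly small error rates, let alone bias-inflated ones, matter so much.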
All of these scientific studies point to the sad state of the crime labs in this country. Scientists might think otherwise, but they are not immune to psychological tendencies – such as suggestibility – that afflict every human being. Independent oversight and common-sense reforms are the necessary solution to the problem.