Just after 2 a.m. on the night of September 19, 1910, Clarence Hiller woke to the screams of his wife and daughter in their home at 1837 West 104th Street in Chicago. After a spate of robberies, residents of this South Side neighborhood were already on edge. Hiller, a railroad clerk, raced to confront the intruder. In the ensuing scuffle, the two men fell down the staircase. His daughter, Clarice, later recalled hearing three shots, followed by her mother screaming upstairs. Neighbors came running, but the man had fled the home, leaving a dying Hiller by his front door.
The unknown assailant didn’t make it far. Thomas Jennings, an African-American man who had been paroled six weeks earlier, was stopped a half-mile away wearing a torn and bloodied coat and carrying a revolver. But it was what he left behind that would be the focal point of his trial: a fingerprint on a freshly painted railing he had gripped to hoist himself through a window at the Hiller house. Police photographed the print and cut away the railing itself, claiming it would prove the identity of the burglar. In the eyes of the court, they were right; Hiller’s murder would lead to the first conviction using fingerprint evidence in a criminal trial in the United States. At times controversial, this method of solving cases endures more than a century later.
Not only has fingerprinting had staying power in the legal system, but the underlying method also remains fundamentally the same as when it was first introduced to American police departments. Prints are still evaluated based on the same descriptions of arches, loops and whorls written by Sir Francis Galton in the late 19th century. Further, the basic technique of collecting and comparing prints remains remarkably similar to what was applied to that rudimentary set discovered at the Hiller home.
Jennings’ defense attorneys raised questions about this new and little-understood technique, as well as whether such evidence could even be legally introduced in court (the first time it was used in Britain, they claimed, a special law was needed to make such evidence legal). The defense team even solicited prints from the public in an effort to find a match and disprove the theory that fingerprints were never repeated. A courtroom demonstration, however, backfired badly: defense attorney W.G. Anderson’s print was clearly visible after he challenged experts to lift the impression from a piece of paper that he had touched.
This made a distinct impression on the jury as well; they voted unanimously to convict Jennings, who was sentenced to hang. The Decatur Herald called it “the first conviction on finger-printing evidence in the history of this country,” adding with dramatic flourish that “the murderer of Hiller wrote his signature when he rested his hand upon the freshly painted railing at the Hiller home.”
It’s unclear to what degree Jennings’ race played a part in his trial. News reports at the time didn’t sensationalize race in their coverage, or even mention Jennings’ race. Yet it’s not hard to envision that a jury, presented with an unfamiliar technique, would have been more skeptical had the defendant been white.
The concept of identifying people by unique fingerprints, first laid out 18 years earlier in Europe, even had its origin in pseudoscientific racial beliefs. It was thoroughly studied and chronicled in Galton’s 1892 epic tome Finger Prints. (A cousin of Darwin, Galton had long focused on a series of experiments hoping to tie myriad personal and intellectual characteristics to physical traits and heredity.) Galton, who had also studied anthropometry in an effort to deduce the meaning behind physical measurements, did not find any major difference between races in his exhaustive collection of prints for research, though not for lack of effort. He wrote in Finger Prints that “it seemed reasonable to expect to find racial differences in finger marks, the inquiries were continued in varied ways until hard fact had made hope no longer justifiable.”
As journalist Ava Kofman recently outlined in the Public Domain Review, Galton’s pursuit of fingerprint science meshed well with the colonialist ideology of the time. “Fingerprints were originally introduced for Europeans to distinguish between the otherwise indistinguishable mass of extra-European peoples, who themselves produced ‘indecipherable’ fingerprints,” she wrote. Later in his career, according to Kofman, Galton would engage in quantifying racial differences, inventing “scientific,” numerical measurements to categorize humans by race.
Nonetheless, the system Galton outlined to identify unique characteristics proved effective and caught on quickly. Police in the United States were just beginning to emulate their European colleagues and started to gather prints for the purpose of identification in the early 20th century. During the 1904 World’s Fair in St. Louis, Scotland Yard sent representatives to host an exhibit demonstrating the technique, which was growing in popularity in British courts. Even Mark Twain was caught up in the speculation over how prints could be used to apprehend criminals, placing “the assassin’s natal autograph,” which is to say the “blood-stained finger-prints” found on a knife, at the center of the dramatic courtroom finale in his novel Pudd’nhead Wilson, published years before the Jennings case.
After Jennings’ conviction, however, lawyers mounted a challenge to the notion that such a newfangled and little-understood technique could be admitted in court. After more than a year in the appeals process, on December 21, 1911, the Illinois Supreme Court upheld the conviction in People v. Jennings, affirming that his sentence would be carried out soon after. The court cited prior cases in Britain and published studies on the subject to lend credibility to fingerprinting. Several witnesses in the Jennings trial, it pointed out, had been trained by the venerable Scotland Yard. “This method of identification is in such general and common use that the courts cannot refuse to take judicial cognizance of it,” the ruling stated.
Fingerprinting had thereby been “proclaimed by the Supreme Court of Illinois to be sufficient basis for a verdict of death by hanging,” the Chicago Tribune reported, and it was the beginning of a shift toward the largely unquestioned use of fingerprint evidence in courtrooms across the United States. “The Jennings case really is the earliest case – earliest published case – in which you’ll find any discussion of fingerprint evidence,” says Simon A. Cole, author of Suspect Identities: A History of Fingerprinting and Criminal Identification and professor of criminology, law and society at the University of California, Irvine School of Social Ecology. “So, in that sense it really is a precedent for the whole country.”
People v. Jennings further specified that fingerprint evidence was something the average juror could not evaluate without expert interpretation: “Expert testimony is admissible when the subject matter of the inquiry is of such a character that only persons of skill and experience are capable of forming a correct judgment as to any facts connected therewith.” The inclusion of this statement was crucial in legal terms: some level of human judgment and interpretation was a given, built into the courtroom process whenever fingerprint evidence was presented to a jury. The degree of subjectivity that represents, and what room for error, however small, is acceptable, remain actively debated more than a century later.
Beginning with the Jennings trial, two fundamental questions have formed the basis of any challenge to the admissibility of fingerprint evidence in court. Is the technique itself sound (the primary issue when it was first introduced)? And how accurate is the evidence when interpreted and applied to a specific case? “The uniqueness of fingerprints is really kind of beside the point of the accuracy of the identification,” says Cole. “The best way to understand that is to think about eyewitness identification – nobody disputes that all human faces are in some sense unique, even those of identical twins, but nobody reasons from that that eyewitness identification must be 100 percent accurate.” Juries like the one that convicted Jennings were initially focused on whether prints were ever repeated, “whereas really what we need to know is can people match them accurately.”
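Cole’s distinction can be made concrete with a little arithmetic. The sketch below uses purely hypothetical numbers (the article gives no error rate for fingerprint examiners) to show how a matcher that is wrong only rarely can still generate many false hits when a single crime-scene print is compared against a large database:

```python
# A minimal illustration of Cole's point: uniqueness does not imply
# accurate identification. The false-match rate below is hypothetical,
# chosen only to make the arithmetic visible.

def expected_false_matches(database_size: int, false_match_rate: float) -> float:
    """Expected number of wrong people flagged when one crime-scene
    print is compared against every print in a database, assuming
    each comparison independently errs at the given rate."""
    return database_size * false_match_rate

# Even if every print in the database is genuinely unique, a matcher
# that errs once per 10,000 comparisons yields about 100 false hits
# when searching a million-print database.
print(expected_false_matches(database_size=1_000_000, false_match_rate=1e-4))  # 100.0
```

The error, in other words, lives in the act of matching, not in the prints themselves, which is exactly the gap between uniqueness and accuracy that Cole describes.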
It is this gray area that defense attorneys seize on in thorny legal cases. Following the Supreme Court’s 1993 ruling in Daubert v. Merrell Dow Pharmaceuticals, Inc., judges were required to apply what is known as the Daubert standard to determine whether a witness’ testimony can be considered scientific. This is based on a list of factors, including how the technique itself has been tested, its error rates and what regulations govern its usage. These standards were more stringent than what had previously been required, putting the onus on judges to determine what could be presented to a jury as scientific evidence.