
New Facial Recognition Software Predicts You’re a Criminal Based on Your Face

NIST computer scientist Ross Micheals demonstrates a system for studying the performance of facial recognition software programs. (Courtesy of Wikimedia Commons)

(By: Alan MacLeod, MintPress News) A team from Harrisburg University in Pennsylvania has developed automated facial recognition software that, they claim, can predict with 80 percent accuracy and “no racial bias” whether a person is likely to become a criminal, purely from a picture of their face. “By automating the identification of potential threats without bias, our aim is to produce tools for crime prevention, law enforcement, and military applications,” the team said, adding that they were seeking “strategic partners” to help implement their product.

In a worrying choice of words, the team’s own press release slides from describing those the software flags as “likely criminals” to simply “criminals” in the space of a single sentence, suggesting a confidence ill-suited to what looks like the discredited, racist pseudoscience of phrenology updated for the 21st century.

Public reaction to the project was less than enthusiastic, judging by comments left on Facebook, which included “Societies have been trying to push the idea of ‘born criminals’ for centuries,” “and this isn’t profiling because……?” and “20 percent getting tailed by police constantly because they have the ‘crime face.’” Indeed, the response was so negative that the university pulled the press release from the internet, though it remains visible via the Internet Archive’s Wayback Machine.

While the research team claims to remove bias and racism from decision-making by leaving it to a faceless algorithm, the people who write the code, and the people who decide what counts as a crime in the first place, certainly have biases of their own. Why are the homeless or people of color who “loiter” on sidewalks criminalized, but not senators and congresspersons who vote for wars and regime change operations? And who is more likely to be arrested: Wall Street executives doing cocaine in their offices, or working-class people smoking marijuana or crack? The higher a person’s position in society, the more serious and harmful their crimes can become, yet the likelihood of arrest and a custodial sentence decreases. Black people are more likely than white people to be arrested for the same crime, and they are sentenced to longer prison terms, too. Furthermore, facial recognition software is notorious for its inability to tell people of color apart, raising further concerns.

Crime figures are greatly swayed by whom the police choose to follow and what they decide to prioritize. For example, a recent study found that 97.5 percent of Brooklyn residents arrested for breaking social distancing laws were people of color. Meanwhile, an analysis of 95 million traffic stops found that police officers were far more likely to stop Black drivers during the daytime, when race can be determined from afar. As soon as dusk hit, the disparity greatly diminished, as a “veil of darkness” saved them from undue harassment, according to the researchers. Thus, the population of people convicted of crimes does not necessarily correspond to the population that commits them.
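For readers curious how such a “veil of darkness” comparison works, here is a minimal sketch in Python. All of the stop counts are invented for illustration; the published analysis is far more elaborate, exploiting the fact that dusk shifts across the year to compare stops at the same clock time in light and in darkness, holding driving patterns roughly constant.

# Hypothetical illustration of a "veil of darkness" test.
# All counts below are invented, not taken from the study.

daylight_stops = {"black": 320, "other": 680}   # invented counts
darkness_stops = {"black": 255, "other": 745}   # invented counts

def black_share(stops):
    """Fraction of stopped drivers recorded as Black."""
    return stops["black"] / (stops["black"] + stops["other"])

light = black_share(daylight_stops)
dark = black_share(darkness_stops)
print(f"Black share of daytime stops:   {light:.1%}")
print(f"Black share of nighttime stops: {dark:.1%}")
print(f"Daylight gap: {light - dark:.1%}")
# A share that falls after dark, when officers cannot see drivers,
# is the signature of race influencing stops made in daylight.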

The 2002 hit movie Minority Report is set in a future where the government’s pre-crime division stops all murders well before they happen, with future criminals locked up preemptively. Even taking the developers’ claim at face value, is 80 percent accuracy worth risking the creation of a dystopian Minority Report-style society where people are monitored and arrested for pre-crimes?
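It is worth spelling out what 80 percent accuracy would mean at population scale. The short Python sketch below uses an assumed crime prevalence of one percent, an illustrative figure not drawn from the article or the Harrisburg team, and treats 80 percent as both the true-positive and true-negative rate; the arithmetic shows how false positives swamp true ones whenever the predicted behavior is rare.

# What "80 percent accuracy" can mean at population scale. The one
# percent prevalence is an assumption for illustration only, not a
# figure from the article or from the Harrisburg team.

population = 1_000_000
prevalence = 0.01      # assumed: 1 in 100 people will commit a crime
accuracy = 0.80        # treated as both true-positive and
                       # true-negative rate, for simplicity

criminals = population * prevalence
innocents = population - criminals

true_positives = criminals * accuracy          # criminals flagged
false_positives = innocents * (1 - accuracy)   # innocents flagged

flagged = true_positives + false_positives
precision = true_positives / flagged
print(f"Total people flagged: {flagged:,.0f}")
print(f"Flagged who would actually offend: {precision:.1%}")
# Under these assumptions, roughly 96 percent of the people the
# software flags are innocent.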

Phrenology, the long-abandoned study of the size and shape of the head, has a long and sordid history of dangerous racist and elitist pseudoscience. For instance, Cesare Lombroso’s 1876 book, Criminal Man, told students that large jaws and high cheekbones were a feature of “criminals, savages and apes,” and a sure sign of the “love of orgies and the irresistible craving for evil for its own sake, the desire not only to extinguish life in the victim, but to mutilate the corpse, tear its flesh, and drink its blood.” Rapists, he claimed, nearly always have jug ears, delicate features, swollen lips, and hunchbacks. Lombroso was a professor of psychiatry and criminal anthropology, and his book was taught in universities for decades. To Lombroso, it was almost impossible for a good-looking person to commit a serious crime.

The latest technological development from the University of Harrisburg appears to be an updated “algorithmic phrenology,” repackaging a dangerous idea for the 21st century, all the more noteworthy because its developers are trying to sell it to law enforcement as an unbiased tool for helping society.

Alan MacLeod is a Staff Writer for MintPress News. After completing his PhD in 2017, he published two books: Bad News From Venezuela: Twenty Years of Fake News and Misreporting and Propaganda in the Information Age: Still Manufacturing Consent. He has also contributed to Fairness and Accuracy in Reporting, The Guardian, Salon, The Grayzone, Jacobin Magazine, Common Dreams, the American Herald Tribune and The Canary.


Citizen Truth republishes articles with permission from a variety of news sites, advocacy organizations and watchdog groups. We choose articles we think will be informative and of interest to our readers. Chosen articles sometimes contain a mixture of opinion and news; any such opinions are those of the authors and do not reflect the views of Citizen Truth.
