By ANISHA BERMEJO
Several prominent social justice organizations agree: New York City Mayor Eric Adams’ “blueprint” to end gun violence is a way to “oversee and maintain social control of black bodies,” adding technology to the 2003 version of stop and frisk. With rising crime rates in New York City, Adams released his plan on Jan. 24, introducing artificial intelligence and other advanced technology to the New York City Police Department. The groups pushed back at a press conference on Feb. 3.
Adams’ plan includes equipping the NYPD with facial recognition technology and gunshot detection programs, an overall increase in mass surveillance that has drawn a whirlwind of disagreement from groups including the Legal Aid Society, the National Lawyers Guild and Amnesty International.
According to Jason Williamson, executive director of the NYU Center on Race, Inequality, and the Law, Black and brown communities would be disproportionately affected by the new technology, just as they were by the stop-and-frisk policy, which a federal judge ruled unconstitutional and race-based in 2013.
This AI technology has been shown to be faulty and racially biased, according to Jerome Greco, supervising attorney on the Legal Aid Society’s Digital Forensics team. “Studies show that it [facial recognition] is less accurate on people of color, women, and young people and transgender people,” he said.
The technology would join other forensic methods police already use, such as bite mark evidence, which teams at the Innocence Project have shown is not unique in the way DNA and fingerprints are. The Innocence Project is a 30-year-old organization that exonerates wrongly convicted people.
“The NYPD’s claims about wanting to establish positive ties with the community they allege to serve are inconsistent with the use of faulty tech on people who live or work there,” Greco said. Just this past year, Nijeer Parks, a 30-year-old Black carpenter in New Jersey, was wrongfully arrested on assault charges because of facial recognition software used by police.
Racial biases aside, the new technology would constitute a “violation of the right to privacy,” said Matt Mahmoud, Artificial Intelligence and Human Rights advisor at Amnesty International. The field of vision of the controlled cameras used by the NYPD spans an estimated one and a half to two blocks, according to Mahmoud. “A protester walking a sample route from Washington Square Park subway to Washington Square Park proper is potentially exposed to facial recognition for 100% of their journey…surveilled by up to four NYPD cameras at a time,” he said.
This level of surveillance in impoverished, majority-minority communities, the groups argue, will only deepen the distrust between the public and law enforcement.
“Americans love a technological fix and they will take one bit of anecdotal info and run with it because it allows elected officials to avoid responsibility for creating the conditions that are producing violence in our community,” said Alex Vitale, a Brooklyn College sociology professor. “It is a set of political slogans designed to appeal to people’s worst fears.”
As an alternative to the advanced technology, three participants at the press conference, Vitale, Christina Swarns of the Innocence Project and Erica Johnson of the National Lawyers Guild, all suggested “community policing,” in which officers work closely with community members to mend relations between damaged communities and the police, an approach they described as proven and successful across the country. They expressed dissatisfaction that Adams’ heavy reliance on new software was taking the city in the wrong direction.