More facial recognition technology reported in non-white areas of NYC: Amnesty International

According to a new study by the human rights organization Amnesty International, areas of New York City with higher concentrations of non-white residents have more CCTV cameras capable of identifying individuals.

“Our analysis shows that the NYPD’s use of facial recognition technology is helping to reinforce discriminatory policing against minority communities in New York,” Matt Mahmoudi, an artificial intelligence and human rights researcher at Amnesty International, said in a statement to ABC News.

“The shocking reach of facial recognition technology in the city is subjecting entire neighborhoods to mass surveillance,” he added. “The NYPD must now disclose exactly how this invasive technology is being used.”

Speaking about facial recognition technology, New York City Police Deputy Commissioner John Miller told ABC News that victims of violent crime in the city are “mostly” people of color.

“They not only deserve, but demand, that police respond to reports of crime and apprehend the perpetrators,” Miller said.

Amnesty International’s findings are based on crowdsourced data from the Decode Surveillance NYC project, which mapped more than 25,500 CCTV cameras across New York City. The data were collected between April 14, 2021, and June 25, 2021.

The goal of the project was to map surveillance cameras in New York and determine where people are most likely to be tracked using facial recognition technology (FRT). Amnesty International then worked with data scientists to compare that data with statistics on police stop-and-frisk encounters and with demographic data.

The stop-and-frisk policy allows officers to stop, question and pat down anyone deemed suspicious.

The study found that residents of areas more densely covered by CCTV cameras are also at greater risk of being stopped and frisked by police. Some have criticized the tactic as discriminatory. In 2019, 59% of those stopped by police under stop-and-frisk were black and 29% were Hispanic, according to the New York ACLU, citing NYPD data.

According to data collected by the US Census Bureau in July 2021, 24.3% of New York City residents were black and 29.1% were Hispanic.

In a statement to ABC News, Miller said stop-and-frisks have “decreased by more than 90% in eight years.”

“In numbers, the much smaller number of stops they still make is based on descriptions of people who have been victims of crime, who are most often members of the community where they stay,” he said.

Miller added that such stops have contributed to the NYPD’s current level of gun arrests, “the highest levels in 25 years,” he said, “which is very important because homicides have doubled and shootings have doubled.”

However, activists are concerned that invasive surveillance and facial recognition technology threaten privacy and disproportionately harm black and brown communities. Mahmoudi called the spread of video surveillance a “digital stop-and-frisk.”

The NYPD used FRT in at least 22,000 cases between 2016 and 2019, Amnesty International said, citing records that STOP, a surveillance oversight nonprofit, was able to obtain from the NYPD through the city’s Freedom of Information Act.

“I’m not surprised that surveillance technology hits, yet again, the same communities that have already been the main targets of law enforcement, and of the New York City police in particular,” Daniel Schwartz, a privacy and technology strategist at the ACLU, told ABC News.

“This is a very invasive, harmful technology. It poses an unprecedented threat to privacy and civil liberties,” Schwartz said. “We are calling for a ban on this technology because we do not see how it can be used safely, given its impact on civil rights and civil liberties.”

The criticism comes after New York City Mayor Eric Adams said he would expand the NYPD’s use of technology, including FRT.

“We will also move forward on using the latest technology to identify problems, follow up on leads and collect evidence, from facial recognition technology to new tools that can detect people carrying guns. We will use every available method to keep our people safe,” Adams said at a briefing in January.

Adams’ office did not respond to ABC News’s request for comment.

The NYPD has been using FRT since 2011 to identify suspects whose images were “captured by cameras during robberies, thefts, assaults, shootings and other crimes,” according to the NYPD website. However, the agency says that “an identification match does not establish probable cause for an arrest or a search warrant, but serves as a basis for additional investigative steps.”

Robert Boyce, a retired NYPD chief of detectives, said the department has strict guidelines for using facial recognition technology. He said no one is allowed to use the technology without a case number and a supervisor’s approval.

“It’s a high bar to be able to use it, and it should be,” Boyce, who retired in 2018, told ABC News. “We are not using this for anything other than a criminal investigation, and we have written a very strict policy on it because it has been under scrutiny from many people.”

The quality of CCTV footage is often not good enough for police to use it to identify individuals, Boyce said, based on his time in the department. He said police were more likely to use social media accounts to find images of the people they were looking for than to run an FRT search.

According to Boyce, images from social media accounts are often of better quality and therefore more useful for getting accurate results from facial recognition software. Police use FRT as a way to help them find someone, but they still need a photo array or lineup to identify the subject in order for it to be admissible in court, he said.

“I can’t tell you how important this is. Our closure rates have risen significantly because we are doing this now,” Boyce said of FRT. “I think it’s a huge help for us. But like everything else, this can be abused, and you have to stay on top of it.”

“If I had to put a number on it, I would say they have risen by about 10%,” Boyce said of the department’s closure rates. Closure rates refer to the proportion of cases the department is able to close.

Boyce argued that FRT should be adopted in more states and used more widely across the country, with federal guidelines for its use.

According to the U.S. Government Accountability Office, 18 of the 24 federal agencies surveyed reported using FRT systems in fiscal year 2020 for purposes including cybersecurity, domestic law enforcement and surveillance.

Alongside the study, Amnesty International has also created a new interactive website that details the potential impact of FRT. Users can see how much of any walking route between two locations in New York City may be covered by facial recognition surveillance.

Amnesty International said areas around the 2020 Black Lives Matter protests had higher levels of FRT exposure.

“When we looked at the routes people would take between the protests and nearby subway stations, we found almost complete surveillance coverage by government-owned CCTV cameras, mostly NYPD Argus cameras,” Mahmoudi said.

“The use of mass surveillance technology at protest sites is being used to identify, track and harass people who are simply exercising their human rights,” Mahmoudi said, calling it a “deliberate tactic of intimidation.”

He added: “Banning facial recognition for mass surveillance is a much-needed first step toward dismantling racist policing.”

The NYPD responded that it did not control where the protesters went.

“We did not choose the routes the demonstrators took, nor could we control the routes they took,” Miller said in response to Amnesty International’s claims.

“There was no facial recognition scanning of the demonstrations,” Miller said.

“Facial recognition tools are not connected to these cameras,” Miller said. “In the cases where facial recognition tools were used, it would be because there was an assault on a police officer or serious property damage and there was a viable image that could be run against mug shot photos.”

The ACLU has also called for a ban on government use of facial recognition and other biometric surveillance of the public, Schwartz said.

“Any surveillance technology can have a chilling effect on how people participate and how they exercise their rights to free speech. It’s scary to think about how protests can be monitored,” Schwartz said. “I think there need to be clear guardrails around its use.”

Miller, the New York deputy police commissioner, said Amnesty International’s study does not tell the full story of how FRT is used.

“Amnesty International has carefully selected data points and made allegations that are, at best, taken out of context and, at worst, deliberately misleading. In describing how the NYPD uses ‘artificial intelligence,’ the report presents only artificial information,” Miller told ABC News.

Last year, Amnesty International sued the NYPD after it refused to disclose public records about its acquisition of facial recognition technology and other surveillance tools. The case is ongoing.
