Panel advances bill to restrict the use of facial recognition technology in Colorado | Legislature

On Wednesday, a Senate committee unanimously advanced a bill that would restrict the use of facial recognition technology by Colorado government agencies, law enforcement and schools.

If passed, Senate Bill 113 would lay down several restrictions and rules for state agencies' and law enforcement's use of artificial intelligence services to identify individuals. The bill would also ban the use of facial recognition technology in public and charter schools entirely until 2025.

The bill's sponsor, Sen. Chris Hansen of Denver, said the measure aims to slow down and re-evaluate the state's use of facial recognition technology because of its disproportionate misidentification of people of color.

“Sure, there are a lot of great things that this technology can and does do for us … but there are also some real drawbacks,” Hansen said. “This is not a bill that is designed to halt progress or reduce the proper use of this technology, but we need to think carefully about how we apply it, especially in public and with law enforcement.”

Numerous studies have found racial bias in facial recognition technology. A 2018 study from the Massachusetts Institute of Technology found the technology had an error rate of 34.7% for dark-skinned women, compared with 0.8% for light-skinned men. Similarly, a 2019 federal study found that Asian and African American people were up to 100 times more likely than white men to be misidentified by facial recognition technology.

While committee members unanimously backed the bill on Wednesday, several people spoke against it during public testimony at the meeting. Most of the opposition came from law enforcement.

“These proven and ever-improving technologies allow peace officers investigating criminal incidents to reduce crime and protect civil liberties by focusing their efforts only on individuals with a much higher likelihood of involvement,” said David Shipley, executive director of a consortium of 86 Colorado law enforcement agencies.

Under the bill, law enforcement would be prohibited from using facial recognition technology to establish probable cause, identify a person from a police sketch, or create a record of a person's activities protected by the First Amendment. Law enforcement would also need special authorization to use facial recognition for ongoing surveillance or for real-time tracking or identification.

Government agencies that use facial recognition technology would have to file notice with their oversight body, explain why the technology is being used, produce an accountability report, test the equipment, and subject any decisions resulting from the technology to human review.

The County Sheriffs of Colorado and the Security Industry Association also spoke out against the bill, with the latter criticizing its temporary ban on facial recognition technology in schools.

“There is no excuse for banning its use in schools. It could prohibit some potentially life-saving security applications,” said Jake Parker, the association’s director of government relations. Parker said schools use facial recognition technology to check visitors against lists of people barred from entering, and that it could help prevent shootings.

The bill would also establish a task force, operating until September 2032, responsible for studying issues related to the use of artificial intelligence. The task force's findings would inform decisions about the use of facial recognition technology in schools after the bill's ban expires in 2025, Hansen said.

While it is unclear whether any Colorado schools currently use facial recognition technology, public schools elsewhere have used it for discipline, such as identifying on security video students who skip class or break rules.

On Wednesday, several technology experts spoke in support of the bill, including members of the University of Colorado's artificial intelligence program.

“Facial recognition is used in our schools and in our communities, almost always with good intentions, but certainly not always with good results, and sometimes even with harmful ones,” said Christine Chang, a computer science student at the university. “I would like to remind you of Robert Williams, Michael Oliver and Nijeer Parks, all of whom were wrongfully arrested based on investigative leads generated by facial recognition software.”

Over such concerns, cities including San Francisco, Boston and Portland have banned police and local agencies from using facial recognition technology. The proposed Colorado bill would not ban the technology, but would place strict limits on it.

University of Colorado law professor Margot Kaminski, who studies comparative artificial intelligence law, said the proposed bill would set the “gold standard” for regulating the use of facial recognition technology.

“The use of (artificial intelligence), especially by the government, creates certain risks,” Kaminski said. “Risks of error, risks of bias and discrimination, and significant harms to individual liberty, including things like monitoring protesters, incentives to build out surveillance infrastructure, or co-opting private surveillance systems for use by law enforcement.”
