Studies published by the ACLU and the National Institute of Standards and Technology (NIST) highlight that facial recognition systems are more error-prone when matching images of people of color. The technology is most accurate on white faces, but it produces 10 to 100 times more false positives for Black and Asian faces. NIST also found that systems developed in the United States perform poorly on the faces of Native Americans. These findings come from a NIST study that evaluated 189 algorithms submitted by 99 developers, tested on more than 18 million images of more than 8 million individuals.
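To make that kind of disparity concrete, the sketch below shows how a false positive rate can be computed per demographic group when a face matcher compares pairs of different people against a single similarity threshold. This is a hypothetical illustration only, not the NIST protocol: the group names, scores, and threshold are all invented.

```python
# Hypothetical illustration of per-group false positive rates in face matching.
# The scores, group labels, and threshold below are invented for this example;
# they do not come from the NIST study or from any real system.

def false_positive_rate(nonmatch_scores, threshold):
    """Share of non-matching image pairs wrongly accepted as a match."""
    false_accepts = sum(1 for s in nonmatch_scores if s >= threshold)
    return false_accepts / len(nonmatch_scores)

# Similarity scores for pairs of *different* people, grouped by demographic.
nonmatch_scores_by_group = {
    "group_A": [0.12, 0.08, 0.31, 0.22, 0.05, 0.41, 0.18, 0.09, 0.27, 0.15],
    "group_B": [0.52, 0.61, 0.38, 0.47, 0.55, 0.43, 0.66, 0.49, 0.58, 0.35],
}

THRESHOLD = 0.50  # one operating point applied to everyone

for group, scores in nonmatch_scores_by_group.items():
    fpr = false_positive_rate(scores, THRESHOLD)
    print(f"{group}: false positive rate = {fpr:.0%}")

# With these invented numbers, group_A sees 0% false positives while group_B
# sees 50%: the same threshold yields very different error rates per group.
```

The point of the sketch is that a single accuracy figure hides the problem; error rates have to be broken out by group, which is exactly what the NIST evaluation did.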

It is this failure to account for these biases that made Robert Julian-Borchak Williams a victim. The picture is further distorted by the fact that DataWorks Plus, the provider of the artificial intelligence solution, integrates algorithms from subcontractors such as Japan's NEC and the American firm Rank One Computing that NIST has criticized for clear racial biases. Moreover, DataWorks Plus acknowledges that its own testing of these algorithms is cursory.

The American Civil Liberties Union has taken up the case, arguing that the Detroit police acted improperly: they relied on a flawed and racially biased facial recognition technology without taking reasonable steps to verify the information it produced.

The investigation began when five watches, worth about $3,800, were stolen from the upscale Shinola store in Detroit in October 2018. Investigators reviewed the security footage and identified a suspect: a Black man in a baseball cap and a dark jacket. In March 2019, according to the complaint, the Detroit Police ran a facial recognition search on a still from the surveillance video. The search matched the image to Mr. Williams's driver's license photo.

That match led to Williams's arrest outside his home last January, and he was held in police custody for a total of 30 hours. In one exchange during his interrogation, when his guilt could not be established, one of the officers conceded that "the computer must have gotten it wrong."

For their part, the vendors of the facial recognition technology involved in the case say they intend to build additional legal safeguards into how law enforcement agencies use the tools they supply. The Detroit Police now restrict the use of facial recognition technology to violent crimes and home invasions.

Source: ACLU
