Facial recognition software from Amazon used by some law enforcement agencies has shown inaccuracies, particularly when it comes to women of color. Sometimes the technology mistakes dark-skinned women for men.

An MIT report says Amazon's Rekognition misidentified women as men 19 percent of the time, and misidentified darker-skinned women as men 31 percent of the time in the study. The system worked fine when recognizing men; it's identifying women that became a glaring problem for the software.

To save face, the general manager of artificial intelligence for Amazon Web Services, Matt Wood, said the test results are based on facial analysis, not facial recognition. Wood said the latest test results assign gender attributes accurately aside from outlying factors like wearing glasses, wearing a mustache or being another gender.

"It's not possible to draw a conclusion on the accuracy of facial recognition for any use case - including law enforcement - based on results obtained using facial analysis," said Wood, who added that the study didn't use Amazon's latest version of Rekognition and said it found no false positive matches. "The results in the paper also do not use the latest version of Rekognition and do not represent how a customer would use the service today."