SEATTLE — Amazon’s announcement of new enhancements to its “Rekognition” facial recognition product is drawing concerns from privacy advocates.
The company announced this week that the Rekognition product in Amazon Web Services can now detect fear on the faces of people in images. That's on top of seven other basic emotions – happy, sad, angry, surprised, disgusted, calm and confused.
Shankar Narayan, of the ACLU of Washington, said this kind of facial recognition technology raises serious concerns about how it will be used and about privacy.
An ACLU post called for a moratorium on law enforcement use of facial recognition until its impacts have been more fully discussed.
“It’s very hard to change your face – your face is your face,” Narayan said.
“I think what we really need is a conversation that looks back at history, looks at the way surveillance technology has been used for a very long time, which impacts groups that are already marginalized or disproportionately impacted, and think about how that impacts our democracy," said Narayan.
Amazon touted many uses for the tech on its website – identifying elements in a scene for searches, transcribing text in images, and monitoring unsafe content online. The company also said it could be used to comb social media to find missing persons.
But the ACLU is concerned about other possible uses, especially on people from diverse backgrounds and cultures whose expressions might not be interpreted accurately.
“I think what it illustrates is these AI-based determinations about us, that are sort of made in a black box, with data we can’t access, through a decision-making process we can’t understand, are going to be used in public to make important decisions about people’s lives,” said Narayan. “Imagine a police body camera that has emotion detection that can tell an officer whether someone is aggressive or fearful – they may need to use that information to make a split-second decision about whether to use deadly force.”
The ACLU recently used the Rekognition tool to compare members of Congress to a database of mugshots, and said it found 28 false matches.
Amazon contended that the ACLU did not use its tool correctly.
“The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines,” a spokesperson wrote. “As we’ve said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking. We continue to advocate for federal legislation of facial recognition technology to ensure responsible use, and we’ve shared our specific suggestions for this both privately with policymakers and on our blog.”
Narayan responded that the ACLU used the tool the way any other user could.
Amazon said its only law enforcement customer is the Washington County Sheriff’s Office, near Portland, Oregon.
Narayan still voiced concerns about such powerful technology existing without regulation to mitigate its risks.
“For me, I think I signed up to live in a free state, free country, where I’m not tracked as I move around, my emotions aren’t read by machines and used in ways I have no control over,” he said. “That’s really what we’re talking about fundamentally here.”