Amazon pushed Immigration and Customs Enforcement officials in June to purchase and use its controversial facial recognition technology, according to Amazon Web Services emails obtained by a government watchdog.
Documents turned over in response to a Freedom of Information Act request filed by the Project On Government Oversight show Amazon’s cloud-computing subsidiary offered up its real-time “Rekognition” video analytics software to aid ICE’s Homeland Security Investigations unit.
Amazon touts the technology on its website as capable of providing “real-time face recognition across tens of millions of faces, and detection of up to 100 faces in challenging crowded photos.”
Amazon Web Services and Department of Homeland Security officials also met in mid-June at the Redwood City, Calif., offices of McKinsey & Company, a consulting firm that formerly worked with ICE.
It’s unclear how, exactly, Amazon proposed the technology be implemented. The email from Amazon Web Services only vaguely refers to “a big HSI problem.” Amazon did not immediately respond to a request for comment.
In an emailed statement to HuffPost, ICE spokesman Matthew Bourke said the agency doesn’t currently have a contract with Amazon for Rekognition, but noted it’s “fairly standard” for the agency to evaluate how any emerging technology might help its mission.
Bourke confirmed ICE has used facial recognition in the past “to assist during the course of criminal investigations related to fraudulent activities, identity theft and child exploitation crimes.”
In May, inspired by an American Civil Liberties Union report on the technology, a group of concerned Amazon employees sent CEO Jeff Bezos a public letter asking that the company stop providing facial recognition services to police departments and other government agencies.
“We already know that in the midst of historic militarization of police, renewed targeting of Black activists, and the growth of a federal deportation force currently engaged in human rights abuses — this will be another powerful tool for the surveillance state, and ultimately serve to harm the most marginalized,” the employees wrote.
“Our company should not be in the surveillance business; we should not be in the policing business,” they added. “We should not be in the business of supporting those who monitor and oppress marginalized populations.”
A follow-up investigation by the ACLU in July found the software has troubling accuracy problems, with false matches disproportionately misidentifying people of color. To illustrate the point, the ACLU used the software to screen photos of members of Congress against a database of 25,000 mugshots; 28 members of Congress were incorrectly matched to mugshots, with the results clearly skewing along racial lines.
“Nearly 40 percent of Rekognition’s false matches in our test were of people of color,” the ACLU found, “even though they make up only 20 percent of Congress.”