ICE's Use Of Facial Recognition Tech Could Threaten Women Of Color

Face recognition tech is notoriously inaccurate when it comes to ID'ing people of color. So what might ICE's use of the tech mean?
Posted at 7:27 PM, Jul 10, 2019

Although they weren't present, both the FBI and ICE came under fire at a bipartisan hearing this week about the ethics of using facial recognition technology in their work. The Washington Post recently reported that ICE has been secretly using the tool to comb through driver's license photos in order to target undocumented immigrants in states where they can legally obtain licenses.

Lawmakers take issue with the violation of citizens' privacy and want more regulation, while activists want the technology banned altogether. Beyond the privacy concerns, which apply to everyone, one of the main arguments against the technology is that it has been found to misidentify people of color, especially women.

"Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I've got a mask. Can you see my mask?" 

Joy Buolamwini is a computer scientist at MIT who found that facial recognition technology misclassified the gender of darker-skinned women nearly 35% of the time. For white men, it was inaccurate less than 1% of the time.

Buolamwini: "When we analyze … by subgroups, we found all companies performed worse on darker females."

The ACLU also ran its own test of Amazon's face recognition technology and found it mistakenly matched members of Congress with mugshots of people who had been arrested. The false matches were disproportionately people of color.

Suresh Venkatasubramanian is a computer science professor at the University of Utah. He says: "So what you're going to get is, for example, a system that's trained to do facial recognition is likely to have more of a false positive on minority groups, which means more people are going to be caught up in dragnets for no reason because the system flags them as a false positive. And that's going to be a big problem."
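
To make the scale of that concern concrete, here is a minimal back-of-the-envelope sketch in Python. The population sizes and false-positive rates below are invented purely for illustration; they are not measurements of any real system, including the ones discussed in this story.

```python
# Back-of-the-envelope arithmetic for the dragnet concern described above.
# All figures are hypothetical illustrations, not audited rates for any real system.

def expected_false_positives(population: int, false_positive_rate: float) -> int:
    """Expected number of innocent people wrongly flagged in one search sweep."""
    return round(population * false_positive_rate)

# Two equally sized groups of license-photo holders, scanned by a system whose
# false-positive rate differs by group (made-up numbers for illustration).
groups = {
    "group with lower error rate": (1_000_000, 0.001),   # 0.1% false positives
    "group with higher error rate": (1_000_000, 0.005),  # 0.5% false positives
}

for name, (population, fpr) in groups.items():
    flagged = expected_false_positives(population, fpr)
    print(f"{name}: ~{flagged:,} innocent people flagged per sweep")

# Output: ~1,000 vs. ~5,000 wrongful flags. Both rates look small in isolation,
# but the higher-error group absorbs five times the wrongful scrutiny, and the
# gap compounds with every additional search run against the database.
```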

While we don't yet know how ICE is using information collected through facial recognition, its use raises concerns for the vulnerable populations the agency interacts with. For example, the number of pregnant women held in ICE custody, often in inhumane conditions, increased during President Trump's administration, and as many as 28 of them miscarried. Activists say ICE's reliance on a flawed facial recognition system poses dangers beyond misidentification.

We reached out to ICE to ask how it's using the technology, but the agency declined to provide details:

"Due to law-enforcement sensitivities, ICE will not comment on investigative techniques, tactics or tools."

At the hearing, Republican Representative Michael McCaul said the tool has "really protected the nation" from security threats. 

While Buolamwini says the tech's accuracy can be improved, Venkatasubramanian says the larger question is whether we should even adopt such systems considering the threats they pose to marginalized people.  

He said: "Historically, we know that whenever a new technology for surveillance comes up, it's always the underprivileged, the minorities and the poor, who are the ones who get surveilled the most, and the people who have wealth and privilege have ways of escaping that surveillance."