Police Facial Recognition Databases Log About Half Of Americans
RACHEL MARTIN, HOST:
If you've ever wondered how much the government knows about you, here's one more thing to consider - about half of all American adult faces are now searchable in police databases, and some law enforcement agencies can identify people on video surveillance in real-time. This is according to the Center on Privacy and Technology at Georgetown Law School. Alvaro Bedoya co-authored this new report, and he joins us now in our studios in Washington. Thanks for coming in.
ALVARO BEDOYA: Thanks for having me.
MARTIN: So explain how this works. What does it take for an ordinary citizen to end up in one of these databases?
BEDOYA: All it takes is to have a driver's license photo taken. This is a profound shift in the way we police society. So unless you've committed a crime, in general, you're not going to be in a criminal fingerprint database. You're not going to be in a criminal DNA database. Yet by simply standing for a driver's license photo, 26 states enroll you in basically a virtual lineup just like in the movies, except it's not a human being pointing to the suspect. It's an algorithm.
MARTIN: So this is going to come as a surprise to many people who didn't know this. Just by getting a driver's license in 26 states...
BEDOYA: That's right.
MARTIN: ...You're saying that that image is then put into a police database that they then cross-reference when a crime happens.
BEDOYA: So some of them are put into a police database. Others remain in the DMV database, but the police can either directly run searches or request them. There are no audits. There's little transparency. And so simply by being a citizen who drives, you are subject to thousands of warrantless searches. And there are no laws that control any of this.
MARTIN: How is it effective? Because clearly, law enforcement believes that this is a tool that's helpful to them.
BEDOYA: That's right. We agree that in some cases it is helpful. But we don't know, for example, how many mistakes it makes. A couple years ago, the FBI tested its own database, and it found that one out of seven times you searched that database, it returned a list of two to 50 innocent faces. And those mistakes aren't evenly distributed throughout the population. Another study showed that several prominent face recognition algorithms were less accurate on African-American, female and young faces.
So they come down to things like women wearing cosmetics, young people's faces changing more and being smoother than older persons' faces. And for people with darker skin tones, the distinguishing features of their faces are harder for the computer to pick out.
MARTIN: Can you give me a sense of tactically how it's used? Like, what's a situation where it would be...
BEDOYA: Usually it's used - a police officer stops someone in the street, the person doesn't give their name. Or the police officer just wants to figure out who they are, so they take a photo of that person. You know, a police officer can take a photo of a peaceful protest. This was done in Baltimore to identify protesters in the Freddie Gray protests.
What L.A. is doing is the future of this technology. They are scanning every face that passes in front of a surveillance camera and, in real-time, comparing that face to a watch list. And it's not just L.A. that's doing this. West Virginia has bought the technology. Dallas has plans to deploy the technology.
I want this technology to be used to catch terrorists. I don't want it to be used to catch jaywalkers. And so ultimately, it is up to the people in these communities, to legislators and to regular citizens, to take action if they want to stop this because - you know, at the end of the day, one of the things the report says is police officers are just using every tool at their disposal, right? It's our job to put that under control. And it's not right now.
MARTIN: Alvaro Bedoya of the Center on Privacy and Technology at Georgetown Law School. Thanks so much for talking with us.
BEDOYA: Thanks for having me.
Transcript provided by NPR, Copyright NPR.