Updated: Jun 19, 2020
Facial recognition is everywhere. Set limits for yourself & your loved ones.
Over the past weeks, Americans have begun to come to terms with police brutality, especially as it targets Black, Indigenous, and people of color. Law enforcement’s culture has long needed a rethink; I would have loved to have been an investigator, a job that seemed to prize intuition and require building trust and relationships to solve complex puzzles. But after two years training federal law enforcement on tech, I realized two things: 1) my vision of policing doesn’t match reality; and 2) technology in the hands of people who saw good and bad as absolutes was dangerous.
Unchecked, unexamined uses of technology in policing result in false arrests, targeting of Black and brown communities, and longer sentences and delayed paroles, consequences that range from ruining lives to ending them. With such high stakes, policing technologies and the algorithms governing them should be automatically audited. Yet there is intentionally little transparency around the companies selling them, the requirements of their contracts, or their use by police forces. While several lawsuits are pending to remove barriers to information, it’s increasingly clear that police forces approach the communities they serve like an enemy force.
Facial recognition is an especially problematic tool that thousands of police forces use. It can be used in real time, say during protests, and after the fact, to identify would-be looters, rioters, or others deemed troublemakers. Further, facial recognition is prone to errors, especially for faces of color; as any BIPOC knows, mistaken identity can get you killed. Last week, IBM announced the end of its loss-leader facial recognition program, and Microsoft said it would not sell its facial recognition capabilities to police until the technology is regulated. Amazon, on track to win Most Despicable Tech Company in 2020, put a one-year moratorium on law enforcement sales of its software, which doesn’t apply 1) to forces already using its Rekognition software or 2) to federal law enforcement.
And it’s federal law enforcement that seems most likely to be using facial recognition software to identify protesters nationwide. A Customs and Border Protection unmanned drone was flown over protests in Minneapolis in May. The DEA was granted permission to “conduct covert surveillance” and collect intelligence on protesters, although this exceeds its own remit.
Put boundaries on how much of your moneymaker they can access in these four ways:
1) Avoid facial recognition where possible. After a 15-hour flight in January, I unwittingly used my face to get through customs. Shame on CBP; we were given no warning, and no way to opt out after the fact. Next time you travel (ha?), see if CBP or TSA are using the tech in your airports, or if your airline does. Be intentional about where you use your face for access; I unlock my iPhone with my face, and now Apple has my face mapped. Do I trust Apple’s security? Yes, more than I do TSA’s. Do I trust them indefinitely with a thing I can never truly change? Errrr...
2) Thwart the systems. Before your next protest, check out Adam Harvey’s work, make a mask (and make a statement!), or pair your regular COVID mask with some sunglasses or these bitchin’ glasses. Pin a photo of a face on your clothes, or print and wear someone else’s face.
3) Don’t post photos of others’ faces from protests or other sensitive situations without their permission, and use a blurring tool. Not only are police benefiting from your protest photos; so is ClearviewAI, a terrible tool that has scraped 3 billion images from across social media (yes, against those companies’ terms of service) and connects your mug to your name and profile even if you’re not tagged or have been untagged in photos. We can’t know whether that powerful tool is being used against protesters.
4) Fill them up with junk. If you can’t beat ‘em, crank up the noise to slow them down. On top of the thousands of police forces using automated license plate readers to map your movements, casinos, social benefits agencies, and schools and universities also use them to make sure gambling addicts, welfare recipients, or students are ‘following the rules.’ Blow them up with this license plate fabric. If ClearviewAI’s arrogance bothers you (and it should!), John Oliver’s got an idea for you.
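If you’re curious what the blurring tool in step 3 is actually doing, here’s a toy sketch in pure Python (not any particular tool’s code, and real tools also detect faces for you): it repeatedly averages each pixel in a chosen region with its neighbors until identifying detail is smeared beyond recovery.

```python
# Toy box blur on a grayscale pixel grid: each pass replaces every pixel in
# the selected region with the average of its 3x3 neighborhood. Real blurring
# tools do this (or a Gaussian variant) on actual image files, over the
# region where a face was detected.

def box_blur(pixels, region, passes=3):
    """Return a copy of `pixels` with the (top, left, bottom, right) region blurred."""
    top, left, bottom, right = region
    out = [row[:] for row in pixels]
    for _ in range(passes):
        src = [row[:] for row in out]  # snapshot so each pass reads stable values
        for y in range(top, bottom):
            for x in range(left, right):
                neighbors = [
                    src[ny][nx]
                    for ny in (y - 1, y, y + 1)
                    for nx in (x - 1, x, x + 1)
                    if 0 <= ny < len(src) and 0 <= nx < len(src[0])
                ]
                out[y][x] = sum(neighbors) // len(neighbors)
    return out

# A 4x4 "image" with one bright pixel standing in for identifying detail.
image = [[0, 0, 0, 0],
         [0, 255, 0, 0],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]

blurred = box_blur(image, region=(0, 0, 4, 4))
print(blurred[1][1])  # far dimmer than the original 255: the detail is smeared away
```

The important property for protest photos is that the averaging is destructive: once the passes run, no tool (yours or theirs) can reconstruct the original face from the blurred region, which is why blurring beats cropping or low-resolution exports.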
Protect yourself and others, and if you can’t, do some serious civil disobedience.
Updated June 19, 2020: It’s US Customs and Border Protection (CBP) that uses facial recognition via Global Entry at US ports of entry. However, TSA is expanding the use of facial recognition to match people to their IDs at security screening checkpoints.