In a nutshell, face recognition technology is inherently racist: the darker the skin, the worse the results. And its job isn’t to report that what it sees is unknown, but that what it sees is an x-percent match. This leads to corralling the wrong people and forcing them to prove they weren’t there – proving innocence, possibly from inside a jail cell – against a mountain of “evidence”. The darker the skin, the more often this happens – it’s a white supremacist’s dream technology.
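To make that failure mode concrete, here is a minimal sketch – hypothetical names and scores, not any vendor’s actual API – of a matcher that always reports its best x-percent guess instead of ever saying “I don’t know”:

```python
# Hypothetical sketch: a matcher with no "unknown" outcome.
# It always returns the best-scoring entry in its database with a
# confidence percentage, even when the right answer is "this person
# isn't in the database at all".

def best_match(probe_scores):
    """probe_scores: dict of candidate name -> similarity (0.0-1.0)."""
    name = max(probe_scores, key=probe_scores.get)
    return name, round(probe_scores[name] * 100)

# The person on camera is actually nobody in the database, but the
# system still emits a "match" with a percentage attached.
scores = {"Alice": 0.41, "Bob": 0.38, "Carol": 0.57}
name, confidence = best_match(scores)
print(f"{name}: {confidence}% match")  # Carol: 57% match
```

Unless someone deliberately bolts a rejection threshold on top, the weakest match in the database still comes out looking like a lead.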
Any technology that works differently for one person than for another is problematic. Implementing such technology nationwide against the will of the people – especially without their knowledge – is catastrophic.
Also, “the Internet doesn’t forget”: all recorded data stays there, and over the years it’s always possible to find someone guilty of something – be it jaywalking that one day at lunch, or “running a red light” that was actually yellow, when slamming on your brakes would have sent you skidding into an accident. Give it a few years and they’ll be able to find something against anybody in the country, and even if it’s frivolous, it’s up to you to prove otherwise. What were you doing on January 15th, 2015 at 3:19pm? What was the context?
Consider all the HALO cameras in use in, say, Denver:
What’s the problem – they look useful, yes? Matching these cameras with facial recognition is not only dangerous but twisted. Normally, to quote the video, HALO cameras “are forbidden from monitoring [areas expected to be private, such as businesses or homes] unless there is a crime investigation in progress”. This makes sense: Crime –> Investigation –> Footage –> Suspect. A crime prompts an investigation, which uncovers the perpetrator(s).
Using facial recognition, they start with people, not crimes, which twists the process into People –> Suspect –> Match to crime. Now everyone in an area is identified and, if a correlation turns up, matched to a crime.
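The inversion can be sketched as two query directions over the same footage log – hypothetical records and field names, just to show the shape of each query:

```python
# Hypothetical footage log: each record is (timestamp, location, person_seen).
footage = [
    ("2023-05-01 14:00", "5th & Main", "person_A"),
    ("2023-05-01 14:05", "5th & Main", "person_B"),
    ("2023-05-02 09:30", "Oak Plaza",  "person_A"),
]

# Crime -> Investigation -> Footage -> Suspect:
# start from a known crime scene and time, then see who was there.
def crime_first(footage, location, timestamp):
    return [p for (t, loc, p) in footage if loc == location and t == timestamp]

# People -> Suspect -> Match to crime:
# start from a person, then trawl for everything they can be tied to.
def person_first(footage, person):
    return [(t, loc) for (t, loc, p) in footage if p == person]

print(crime_first(footage, "5th & Main", "2023-05-01 14:00"))  # ['person_A']
print(person_first(footage, "person_A"))  # every sighting of person_A, ever
```

The first query is scoped by a crime; the second is scoped only by a person, which is why it amounts to a standing dossier on everyone the cameras can identify.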
The Fifth Amendment to the Constitution says we cannot be compelled to bear witness against ourselves: it places the burden on the prosecution. With these technologies working together, the burden shifts to the defense, which must prove innocence against a mountain of potentially flawed data saying the person is guilty. The darker the skin, the more flawed that data is and the harder it is to prove innocence.
That is the subtler problem – never mind the obvious one of the surveillance all citizens now live under.
Update – fortunately the ACLU is already all over this, and Amazon has changed its target market and surrounding verbiage as a result, but it isn’t stopping sales to law enforcement:
Highlights from the article include:
- It’s already been implemented in some locations for a year. We’ve reached the point, with secret government and secret laws, where law enforcement keeps its methods from the public “so that the bad guys don’t know what they can do”, forgetting that their means and methods need to be approved by the people they are sworn to protect.
- The system works by comparing what the cameras see to a known database, so law enforcement has to populate that database. They have done so “legally” and from public sources – in other words, Facebook!
- Most of the usage is innocuous (“We’re only looking for the sleeping and deceased”) but some isn’t (using it for “leads for possible witnesses and accomplices”) – and while that may be true, if there’s a mismatch they wouldn’t know, and the defendant would have to prove the system wrong.
- “The Sheriff’s Office does not use the technology for mass and/or real time surveillance. In fact, state law and our policy prohibits it for such use” – except that they’re looking for “leads for possible witnesses”, and they can’t do that without surveillance at some level.
Defeating facial recognition is getting increasingly difficult, but people are working the problem:
- Shades and a bandanna aren’t good enough; you have to attack the tech: https://www.newscientist.com/article/2146703-even-a-mask-wont-hide-you-from-the-latest-face-recognition-tech/
- Technology-tricking glasses you can print at home: https://www.theverge.com/2016/11/3/13507542/facial-recognition-glasses-trick-impersonate-fool
- Other anti-facial-recognition glasses are available or in the works: https://www.fastcompany.com/3050252/yes-anti-facial-recognition-glasses-are-coming
- Make-up and hairstyles have to be pretty radical, but they’re a good summertime method because masks are so incredibly obvious.
- Mess with the shading and contours – that’s what the algorithms key on. This kind of mask would work because it adds light and dark contours (image from www.piratesofpowder.com):