Facial recognition tech is in the news again after the FBI discovered the identity of one of the Capitol rioters by using facial recognition software on his girlfriend’s Instagram posts. It may sound scary and invasive, but in truth, what’s happening isn’t particularly new. In this case, we have what’s fast becoming a fairly standard tale of tracking people down via online imagery. Sometimes there’s cause for concern even without the latest tech providing some sort of flashpoint.
After the Capitol riots following the US election, those responsible were gradually arrested over several weeks of searching and identifying. The Verge story mentions that, in this effort, law enforcement made use of “facial recognition tools” to track down people associated with the event. The tool apparently led investigators to the Instagram feed of a suspect’s girlfriend. It was a short step from there to matching his clothes with images from the Capitol riot.
Everything unravelled for the suspect quickly. Facebook accounts revealed his name, and his state driving licence records led investigators to his identity, workplace, and home.
We’ve covered facial recognition on the blog many times. Most concerns tend to focus on the potential for abuse by repressive governments, and on law enforcement overreach. It’s such a concern that tech giants regularly dip in, and then quickly dip out when public opinion turns.
I don’t think many people will complain if facial recognition is used to help identify the people at the Capitol riots. Organisations find new ways to secure their sites with facial recognition and biometrics on a daily basis. You may or may not object to your bank combining facial recognition with AI software. These are potentially useful applications of this technology. Even so, we need to know what we’re dealing with for this story.
When pop culture and cold hard reality collide
Facial recognition is very much one of those technologies made a cliché for all time by film and television: the camera zooms in from orbit and picks up the target in seconds, the operator can tell where the suspect bought his suit by enhancing the fibres on his jacket, and so on.
The reality here is, “some people used a program to play mix and match with publicly available photographs”. The end result is still impressive, but CSI: Cyber this is not.
Impressive, but not CSI: Cyber
How does this work, then? Well, the article mentions “open source facial recognition tools”. The affidavit doesn’t say which tool, because law enforcement doesn’t want to give perpetrators clues for avoiding the long arm of the law. You can see some of the more popular tools available here, if you’re interested in learning more or giving them a go.
Otherwise, there are plenty of other ways to match images against the raft of material floating around online. TinEye is a dedicated online tool for matching images, and Google, Bing, and Yandex all offer reverse image search of their own. A little sleuthing and familiarity with OSINT practices can go a long way.
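To give a rough feel for how automated image matching can work under the hood, here’s a toy Python sketch of “average hashing”, one simple perceptual-hashing idea sometimes used for near-duplicate detection. To be clear: this is purely illustrative, not how TinEye or the search engines actually work, and the tiny hand-made “images” are made-up values for the example.

```python
# Toy sketch of perceptual "average hashing" for near-duplicate image
# matching. Each image is represented here as a small grid of grayscale
# pixel values (0-255); real tools first resize photos down to a grid
# like this before hashing.

def average_hash(pixels):
    """Return a bit per pixel: 1 if the pixel is brighter than the
    image's mean brightness, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests similar images."""
    return sum(a != b for a, b in zip(h1, h2))

# Made-up 2x2 "images" for illustration only.
original = [[10, 200], [190, 20]]
recompressed = [[12, 198], [185, 25]]   # slightly altered copy
unrelated = [[200, 10], [20, 190]]      # different picture

h0 = average_hash(original)
print(hamming_distance(h0, average_hash(recompressed)))  # 0: near-duplicate
print(hamming_distance(h0, average_hash(unrelated)))     # 4: no match
```

The point of hashing this way is that small edits (recompression, minor colour shifts) leave most bits unchanged, so copies of the same photo land close together even when the files differ byte-for-byte.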
A sliding scale of “that’s impressive”
One of the best examples of this happened just recently, with a lost hiker pinpointed via a photograph. To me, this is significantly more impressive than digging a fairly distinctive individual out from a never-ending pile of selfies and readily available data on popular image sharing websites. As a result, I’d say this one is interesting, but definitely nothing new. Crowdsourcing also has a history of going horribly wrong, and the infamous Reddit Boston Bombing debacle is as good a place to drop this warning as any.
We’ll definitely see more of these stories in the near future, but I wouldn’t necessarily start panicking about this branch of open source sleuthing just yet.