Facial recognition: tech giants take a step back

Last week, several major tech companies announced that they will no longer provide facial recognition software to law enforcement. These companies are concerned about the ways in which their technology might be used.

What happens when software that threatens our privacy falls into the hands of organizations we no longer trust? In general, awareness of tracking software creates a feeling of being spied on, and with it a feeling of insecurity. That insecurity, spreading throughout society, is likely what prompted these companies to revise their strategy. Current developments have surely had an impact on an already strained social environment: a pandemic combined with worldwide protests is a mix we have never experienced before.

Definition of facial recognition

The definition of facial recognition, or “face recognition” as the Electronic Frontier Foundation (EFF) defines it, is:

A method of identifying or verifying the identity of an individual using their face. Face recognition systems can be used to identify people in photos, video, or in real-time.

Facial recognition is one of the few technologies where even laymen can understand how it could be used against citizens by a malevolent or untrustworthy government. Other methods, like social profiling and behavioral analysis, are more elusive and harder to comprehend.

In an earlier blog, we discussed the very different rules, laws, and regulations that exist around the world when it comes to facial recognition. Depending on the type of government and the state of the technology, the rules vary widely—or don't exist at all.

The bans by Amazon, IBM, and Microsoft, announced over the course of one week, were more or less directly aimed at US organizations, perhaps as a result of growing distrust of law enforcement agencies in general and of the behavior of some police departments in particular. But we can likely expect these bans to spread across the world. (And I think that is a good thing.) Laws have a tendency to follow developments in society, always trailing one step behind. In this case, however, the stakes look high enough to wait until the technology and the legislation can go hand in hand.

The companies

Microsoft halted the sale of facial recognition technology to law enforcement in the US, stating that the ban would remain in place until federal laws regulating the technology's use were enacted. In other words, it wants rules governing the use of the technology before it will provide it.

Amazon, potentially one of the biggest players in this space, has its own technology called Rekognition, which it licenses to businesses and law enforcement. Amazon had already announced a similar ban for much the same reason, letting the public know that it would require "stronger regulations to govern the ethical use of facial recognition technology."

IBM did not limit its ban to the US, but it did explain its motives in a letter to Congress. In the letter, the company wrote that it had no plans to market facial recognition software if it would be used "for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency."

Why we do not want facial recognition

Groups like the American Civil Liberties Union (ACLU) and the EFF have objected to this technology, as they consider it a breach of privacy to use biometrics to track and identify individuals without their consent. Many feel there is already more than enough technology out there tracking our behavior, preferences, and movements. That technology does not always know who we are down to the level of personally identifiable information (PII), but many people still get uneasy when they find out how well aware advertisers and shops are of their preferences, thanks to tracking of browsing habits and online purchases.

And some incidents certainly don’t help the case at all. For example, the Baltimore police department reportedly ran social media photos through face recognition to identify protesters and arrest them.

Another example of this technology being used for a purpose other than what it was intended for—and another possible reason for distrust—is that Minnesota police resorted to what they called "contact tracing" of demonstrators arrested after recent protests. But contact tracing is a public health effort to help stop the spread of disease, as in the COVID-19 outbreak. As it turns out, the Minnesota police are looking at it as a model for criminal investigations.

Facial recognition still has its limits

Another objection to facial recognition technology has always been its inaccuracy. There is a significant risk that facial recognition, as used in law enforcement, is unreliable.

Most facial recognition software relies on Artificial Intelligence (AI) and, more precisely, Machine Learning (ML). Where facial recognition relies on machine learning, the training data is often incomplete or unrepresentative of the general population. A study from the MIT Media Lab shows that facial recognition technology performs differently across genders and races. In cases where misidentification can lead to arrest or incarceration, we will surely want to avoid grave errors caused by false positives.
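To see why unrepresentative training data translates into unequal error rates, consider a minimal sketch. The similarity scores below are invented purely for illustration (they are not from any real system or the MIT study): if a group is underrepresented in training, the model's face embeddings for that group tend to be less distinctive, so pairs of *different* people from that group score as more similar. With one global match threshold, that group then suffers more false positives.

```python
# Hypothetical sketch: why a single match threshold can produce
# different false-positive rates across demographic groups.
# All scores below are invented for illustration, not real data.

def false_positive_rate(non_match_scores, threshold):
    """Fraction of non-matching face pairs scored at or above the
    threshold, i.e. pairs the system would wrongly declare a match."""
    false_positives = [s for s in non_match_scores if s >= threshold]
    return len(false_positives) / len(non_match_scores)

# Invented similarity scores (0-1) for pairs of *different* people.
# Group B stands in for a group underrepresented in training data,
# whose non-matching pairs score higher on average.
scores_group_a = [0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40, 0.45, 0.50, 0.55]
scores_group_b = [0.30, 0.35, 0.40, 0.45, 0.50, 0.55, 0.60, 0.65, 0.70, 0.75]

THRESHOLD = 0.6  # one global threshold applied to everyone

fpr_a = false_positive_rate(scores_group_a, THRESHOLD)
fpr_b = false_positive_rate(scores_group_b, THRESHOLD)

print(f"Group A false-positive rate: {fpr_a:.0%}")  # 0%
print(f"Group B false-positive rate: {fpr_b:.0%}")  # 40%
```

The point of the sketch: neither the algorithm nor the threshold is "biased" on its face, yet the downstream error burden falls unevenly—exactly the kind of disparity the MIT Media Lab study measured in deployed systems.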

Will we ever be ready for facial recognition to be used by law enforcement?

What surely needs to happen is for law enforcement to regain the trust of the public, and for laws regulating the use of facial recognition software to take effect that satisfy the demands of its manufacturers.

Whether that means we can lie back and rely on the forces at work to do the right thing is a whole other topic. A large part of humanity seems torn between "I have nothing to hide" and "they already know everything anyway." That is not a healthy situation, and the degree of unease largely depends on which country you happen to live in and many other circumstances beyond your control.

So, even though the odds of facial recognition being widely used by law enforcement appear to have diminished in the US for now, this remains a topic to keep an eye on if you value your privacy.


Pieter Arntz

Malware Intelligence Researcher

Was a Microsoft MVP in consumer security for 12 years running. Can speak four languages. Smells of rich mahogany and leather-bound books.