
In conversation: Bruce Schneier on AI-powered mass spying

For decades, governments and companies have surveilled the conversations, movements, and behavior of the public.

And then the internet came along and made that a whole lot easier.  

Today, search engines collect our queries, browsers collect our device information, smartphones collect our locations, social media platforms collect our conversations with one another, and, depending on the country, governments either collect that same information from the companies that maintain it, or they gather it directly themselves by monitoring their people.

That’s a lot of data that, until now, has been difficult to parse at scale.

But cryptographer and computer security professional Bruce Schneier believes that’s going to change, all because of the advent of artificial intelligence.

Already equipped with reams of data, governments and companies will now be able to ask AI models to draw conclusions from that data, Schneier said, supercharging the human work of inquiry with the limitless scale of AI.

“Spying is limited by the need for human labor,” Schneier wrote. “AI is about to change that.”

In January, Senior Privacy Advocate David Ruiz (and host of the Lock and Code podcast) spoke with Schneier about the future of AI-enabled mass spying—the implications, the responses from the public, and whether there’s any hope in dismantling a system so deeply embedded in the modern world.

Below is an edited transcript of their conversation.

DAVID RUIZ: We know that mass surveillance has this “Collect it all” mentality—of the NSA, obviously, but also from companies that gather clicks and shares and locations and app downloads and all of that.

What is the mass spying equivalent of that?

BRUCE SCHNEIER: So, it’s “Collect it all,” but it’s “Collect different things.”

Let’s step back here.

We kind of know what mass surveillance, mass spying looks like. It’s the kind of thing we saw in former East Germany, where like 10 percent of the population spied on 90 percent of the population. And it was incredibly manpower intensive.

Now, we moved into a world of automatic mass surveillance with the rise of the internet and the rise of cheap data storage and processing. This was maybe 20 years ago, when companies and the NSA started collecting mass surveillance data, and that is metadata—the kind of data that Snowden talked about—data about data. Data from your phone, from your browser, who you spoke to, where you went, what you spent money on, what websites you visited—that’s mass surveillance data.

Spying, the difference I’m making, is about conversations.

And while computers were easily able to interpret location data and data about who you spoke to—your email “from” and “to” lines—they weren’t able to interpret what you said. Voice conversations, text conversations, email conversations—they could do keyword searches, and if you think about it, Chinese censorship is based pretty naively on keyword searches. [For instance,] you say the bad word, you get censored. That’s still very primitive.

Spying—listening in on a conversation, knowing what people are saying—required human beings, and there weren’t enough humans to listen to everything. So, you had this automatic surveillance: We can know where everybody is, that’s easy.  But knowing what everybody’s saying, we still couldn’t do—we didn’t have the people.

As AI becomes increasingly capable of understanding and even having conversations, it can start doing the role that people used to do and engage in this kind of spying at a mass level.

Now, the question you asked that started me on this monologue is: What does that look like? And the answer is: We have no idea.

We kind of know what East Germany looked like, or the former Soviet Union. We do know what some of these countries look like. But I don’t think it’s the same. That wasn’t this kind of mass spying by corporations, or by democratic governments. So, we don’t know. But it’s worth at least thinking about.

DAVID RUIZ: When I was trying to answer my own question, I feel like one of the potential futures here is, instead of “Collect it all,” it’s this attempt at “Know it all.”

BRUCE SCHNEIER: It always was “Know it all,” but now it becomes more possible.

DAVID RUIZ: Yeah, and I was hypothesizing that there might be companies that say “We have created a package of questions that we use on AIs that we know and trust to discover these new insights,” and [they] sell these packages to companies that are trying to find out XYZ and ABC about their customers, or governments about their citizens.

And that already sounds awful. It sounds both boring and dreadful, which is what I think surveillance is today—it is boring operationally in terms of [how] we don’t see data brokers. We don’t see what gets collected every single day behind the interface of the web. But it is used so much to power the direction of the web.

BRUCE SCHNEIER: And my guess is that’s right. And it’s not actually boring. Certainly, the companies wanted you to think that. It is very hidden. Data brokers spent a lot of effort making sure that there’s no visibility into their industry, so we don’t know the data being collected on us.

But remember, this is still all surveillance data. It’s not the contents of your phone calls, it’s the billing information. It’s who you called, what time, how long you spoke, that kind of information. Maybe the location of your two phones when you were talking. This brings it up to another level. 

DAVID RUIZ: You mentioned that this is about conversations—the content of communications. And I wanted to address that immediately because I think whenever spying is discussed, a lot of folks picture someone rifling through text messages or emails. And I think some people might balk at that. They might reject the premise based on things like “Oh, Apple can’t look at my iMessages, and Google said it wouldn’t advertise based on email content some years ago, and my phone provider, I don’t think, is recording my phone calls.”

So, my question after those kinds of claims, is: Is that too narrow a lens to understand this future of mass spying? What am I missing when I say those things to calm myself down?

BRUCE SCHNEIER: I think it’s all economics. Google did realize that eavesdropping on the contents of your Gmail wasn’t terribly useful. So, they didn’t do it. Of course, they made a virtue out of it. That stuff will change.

We’re already seeing [the questions of] “Is this company using your data to train their AI? Will it be used to influence how an AI interacts with you?” [And] already we can see that the tones of voice we use with these AI chatbots are mirrored. They say they’re not storing it and using it for training—we don’t know the answer.

As this stuff becomes more valuable, as companies see a business interest in using it, of course it’s going to be used. It’s not going to be a world where we [can] rely on the goodness of the hearts of for-profit corporations not to abuse our privacy.

So, it’s a matter of how cheap it is. Is it going to become cheap for Google to eavesdrop on the contents of everyone’s email conversations, or for Zoom to eavesdrop on all meetings? They currently say in their terms of service that they’re not [monitoring meetings], and they’ll probably [receive] pushback if they change that—they already got pushback because there was a thought that Zoom would be using your information to train an AI. [It] turned out not to be true, but there was a little panic there. But how long does that last?

We saw this with Facebook and their surveillance over the years and decades.

They did a new thing, there was pushback, there was an outcry, and after a couple of months, it was the new normal. So, I don’t know where this is going to go, but the tech is moving to the point where this stuff is possible and, in fact, easy.

DAVID RUIZ: We talked about the business case for [AI-enabled mass spying] and I wanted to ask more broadly: Who will engage in AI mass spying?

BRUCE SCHNEIER: The answer is going to be everybody who can.

Who are the major players here? Certainly there are corporations who are doing it for manipulation purposes. Surveillance is the business model of the internet because advertising is the business model of the internet, and advertising is convincing you to do something that you might not have done otherwise.

So, this surveillance-based manipulation is the business model, and anything that gives a company an advantage, they’re going to do. We [already] saw that with personalized advertising [which is] based on characteristics that you might not be happy sharing.

So, companies will do that.

Now, in the West, we tend not to have governments do this on their own citizens—they rely on corporations. The US government doesn’t spy on us directly, they get the spying data from corporations who do that to everybody.

You go to other countries, more totalitarian countries, and governments do engage in surveillance and will engage in spying. Here, I’m thinking about China, but other countries as well. So, very much think of it in terms of power:

Those that have the power will engage in these behaviors because it magnifies their power. That’s no surprise.

DAVID RUIZ: There are a lot of times when we could think “Why would a company do this? Why would a government do this?” And there are a lot of alleged reasons. The NSA will say that it collects as much data as possible for national security reasons, but they’ve been saying that for decades, and the national security reason seems to change every few years, right? There’s a “national security risk du jour.”

And it’s much easier and simpler to think of it as those that have power hope to keep that power and amass more of it.

I wanted to separately ask: We talked about corporations. We talked about governments. What about people? I have access to AI tools in a way that I don’t have access to data collection regimes.

So, are people going to spy on people?

BRUCE SCHNEIER:  People are already spying on people. It’s more one-on-one. There’s an entire industry of super creepy spyware that is sold to people who want to spy on their wives and girlfriends. This is kind of a gross industry, but it exists and it is used in many countries.

So, yes. But here again, the power matters, and it’s generally the more powerful spying on the less powerful in a relationship. It’s generally the man spying on the woman because that’s the power imbalance.

When you think about bulk surveillance, it’s access to the data.

I really can’t spy on my neighborhood. Sure, I can set up a camera in front of my house and watch what’s going on in my street, but that’s all I can do. I don’t have the power to put cameras everywhere. I don’t have the power to get your email or your Zoom stream.

So, I think you are going to see some use of these technologies by people who are not traditionally powerful. But in general, like all these technologies, they benefit the powerful more than the less powerful.

DAVID RUIZ: We know that people act differently when they know they’re being surveilled—will that also apply to mass spying?

BRUCE SCHNEIER: Certainly, people in the United States have lived under this kind of regime post-9/11. Lots of Muslims lived in a world where they were being spied on a lot.

Lots of people—Jon Penney comes to mind, there are others—have researched how the feeling that you’re under constant surveillance—or spying, they’re both the same here—leads to self-censorship.

If you think you are being watched all the time, you behave differently. We do know that there’s an enormous chilling effect on how you behave, what you do, on your conformity. And if you think you’re being watched all the time, you tend to conform. You’re not going to do something different. You’re not going to stand out.

This seems really bad for society because that’s how society innovates. A world where people don’t try new things is a world that stagnates.

To take an easy example, about 10 years ago, I forget the year, gay marriage became legal in the United States, in all 50 states. It became the law of the land and that change was the result of a multi-decade process. For a while it was illegal and immoral, then it was illegal and moral, and then it became legal and moral.  And in order for that to happen—in order for that whole progression—somebody way back in the beginning had to try gay sex once and say: “You know, that wasn’t so bad. That was kind of fun.”

And the reason they were able to do that is that they weren’t being watched. They could do it in the privacy of their own bedroom and no one could stop them.

If you live in a world where, whether it’s gay sex or marijuana or whatever it is that becomes the mainstream moral norm over several generations, if you can’t try it, if there can’t be a counterculture of doing it, it never will become the social norm. If everybody who tried pot in the ‘50s was immediately discovered and arrested, you’d never get to legalization. You never would get a generation of people that would say, “You know, that’s not that bad. Why are we criminalizing this kind of not harmful drug?”

DAVID RUIZ: In the current environment we live in, where so much surveillance happens every day—again, from both corporations and governments—it doesn’t seem like there’s any way to dismantle it. And it feels like the potential of mass spying to produce even deeper insights means that we’ve lost the battle on surveillance. Have we lost that battle?

BRUCE SCHNEIER: I never think it’s too late, and that kind of fatalism doesn’t make sense when you look at history. It’s like saying in the 1200s, “Well, we tried to fight against monarchy. I guess that didn’t work. It’s too late.” Or centuries later, “You know, we tried to fight against slavery, that didn’t work.” Or, “You know, we’ll never give women the vote. That ship has sailed.”

We, as a species, regularly make our society more moral, more ethical, more egalitarian. It’s slow, it’s bursty, but decade over decade, century over century, we are improving.

So, no, I don’t believe that, from now until the end of our species, the level of surveillance we see today can never be rolled back. I think that is ridiculous.

We no longer send five-year-olds up chimneys to clean them. We don’t do that. We changed. We no longer allow companies to sell pajamas that catch on fire. We changed. We can do that here.

Like the other big things, like monarchy, like slavery, like the patriarchy, these things are going to be hard to dismantle, but they are dismantlable.

Near term, I think you’re right. Near term, both companies and governments are just punch-drunk on our data, and they’re not going to give it up. But long term, lots of things are possible and will happen. 

DAVID RUIZ: I mean this as a high compliment: You are the most optimistic guest we’ve had on the podcast.

BRUCE SCHNEIER: I’m near-term pessimistic and long-term optimistic. Near term, I think we’re screwed. The tech monopolies are so powerful, and we saw that with social media. Both Republicans and Democrats agreed—this had never happened before—that Facebook was harming our society. In different ways, but it’s harming society. [They] called Zuckerberg in, called the other companies in, yelled at them, [said] something must be done, and nothing was done.

That is sheer lobbying power right there in operation.

So, near term, I don’t see any solution, but our species has handled harder problems than this. This won’t be the one that stumps us.

DAVID RUIZ: What do we do at this point? 

BRUCE SCHNEIER: I want this to be a political issue. This stuff changes when it becomes an issue that voters care about. If there is a debate question on this, if this becomes something that politicians are asked about, then change will happen, right? If it isn’t, then it is really just the lobbyists that get to decide what happens.

[The answer to] “What should we do?” is: agitate for change. Make this political, make this something that politicians can’t ignore.

Where change is happening is the EU. You have listeners in the EU, and they will know that things are happening there. Right now, Europe is the regulatory superpower on the planet. They are the jurisdiction where we got a comprehensive data privacy law, where they are passing an AI security law, stuff that you would never see in the United States.

So, look outside the US right now, but make this political. That’s how we’re going to make it better.

But we’re fighting uphill. It’s very hard in the United States to enact policies that the money doesn’t want. Money gets its way in US policy. And the money wants this.

DAVID RUIZ: And I think disentangling money from politics in the United States is a different [conversation], and unfortunately we don’t have the time for it.

BRUCE SCHNEIER: No, surely you can solve that in half an hour.

DAVID RUIZ: Actually, are you booked for the next half hour?

BRUCE SCHNEIER: If I were able to solve that, I wouldn’t be doing what I’m doing now, because, you’re right, that might not be the most important problem, but as Professor Larry Lessig said, it’s the first problem. It’s a problem we need to solve in order to solve every other problem.

ABOUT THE AUTHOR

David Ruiz

Pro-privacy, pro-security writer. Former journalist turned advocate turned cybersecurity defender. Still a little bit of each. Failing book club member.