Last month, US Attorney General William Barr resurrected a government appeal to technology companies: Provide law enforcement with an infallible, “secure” method to access, unscramble, and read encrypted data stored on devices and sent across secure messaging services.
In more accurate, if unspoken, terms, Barr asked technology companies to develop encryption backdoors into their own services and products. Refusing to endorse any single implementation strategy, the Attorney General instead put the responsibility on cybersecurity researchers and technologists.
“We are confident that there are technical solutions that will allow lawful access to encrypted data and communications by law enforcement without materially weakening the security provided by encryption,” Attorney General Barr said.
Cybersecurity researchers, to put it lightly, disagreed. To many, the idea of installing backdoors into encryption is antithetical to encryption’s very purpose—security.
Matt Blaze, cybersecurity researcher and University of Pennsylvania Distributed Systems Lab director, pushed back against the Attorney General’s remarks.
“As someone who’s been working on securing the ‘net for going on three decades now, having to repeatedly engage with this ‘why can’t you just weaken the one tool you have that actually works’ nonsense is utterly exhausting,” Blaze wrote on Twitter. He continued:
“And yes, I understand why law enforcement wants this. They have real, important problems too, and a magic decryption wand would surely help them if one could exist. But so would time travel, teleportation, and invisibility cloaks. Let’s stick to the reality-based world.”
Blaze was joined by a chorus of other cybersecurity researchers online, including Johns Hopkins University associate professor Matthew Green, who said plainly: “there is no safe backdoor solution on the table.”
The problem with backdoors is well known: any alternate access channel built for one party will eventually be discovered, accessed, and abused by another. Cybersecurity researchers have argued for years that, when it comes to encryption technology, the risk of weakening the security of countless individuals is too high.
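That argument can be made concrete with a toy sketch. In the hypothetical key-escrow scheme below (the names and the XOR "cipher" are illustrative, not any real system), every user's key is also stored wrapped under a single escrow master key, so whoever obtains that one master key, lawfully or not, can read every user's data:

```python
import os

def xor(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" for illustration only -- not secure encryption.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

MASTER_KEY = os.urandom(16)  # the escrowed "backdoor" key

escrow_db = {}  # user -> copy of their key, wrapped with the master key

def provision_user(user: str) -> bytes:
    key = os.urandom(16)
    escrow_db[user] = xor(key, MASTER_KEY)  # escrow a wrapped copy
    return key

# Alice encrypts a message with her own key.
alice_key = provision_user("alice")
ciphertext = xor(b"meet at noon", alice_key)

# Anyone holding the master key -- police or a thief -- can unwrap
# Alice's key from escrow and decrypt, without ever asking Alice.
recovered_key = xor(escrow_db["alice"], MASTER_KEY)
print(xor(ciphertext, recovered_key))  # b'meet at noon'
```

The point of the sketch is structural: the master key is a single value whose compromise defeats every user's encryption at once, which is exactly the failure mode researchers warn about.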
In 2014, Apple set a new standard for consumer privacy. With the launch of its iOS 8 mobile operating system that year, the company gave up its own ability to access the encrypted data stored on its customers' devices. Without the passcode to a device's lock screen, Apple simply could not read the contents of the device.
“On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode,” the company said.
The same standard holds today for iOS devices, including the latest iPhone models. Data that lives on a device is encrypted by default, and any attempts to access that data require the device’s passcode. For Android devices, most users can choose to encrypt their locally-stored data, but the feature is not turned on by default.
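Apple's actual key hierarchy is more elaborate (it also entangles the passcode with a hardware-bound key), but the core idea, deriving the encryption key from the passcode so that the vendor never holds it, can be sketched with a standard key-derivation function:

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # PBKDF2 stretches a short passcode into a 256-bit key; a high
    # iteration count slows down brute-force passcode guessing.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

salt = os.urandom(16)  # stored on the device; not secret
key = derive_key("1234-5678", salt)

# Only the exact passcode reproduces the key; a vendor holding
# neither the passcode nor the key cannot decrypt the data.
assert derive_key("1234-5678", salt) == key
assert derive_key("1234-5679", salt) != key
```

This is why "just unlock it for us" is not a switch the vendor can flip: without the passcode, the decryption key never exists anywhere.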
Within two years of the iOS 8 launch, Apple had a fight on its hands.
Following the 2015 terrorist shooting in San Bernardino, Apple hit an impasse with the FBI, which was investigating the attack. Apple said it was unable to access the messages sent on an iPhone 5C device that was owned by one of the attackers, and Apple also refused to build a version of its mobile operating system that would allow law enforcement to access the phone.
The FBI eventually relied on a third-party contractor to break into the iPhone 5C. Since then, though, numerous messaging apps for iOS and Android have offered users end-to-end encryption, which locks even the companies themselves out of the messages and conversations their users send.
Signal, WhatsApp, and iMessage all provide this feature to users.
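"End-to-end" means the key material lives only on the two endpoints. The sketch below is a deliberately simplified stand-in (real apps such as Signal use vetted protocols like the Double Ratchet, not this hash-based toy cipher): because the relay server only ever handles ciphertext, it has nothing useful to hand over:

```python
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: keystream blocks are SHA-256(key || counter).
    # Illustration only -- real apps use vetted cryptographic protocols.
    stream = b"".join(
        hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        for i in range(len(data) // 32 + 1)
    )
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob share a key; the relay server never sees it.
shared_key = os.urandom(32)

ciphertext = keystream_xor(shared_key, b"wire the funds")  # Alice's end
# ...the server stores and forwards ciphertext, which is all it holds...
plaintext = keystream_xor(shared_key, ciphertext)          # Bob's end

assert plaintext == b"wire the funds"
assert ciphertext != b"wire the funds"
```

A subpoena served on the server in this model yields only the ciphertext, which is the property law enforcement wants companies to engineer away.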
Upset by its inability to access potentially vital evidence in criminal investigations, the federal government has for years waged a campaign to convince tech companies to build backdoors that would, allegedly, be used only by law enforcement agencies.
The problem, cybersecurity researchers said, is that those backdoors do not stay reserved for their intended use.
In 1993, President Bill Clinton's Administration proposed a technical plan to monitor Americans' conversations. Communications devices would carry a chipset called the "Clipper Chip," which, if used properly, would allow only law enforcement agencies to listen in on certain phone calls.
But there were problems, as revealed by Blaze (the same cybersecurity researcher who criticized Attorney General Barr’s comments last month).
In a lengthy analysis of the Clipper Chip system, Blaze found glaring vulnerabilities that allowed the built-in backdoor itself to be circumvented.
By 1996, the Clipper Chip had been abandoned.
Years later, cybersecurity researchers witnessed other backdoor failures, and not just in encryption.
In 2010, the cybersecurity expert Steven Bellovin, who later co-authored analyses of key escrow risks with Blaze, warned readers about a 2005 fiasco in Greece, in which a hacker took advantage of a wiretapping mechanism that was supposed to be used only by police.
“In the most notorious incident of this type, a cell phone switch in Greece was hacked by an unknown party. The so-called ‘lawful intercept’ mechanisms in the switch—that is, the features designed to permit the police to wiretap calls easily—was abused by the attacker to monitor at least a hundred cell phones, up to and including the prime minister’s,” Bellovin wrote. “This attack would not have been possible if the vendor hadn’t written the lawful intercept code.”
In 2010, cybersecurity researcher Bruce Schneier blamed Google itself for a breach, reportedly carried out by Chinese hackers seeking to learn which of China's intelligence agents were under surveillance by the US government.
According to Schneier, the hackers were able to access sensitive emails because of a fatal flaw: Google had built a backdoor into its own email service.
“In order to comply with government search warrants on user data, Google created a backdoor access system into Gmail accounts,” Schneier said. “This feature is what the Chinese hackers exploited to gain access.”
Interestingly, the insecurity of backdoors is not a problem confined to the digital world.
In 2014, The Washington Post ran a story about where US travelers' luggage goes once it is checked at the airport. More than 10 years earlier, the Transportation Security Administration had convinced luggage makers to install a new kind of lock on consumer bags: one that could be opened through a physical backdoor, using one of seven master keys that only TSA agents were supposed to possess. That Washington Post story, though, included a close-up photograph of all seven keys.
Within a year, that photograph had been analyzed and converted into 3D-printing files that quickly spread online. The keys had leaked, and the lock on nearly every checked bag in the US had been compromised. All it took was a single human error.
Worth the risk?
Attorney General Barr’s comments last month are part of a long-standing tradition in America, in which a representative of the Department of Justice (last year it was then-Deputy Attorney General Rod Rosenstein) makes a public appeal to technology companies, asking them to install backdoors as a means of fighting crime.
The argument has raged for literal decades, invoking questions of the First Amendment, national security, and the right to privacy. It will continue, as it does today, but encryption may pick up a few surprising defenders along the way.
On July 23, the chief marketing officer of a company called SonicWall posted a link on Twitter to a TechCrunch article about the Attorney General’s recent comments, weighing in with his own take on the piece.
Michael Hayden, the former director of the National Security Agency—the very agency responsible for mass surveillance around the world—replied:
“Not really. And I was the director of national security agency.”
The fight over backdoors is a game of cat and mouse: the government will, for the most part, keep seeking its own way to decrypt data when investigating crimes, and technologists will keep pushing back, calling the idea dangerous and risky. But as more major companies take the same stand as Apple, designing products whose data even they cannot retrieve, the public may slowly warm to the idea, and the value, of truly secure encryption.