Consumers have few legal options for protecting privacy

There are no promises in the words, “We care about user privacy.”

Yet, these words appear on privacy policy after privacy policy, serving as disingenuous banners to hide potentially invasive corporate practices, including clandestine data collection, sharing, and selling.

This is no accident. It is a strategy.

In the US, companies that break their own privacy policies can—and do—face lawsuits for misleading and deceiving their users, including by making false statements about data privacy. But users are at a disadvantage in this legal fight, as successful lawsuits, or even filings, are rare.

Instead of relying on the legal system to assert their data privacy rights, many users turn to tech tools, installing various web browsers, browser extensions, and VPNs to protect their online behavior.

Luckily, users aren’t alone in this fight. A small number of companies, including Apple, Mozilla, Signal, WhatsApp, and others, are truly committed to user privacy. They stand up to overbroad government requests. They speak plainly about data collection. And they often disengage from practices that put user data in the hands of unexpected third parties.

In the latest blog in our series on data privacy and cybersecurity laws, we look at the options consumers actually have for asserting their digital privacy rights today. In the US, this area of law, unlike global data protection, is slim.

As Jay Stanley, senior policy analyst with the ACLU Speech, Privacy, and Technology Project, put it: “There’s a thin web of certain laws that exist out there [for digital consumer privacy], but the baseline default is that it’s kind of the Wild West.”

Few laws, few protections

For weeks, Malwarebytes Labs has delved into the dizzying array of global data protection and cybersecurity laws, exploring why, for instance, a data breach in one state requires a different response than a data breach in another, or why “personal information” in one country is not the same as “personal data” in another.

Despite the robust requirements for lawful data protection around the world, individuals in the United States experience the near opposite. In the US, there is no comprehensive federal data protection law, and thus, there is no broad legal protection that consumers can use to assert their data privacy rights in court.

“In the United States, the sort of default is: Consumer beware,” said Lee Tien, senior staff attorney with the digital rights nonprofit Electronic Frontier Foundation.

As we explored last month, US data protection law is split into sectors—there’s a law for healthcare providers, a law for video rental history, a law for children’s online information, and laws for other select areas. But user data that falls outside those narrow scopes has little protection.

If a company gives intimate menstrual tracking info to Facebook? Tough luck. If a flashlight app gathers users’ phone contacts? Too bad. If a vast network of online advertising companies and data brokers build a corporate surveillance regime that profiles, monitors, and follows users across websites, devices, and apps, delivering ads that never disappear? Welcome to the real world.

“In general, unless there is specific, sectoral legislation, you don’t have much of a right to do anything with respect to [data privacy],” Tien said.

There is one caveat, though.

In the US, consumer protection laws bar companies from lying about their own business practices, data protection practices included. These laws prohibit “unlawful, unfair, or fraudulent” business practices, along with “unfair, deceptive, untrue, or misleading” advertising. Whatever a company says it does should, legally, be what it actually does, Tien said.

“Most of consumer privacy that’s not already controlled by a statute lives in this space of ‘Oh, you made a promise about privacy, and then you broke it,’” Tien said. “Maybe you said you don’t share information, or you said that when you store information at rest, you store it in air-gapped computers, using encryption. If you say something like that, but it’s not true, you can get into trouble.”

This is where a company’s privacy policy becomes vital. Any company’s risk for legal liability is only as large as its privacy policy is detailed.

In fact, the fewer privacy promises made, the fewer opportunities to face a lawsuit, said ACLU’s Stanley.

“This is why all privacy policies are written to not make any promises, but instead have hand-wavy statements,” Stanley said. “What often follows a sweeping statement is 16 pages of fine print about privacy and how the company actually doesn’t make any promises to protect it.”

But what about a company that does make—and break—a promise?

Few laws, fewer successful assertions

Okay, so let’s say a company breaks its data privacy promise. It said it would not sell user data in its privacy policy and it undeniably sold user data. Time to go to court, right?

Not so fast, actually.

The same laws that prohibit unfair and deceitful business practices also often include a separate requirement for anyone who wants to invoke them in court: individuals must show that the alleged misconduct personally harmed them.

Proving harm for something like a data breach is exceedingly difficult, Tien said.

“The mechanism of harm is more customized per victim than, say, an environmental issue,” Tien said, explaining that even the best data science cannot reliably predict the harm an average person will suffer from a data breach the way environmental science can predict the harm from, for instance, a polluted drinking water source.

In 2015, this difficulty was borne out in court, when an Uber driver sued the ride-hailing company over a data breach that affected up to 50,000 drivers. The breach, the driver alleged, led to a failed identity theft attempt and a fraudulent credit card application in his name.

Two years later, the judge dismissed the lawsuit, telling the driver at a hearing: “It’s not there. It’s just not what you think it is… It really isn’t enough to allege a case.”

There is, again, a caveat.

Certain government officials—including state Attorneys General, county District Attorneys, and city attorneys—can sue a company over deceitful business practices without having to show personal harm. Instead, they can file suit against the company on behalf of the public.

In 2018, this method was tested in court against the exact same company. Facing pressure from 51 Attorneys General—one from each US state, plus one from Washington, D.C.—Uber paid $148 million to settle a lawsuit alleging that the company covered up a data breach two years earlier.

Despite this success, waiting around for overworked government attorneys to file a lawsuit on a user’s behalf is not a practical solution to protecting online privacy. So, many users have turned to something else—technology.

Consumer beware? Consumer prepared

As online tracking methods have evolved far past the simpler days of just using cookies, consumers have both developed and adopted a wide array of tools to protect their online behavior, hiding themselves from persistent advertisers.

Paul Stephens, director of policy and advocacy for Privacy Rights Clearinghouse, said that, while the technology of tracking has become more advanced, so have the tools that push back.

Privacy-focused web browsers, including Brave and Mozilla’s Firefox Focus, were released in the past two years, and tracker-blocking browser extensions like Ghostery, Disconnect, and Privacy Badger—which is developed by EFF—are all available to consumers, at least in basic versions, for free. Even Malwarebytes has a browser extension for both Firefox and Chrome that, along with blocking malicious content and scams, stops the third-party ads and trackers that monitor users’ online behavior.
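Under the hood, most of these blockers share the same basic mechanism: intercept each outgoing web request and cancel it when the destination matches a known tracking domain. The sketch below illustrates that idea as an extension background script, written in TypeScript against Chrome’s Manifest V2-era blocking webRequest API; the blocklist domains are hypothetical, and real blockers rely on large, regularly updated filter lists and heuristics rather than a hard-coded array.

```typescript
// Minimal sketch of a tracker-blocking extension background script.
// Assumes a Manifest V2 extension whose manifest.json declares the
// "webRequest", "webRequestBlocking", and "<all_urls>" permissions,
// with @types/chrome installed for type checking.

// Hypothetical blocklist; real extensions ship curated filter lists.
const TRACKER_DOMAINS = ["tracker.example", "ads.example"];

function isTracker(url: string): boolean {
  const host = new URL(url).hostname;
  // Match a listed domain itself or any of its subdomains.
  return TRACKER_DOMAINS.some((d) => host === d || host.endsWith("." + d));
}

chrome.webRequest.onBeforeRequest.addListener(
  (details) => {
    // Returning { cancel: true } drops the request before it leaves the
    // browser, so the third-party tracker never hears from the user.
    return { cancel: isTracker(details.url) };
  },
  { urls: ["<all_urls>"] }, // inspect every outgoing request
  ["blocking"] // lets the listener's return value cancel requests
);
```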

Stephens said he has another philosophy about protecting online privacy: Never trust an app.

“We have this naïve conception that the information we’re giving an app, that what we’re doing with that app, is staying with that app,” Stephens said. “That’s really not true in most situations.”

Stephens pointed to the example of a flashlight app that, for no discernible reason, collected users’ contact lists, potentially gathering the phone numbers and email addresses for every friend, family member, and met-once-at-a-party acquaintance.

“Quite frankly,” Stephens said, “I would not trust any app to not leak my data.”

Corporate respect for consumer privacy

There is one last pillar in defending consumer privacy, and, luckily for many users, it’s a sturdy one: corporations.

Yes, we earlier criticized the many nameless companies that window-dress themselves in empty privacy promises, but, for years, several companies have emerged as meaningful protectors of user privacy.

These companies include Apple, Signal, Mozilla, WhatsApp, DuckDuckGo, Credo Mobile, and several others. They all make explicit promises to users about not selling their data or giving it to third parties that don’t need it, and some refuse to store any user data that isn’t fundamentally needed to run their services. Signal, the secure messaging app, takes user privacy so seriously that it designed its service so the company itself cannot read users’ end-to-end encrypted messages to one another.
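Signal’s actual protocol is considerably more sophisticated (it layers the Double Ratchet algorithm over X3DH key agreement), but the core reason a provider cannot read end-to-end encrypted messages fits in a short sketch: private keys never leave users’ devices, so the server only ever relays public keys and ciphertext. The illustrative TypeScript sketch below uses Node’s built-in crypto module; it is not Signal’s implementation, and a real system would add key derivation, identity verification, and forward secrecy.

```typescript
import {
  generateKeyPairSync,
  diffieHellman,
  createCipheriv,
  createDecipheriv,
  randomBytes,
} from "node:crypto";

// Each user generates a keypair on their own device. Only the public
// half is ever uploaded to the messaging server.
const alice = generateKeyPairSync("x25519");
const bob = generateKeyPairSync("x25519");

// Both sides derive the same 32-byte shared secret from their own
// private key and the peer's public key. The server never holds a
// private key, so it can never derive this secret.
const sharedKey = diffieHellman({
  privateKey: alice.privateKey,
  publicKey: bob.publicKey,
}); // Bob computes the same secret from his private key + Alice's public key.

// Alice encrypts; the server relays only the iv, ciphertext, and auth tag.
const iv = randomBytes(12);
const cipher = createCipheriv("aes-256-gcm", sharedKey, iv);
const ciphertext = Buffer.concat([cipher.update("hi Bob", "utf8"), cipher.final()]);
const tag = cipher.getAuthTag();

// Bob decrypts locally with the secret he derived himself.
const decipher = createDecipheriv("aes-256-gcm", sharedKey, iv);
decipher.setAuthTag(tag);
const plaintext = Buffer.concat([decipher.update(ciphertext), decipher.final()]);
console.log(plaintext.toString("utf8")); // "hi Bob"
```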

While many of these companies are household names, a smaller company is also putting privacy front and center, and in a field that sorely needs it—DNA testing.

Helix not only analyzes people’s genetic data, but also directs them to several partners that offer services built on DNA testing, such as the Mayo Clinic and National Geographic. Because Helix serves as a sort of hub for DNA testing services, and because it works so closely with so many companies and organizations that handle genetic data, it decided it was in the right position to set the tone on privacy, said Elissa Levin, Helix’s senior director of policy and clinical affairs.

“It is incumbent on us to set the industry standards on privacy,” Levin said.

Last year, Helix worked with several other companies—including 23andMe, Ancestry, MyHeritage, and Habit—to release a set of industry “best practices,” providing guidance on how DNA testing companies should collect, store, share, and respect user data.

Among the best practices are several privacy-forward ideas not required by law, including the right for users to access, correct, and delete their data from company databases. Also included is a ban on sharing any genetic data with third parties like employers and insurance companies. And, amid recent headlines about captured serial killers and broad FBI access to genetic data, the best practices suggest that companies, when possible, notify individuals about government requests for their data.

Helix itself does not sell any user data, and it requires express user consent for any data sharing with third parties. Before the company even launched, Levin said, Helix also brought in Anne Toth, a privacy executive and the current head of data policy at the World Economic Forum, to advise on its privacy practices.

As to whether consumers appreciate having their privacy protected, Levin said the proof is not so much in what consumers say, but rather in what they don’t say.

“The best way to gauge that is in looking at the fact that we have not gotten negative feedback from users or concerns about our privacy practices,” Levin said. She added that even when a company makes the news for data misuse, Helix sees no large uptick in users reflexively walking away, even though it allows users to remove themselves from the platform.

Consumer privacy is the future

Online privacy matters, both to users and to companies. It should matter to lawmakers, too, but in the US, Congress only began taking substantial interest in the topic last year.

Until the US has a comprehensive data privacy law, consumers will find a way to protect themselves, legal framework or not. Companies should be smart and not get left behind. Not only is protecting user privacy the right thing to do—it’s the smart thing to do.

ABOUT THE AUTHOR

David Ruiz

Pro-privacy, pro-security writer. Former journalist turned advocate turned cybersecurity defender. Still a little bit of each. Failing book club member.