Washington Privacy Act welcomed by corporate and nonprofit actors

The steady parade of US data privacy legislation continued last month in Washington with the introduction of an improved bill that would grant state residents the rights to access, correct, delete, and port their data, as well as the right to opt out of data sales.

The bill, called the Washington Privacy Act, also improves upon its earlier 2019 version, providing stronger safeguards on the use of facial recognition technology. According to some analysts, when compared to its coastal neighbor’s data privacy law—the California Consumer Privacy Act, which went into effect this year—the Washington Privacy Act excels.

Future of Privacy Forum CEO Jules Polonetsky called the bill “the most comprehensive state privacy legislation proposed to date.”

“It includes provisions on data minimization, purpose limitations, privacy risk assessments, anti-discrimination requirements, and limits on automated profiling that other state laws do not,” Polonetsky said.

Introduced on January 20 by state Senator Reuven Carlyle, the Washington Privacy Act would create new responsibilities for companies that handle consumer data, including the implementation of data protection processes and the development and posting of privacy policies.

Already, the bill has received a warm reception from corporate and nonprofit actors. Washington-based tech giant Microsoft said it was encouraged, and Consumer Reports welcomed the thrust of the bill while urging further improvements.

“This new draft is definitely a step in the right direction toward protecting Washington residents’ personal data,” said Consumer Reports Director of Consumer Privacy and Technology Policy Justin Brookman. “We do hope to see further improvements to get rid of inadvertent loopholes that remain in the text.”

What the Washington Privacy Act would do

Like many US data privacy bills introduced in the past 18 months, the Washington Privacy Act approaches the problem of inadequate data privacy with two prongs: better rights for consumers and tighter restrictions on companies.

On the consumer side, the Washington Privacy Act would grant several new rights to Washington residents, including the rights to access, correct, delete, and port their data. Further, consumers would receive the right to “opt out” of having their personal data used in multiple, potentially invasive ways. Consumers could say no to having their data sold and to having their data used for “targeted advertising”—the somewhat inescapable practice that results in advertisements for a pair of shoes, a fetching sweater, or a 4K TV following users around from device to device.

Consumers could exercise their rights with simple requests to the companies that handle their data. According to the bill, these requests would require a response within 45 days. If a company cannot meet that deadline, it can file for an extension, but it is required to notify the consumer about the extension and about why it could not meet the deadline.

Further, unfulfilled requests are not a dead end for consumers—companies must also offer an appeals process to the consumers whose requests they deny or do not fulfill. Companies must also respond to requests free of charge, up to twice a year per consumer.

Perhaps one of the most welcome provisions in the bill is its anti-discrimination rules. Companies cannot, the bill says, treat consumers differently because they choose to exercise their data privacy rights. On the surface, that makes dangerous ideas like “pay-for-privacy” schemes much harder to enact.

Concerning new business regulations, the Washington Privacy Act separates the types of companies it applies to into two categories: “controllers” and “processors.” The two terms, borrowed from the European Union’s General Data Protection Regulation (GDPR), have simple meanings. “Controllers” are the types of entities that actually make the decisions about how consumer data is collected, shared, or used. So, a small business with just one employee who decides to sell data to third parties? That’s a controller. A big company that decides to collect data to send targeted ads? That’s a controller, too.

Processors, on the other hand, are akin to contractors and subcontractors that perform services for controllers. So, a payment processor that simply processes e-commerce transactions and nothing more? That’s a processor.

The Washington Privacy Act’s new rules focus predominantly on “controllers”—the Facebooks, Amazons, Twitters, Googles, Airbnbs, and Oracles of the world.

Controllers would have to post privacy policies that are “reasonably accessible, clear, and meaningful” and that include the following information:

  • The categories of personal data processed by the controller
  • The purposes for which the categories of personal data are processed
  • How and where consumers may exercise their rights
  • The categories of third parties, if any, with whom the controller shares personal data

If controllers sell personal data to third parties, or process it for targeted advertising, the bill requires those controllers to clearly disclose that activity, along with instructions about how consumers can opt out of those activities.

Separately, controllers would need to perform “data protection assessments,” in which the company looks at, documents, and considers the risks of any personal data processing that involves targeted advertising, sale, and “profiling.”

The regulation of “profiling” is new to data privacy bills. It’s admirable.

According to the bill, “profiling” is any form of automated processing of personal data to “evaluate, analyze, or predict personal aspects concerning an identified or identifiable person’s economic situation, health, personal preference, interests, reliability, behavior, location, or movements.”

In today’s increasingly invasive online advertising economy, profiling is omnipresent. Companies collect data and create “profiles” of consumers that, yes, may not include an exact name, but still include what are considered vital predictors about that consumer’s lifestyle and behavior. 

These new regulations make the Washington Privacy Act stand out amongst its contemporaries, said Stacey Gray, senior counsel with Future of Privacy Forum.

“The big picture of the bill is that it includes the same individual rights as the California Consumer Privacy Act—of access, sale, et cetera—and then more,” Gray said. “The right to correct your data, to opt out of targeted advertising, and out of profiling—that is further on the individual rights side.”

Gray added that the bill’s business obligations also go further than those in the CCPA, citing the data protection assessments discussed above.

The Washington Privacy Act includes several more business obligations, all of which add up to meaningful data protections for consumers. For instance, companies would need to commit to data minimization principles, collecting only the personal data necessary for the purposes they disclose. Companies would also need to obtain affirmative, opt-in consent from consumers before processing any “sensitive data,” which is any data that could reveal race, ethnicity, religion, mental or physical health conditions or diagnoses, sexual orientations, or citizenship and immigration statuses.

But perhaps most intriguing in the Washington Privacy Act is its regulation of facial recognition technology.

Facial recognition provisions

In 2019, Washington state lawmakers crafted a bill aimed at improving the data privacy protections of consumers. They called it… the Washington Privacy Act. That original bill, which has now been superseded by the 2020 version, included provisions on the commercial use of facial recognition.

On their face, the new rules looked good: Companies that used facial recognition tech for commercial purposes would have to obtain consent from consumers “prior to deploying facial recognition services.”

Unfortunately, the original bill’s very next sentence made that consent almost meaningless.

According to that bill, consumer “consent” could be obtained not by actually asking the consumer about whether they agreed to having their facial data recorded, but instead, by posting a sign on a company’s premises.

As the bill stated:

“The placement of conspicuous notice in physical premises or online that clearly conveys that facial recognition services are being used constitute a consumer’s consent to the use of such facial recognition services when that consumer enters those premises or proceeds to use the online services that have such notice, provided that there is a means by which the consumer may exercise choice as to facial recognition services.”

The sentence is as long as the exception it allows is broad.

This loophole upset several privacy rights advocates who, in February 2019, sent a letter to key Washington lawmakers.

“[W]hile the bill purportedly requires consumer consent to the use of facial recognition technology, it actually allows companies to substitute notification for seeking consent—leaving consumers without a real opportunity to exercise choice or control,” the letter said. It was signed by Consumer Reports, Common Sense, Electronic Frontier Foundation, and Privacy Rights Clearinghouse.

The 2020 bill closes this loophole, instead requiring affirmative, opt-in consent for commercial facial recognition use, along with mandatory notifications—such as signs—in spaces that use facial recognition technology. The new bill also requires processors to open up their data-processing tools to outside investigation and testing, in an effort to root out what the bill calls “unfair performance differences across distinct subpopulations,” such as minorities, disabled individuals, and the elderly.

Moving the Washington Privacy Act forward

Despite the 2019 Washington Privacy Act gaining swift approval in the Senate two months after its January introduction, the bill ultimately stalled in the House. Multiple factors led to the bill’s failure, including the bill’s definitions for certain terms, its approach to enforcement, and its treatment of facial recognition.

Some of those same obstacles could come up for the 2020 bill, Gray said.

“If this bill does not pass this year, that’s where we might see a source of conflict: either with the facial recognition provisions, or with enforcement,” Gray said. For enforcement to take hold, Gray said the Attorney General’s office—tasked with regulation—will need increased funding and staffing. Further, there will likely be opposition to the bill’s lack of a “private right of action,” which means that consumers will not be able to individually file lawsuits against companies that they allege violated the law. This issue has been a sticking point for data privacy legislation for years.

Still, Gray said, the bill shows improvement from its 2019 version, which could help push it forward.

“All things aside,” Gray said, “we’re more optimistic than last year about it passing.”


David Ruiz

Pro-privacy, pro-security writer. Former journalist turned advocate turned cybersecurity defender. Still a little bit of each. Failing book club member.