What Federal Legislators Can Learn From California’s New Ballot Initiative
November 7, 2019
On January 1, 2020, the nation’s strictest privacy law, the California Consumer Privacy Act (CCPA), will take effect. The law empowers consumers to (1) be informed about what personal information a company has collected about them; (2) delete that data; and (3) opt out of companies selling that data to third parties. On top of this, there’s an additional ballot initiative that’s been introduced that could further strengthen California’s privacy protections.
California allows residents to circumvent the traditional legislative process and propose legislation for the ballot, to be voted on during an election, provided they gather enough signatures from registered voters. A wealthy real estate developer named Alastair Mactaggart took advantage of this “statewide initiative” process to drive what later became the CCPA by soliciting signatures for a particularly strong privacy proposal. In order to keep this strong initiative off the ballot, legislators quickly modified it to be less stringent and passed it through traditional legislative means.
Now Mactaggart is once again pushing for stronger privacy protections for Californians. In September, he introduced another ballot initiative, to be voted on next year, that would add further consumer protections to the CCPA. Lawmakers on Capitol Hill are working to pass federal privacy legislation, and they can look to California’s proposals to shape their discussions. With that in mind, this blog post outlines which portions of the ballot initiative Congress could copy, and which parts it should improve or steer clear of.
The ballot initiative, called the California Privacy Rights and Enforcement Act (CPRA) of 2020, would add to CCPA’s existing privacy protections by:
- Allowing consumers to request that businesses correct inaccurate information about them (CCPA only allows consumers to request that their data be deleted)
- Precluding large businesses from collecting more data than they need for the purposes for which they are collecting it (a.k.a. data minimization)
- Precluding large businesses from collecting any data on minors under 16 without consent
- Allowing consumers to opt out of businesses using their sensitive personal information for advertising or marketing
- Precluding large businesses from selling an adult’s sensitive personal information without express consent
- Establishing the California Privacy Protection Agency to implement and administratively enforce the CCPA
There are a handful of things in this ballot initiative that would be great for the nation as a whole if they were introduced into federally enacted privacy legislation.
Data Correction: The CCPA allows Californians to request that companies delete their data. The CPRA takes this a step further and would allow Californians to also request corrections of inaccurate data. Inaccurate data has been shown to cause a wide range of harms. One woman said she was repeatedly shown motherhood-related advertisements after her stillbirth, likely because algorithms had determined that she was pregnant. In another case, a father found out his teenage daughter was pregnant after Target analyzed her shopping patterns, inferred that she was pregnant, and sent her advertisements for maternity clothing.
In addition to receiving upsetting advertisements, consumers could face other significant harms because of rampant consumer scoring. Companies create multiple “scores” about consumers. These can include, of course, a credit score, which gauges a consumer’s creditworthiness, but can also include a “medication adherence” score that predicts how likely you are to follow a doctor’s orders, or a “churn score” that predicts how likely you are to switch your business to a competitor. These scores can all be used to help, or harm, consumers. Those with low medication adherence scores may benefit from medication reminders from their doctors, but consumers unlikely to leave a company for a competitor may pay higher rates. Moreover, a consumer with an inaccurate credit report that leads to a low credit score could face a bevy of harms, like being rejected for an apartment lease, being denied a cell phone contract, or even being denied employment in certain industries. And the possibility of these harms is greater than you might think – the Federal Trade Commission found that 26% of people have at least one error on at least one of their three credit reports. (Because of these particularly problematic harms, Congress allows consumers to dispute and correct their credit reports.)
The simple ability to correct what businesses have wrong is a big step towards mitigating those harms. That’s why it’s essential that any federal legislation allow consumers to correct inaccurate data.
Browser Settings Opt-Out: The CCPA allows consumers to opt out of businesses selling their personal information, and the CPRA explicitly allows technology, like a browser setting, to opt out on the consumer’s behalf. Although we prefer opt-in mechanisms, technological opt-outs could minimize the burden on consumers and ensure that they can actually exercise their opt-out rights. Few consumers have the time to opt out individually with each company they do business with, and there could be dozens, or even hundreds. A provision that lets consumers simply change a browser setting and be done with it goes a long way toward making opt-out usable. If federal legislation gives consumers the ability to opt out of the use of their data (although, again, we would prefer that consumers need to opt in), a similar provision could ensure that consumers are actually able to take advantage of it.
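A server honoring this kind of browser-level signal could look something like the sketch below. The function names and opt-out semantics are illustrative assumptions; the "DNT" (Do Not Track) header is a real header browsers can send, used here as a stand-in for whatever signal regulation might ultimately specify.

```python
# Sketch: a server honoring a browser-set opt-out signal.
# Assumption: a "DNT: 1" header is treated as an opt-out of data sale.

def user_opted_out(headers: dict) -> bool:
    """Return True if the browser sent an opt-out signal (DNT: 1)."""
    return headers.get("DNT", "0").strip() == "1"

def handle_request(headers: dict, user_id: str, sale_pipeline: list) -> None:
    """Only forward a user's ID to the data-sale pipeline absent an opt-out."""
    if not user_opted_out(headers):
        sale_pipeline.append(user_id)

pipeline = []
handle_request({"DNT": "1"}, "user-a", pipeline)  # opted out: not added
handle_request({}, "user-b", pipeline)            # no signal: added
```

The point of the design is that the consumer sets the signal once, in the browser, and every compliant business checks it automatically; no per-company opt-out forms are needed.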
The Sale of Data: The CCPA allows users to opt out of businesses selling their information, and the CPRA requires businesses to get consent before selling sensitive personal information. Ultimately, these tools give users considerably more control than they previously had over which companies their data is sold to. That said, we believe that businesses should not be able to sell consumer data without consent, full stop. It shouldn’t matter if the data is sensitive or not.
As we noted in our analysis of the CCPA, the distinction between sensitive and non-sensitive information is an illogical one in today’s world. Non-sensitive information – like social media “likes” – can be just as personally identifiable as someone’s Social Security number. Think about it: If you saw a list of posts a social media account had liked, you might very well realize it belongs to your best friend. Why? Because those likes are things only they would like – the tweets of your mutual friends, and that obscure foreign pop band they’re obsessed with.
Additionally, the use of non-sensitive information can be just as harmful as the use of sensitive information. The data brokerage industry is a multi-billion dollar industry selling more than just consumers’ names and contact information. For example, data brokers have sold the insurance industry information about consumers’ television habits, social media posts, and online browsing. An insurance company could use this information to determine its rates, and not always to the benefit of the consumer. According to the FTC, an insurance company might infer that a consumer whose online activity shows an interest in motorcycles engages in risky behavior. Risky behavior = higher insurance rates. It wouldn’t matter if that consumer only liked to look at motorcycles online, and had never actually ridden one.
Given that consumers face real harms from the sale of both their sensitive and less-sensitive data, we believe a federal privacy bill should take the CPRA’s recommendations a step further. It should require opt-in consent for the sale of all types of data – not just sensitive data – as does the Online Privacy Act of 2019, introduced by Representatives Anna Eshoo and Zoe Lofgren.
Data Collection and Minimization: The CPRA would only allow businesses to collect “personal information that is reasonably necessary to achieve the purposes for which it is collected.” We have long advocated for data minimization in any federal privacy legislation because by limiting data collection, businesses can ensure that consumer data is more secure and less likely to be misused. Thieves cannot steal data that is not collected. Moreover, when companies collect less data, it’s more difficult for them to engage in unfair discrimination. For example, if a business doesn’t collect consumers’ ZIP Codes, it can’t charge consumers different prices for the same item based on their ZIP Code, with wealthier areas sometimes paying lower prices.
All that said, ideally a federal bill would go further by also setting standards for data retention – or how long the data is kept. Those standards should provide that businesses only keep data as long as is necessary to accomplish the purpose for which the data is collected. Reducing data collection and retention would not just benefit consumers, but could also save businesses money by reducing the likelihood of a multi-million dollar data breach.
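As a rough illustration of what a retention standard means in practice, a business could periodically purge records older than its retention window. The 90-day window and the field names below are illustrative assumptions, not anything the CPRA specifies.

```python
from datetime import datetime, timedelta

def purge_expired(records, retention_days=90, now=None):
    """Drop records older than the retention window.

    retention_days is an illustrative policy choice; each record is
    assumed to carry a 'collected_at' datetime.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=retention_days)
    return [r for r in records if r["collected_at"] >= cutoff]

# A record collected 10 days ago survives a 90-day window;
# one collected 400 days ago is purged.
now = datetime(2020, 1, 1)
kept = purge_expired(
    [{"collected_at": now - timedelta(days=10)},
     {"collected_at": now - timedelta(days=400)}],
    retention_days=90,
    now=now,
)
```

Data that has been purged cannot be breached, which is the business-facing upside of a retention standard noted above.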
Private Right of Action: The CPRA maintains the CCPA’s private right of action for data breaches – meaning that consumers can sue companies over them. However, the CPRA fails to provide a private right of action for other harms caused by violations of the act – like a failure to minimize data collection. A broad private right of action is important because the agencies tasked with enforcing this legislation likely won’t have the resources to pursue every case.
On the federal level, the FTC usually enforces only high-profile cases with broad implications, due to capacity constraints. Pairing that style of enforcement with a narrow private right of action would lead to uneven enforcement, giving some consumers the opportunity for justice while denying it to others. And it is illogical to argue that the victim of a data breach is somehow more worthy of seeking justice than the victim of an illegal sale of personal data. Without a broad private right of action in the CPRA and in a federal privacy law, consumers may not have the opportunity to have all of their harms redressed.
Business, Defined: The CPRA outlines a business’s responsibility for collecting and handling consumer data. Unfortunately, in an effort to protect small businesses, the CPRA actually appears to exempt small, medium, and fairly large businesses from any obligations under this ballot measure. It defines a covered business as one that either (1) has gross revenue over $25 million; (2) annually buys or sells the data of 100,000 or more consumers; or (3) derives half or more of its revenue from selling consumer data. According to an analysis of U.S. Census Bureau data, only 26% of small businesses have revenue over $1 million, and the median small business revenue is $390,000. Thus, it seems likely that the CPRA excludes businesses that can, and should, be protecting their customers’ data from having any obligations under this ballot measure. A federal privacy law should have a narrow carve-out for small businesses not dealing primarily in data – not a gaping carve-out for all but the biggest businesses.
Obligations Effective in 2020: If enacted, the CPRA would only apply to personal information collected on or after January 1, 2020. While this type of start date is standard in legislation, ultimately it means that businesses can do whatever they want with the decades’ worth of data they already have. We believe this does not go nearly far enough in protecting consumers. A federal privacy law should apply to all consumer data, not just the data produced on or after the effective date of the law.
The California Privacy Protection Agency: The CPRA would create the California Privacy Protection Agency (CPPA) to implement and enforce the CCPA. An agency should implement and enforce privacy legislation, both to ensure that businesses comply with their legal obligations and to issue regulations clarifying pieces of the law.
However, we don’t know how a new agency will work in California. One obstacle to the CPPA’s effectiveness is that staff members will only be paid $100 per day (or about $24,000 per year – which is under the poverty line for a family of four). This low pay suggests one of two things. Either California doesn’t view the CPPA as a full-time role – in which case, I have a hard time believing it will be able to effectively implement and enforce the CCPA – or it doesn’t want to pay its employees a living wage, which is a whole separate problem.
At the federal level, whether it is the Federal Trade Commission, or some other agency dedicated to privacy, federal enforcement must be handled by a regulator with broad rulemaking authority. Such authority would provide regulatory flexibility on a number of issues that are best handled through regulation rather than statute, like data portability and data security. The regulator must also be given adequate resources, including staffing with technologists, and its authority to impose remedies for violations should be clear, prophylactic, and unrestricted.
Moar Regulations Please: The CPRA has a number of provisions that would benefit from regulation by the newly-created California Privacy Protection Agency. Although the bill does not explicitly grant the agency the ability to regulate these matters, they could fall under the agency’s jurisdiction. Examples of items mentioned in the CPRA that could benefit from regulation are:
- An appeals process if a business rejects a consumer’s request to delete or correct their information
- What the minimum ‘n’ size is for aggregate data (or the minimum number of people who must be grouped together for data to be considered aggregate)
- The definition of “notice”
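The “minimum ‘n’ size” item above can be illustrated with small-cell suppression, a common way to operationalize aggregate-data thresholds: any group smaller than the minimum is withheld so no published figure describes fewer than n people. The threshold of 10 and the field names here are illustrative assumptions, not values from the CPRA.

```python
from collections import Counter

def aggregate_with_min_n(records, key, min_n=10):
    """Count records per group, suppressing any group smaller than min_n.

    min_n is the minimum group size below which a count is withheld,
    so that no published figure describes fewer than min_n people.
    """
    counts = Counter(r[key] for r in records)
    return {group: n for group, n in counts.items() if n >= min_n}

records = [{"zip": "94110"}] * 12 + [{"zip": "94131"}] * 3
summary = aggregate_with_min_n(records, "zip", min_n=10)
# The three-person "94131" group is suppressed; only "94110" is reported.
```

A regulator setting the minimum n centrally, rather than leaving each business to pick its own, is exactly the kind of clarification an implementing agency is well suited to provide.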
If enacted, the CPRA could make privacy laws in California even more protective of individual privacy rights. Some of the ways it would do that are excellent, while others are misguided. With federal privacy legislation at the forefront of conversations about privacy, we hope that federal policymakers will learn from California’s experiences.
About Jenna Leventoff
Jenna Leventoff is a Senior Policy Counsel at Public Knowledge, where she focuses on promoting the organization’s mission through government affairs. Prior to joining Public Knowledge, Jenna served as a Senior Policy Analyst for the Workforce Data Quality Campaign (WDQC) at the National Skills Coalition, where she led WDQC’s state policy advocacy and technical assistance efforts on state data system development and use. She also served as an Associate at Upturn, where she analyzed the civil rights implications of new technologies, and as Manager and Legal Counsel of the International Intellectual Property Institute, where she led the organization’s efforts to utilize intellectual property for international economic development. Jenna has also held internships with the American Civil Liberties Union and Senator Sherrod Brown (D-OH). Jenna received her J.D., cum laude, and B.A. from Case Western Reserve University. In her free time, Jenna enjoys yoga, international travel, and experimenting with new recipes.