Navigating the Complex Regulation of Privacy and Data Protection

January 11, 2022


Privacy Compliance Remains Top of Mind in 2022

For those following data privacy and consumer data protection trends, it should come as no surprise that enacting comprehensive legislation to regulate companies’ use of personal data has continued to be a focal point both internationally and in the U.S., at the federal, state and local levels. 

In the last three years, over 10 federal proposals and over 40 state proposals for comprehensive privacy legislation were introduced across the U.S., and we expect this trend to continue well into 2022, given the growing bipartisan support for legislation to protect consumer interests and mitigate the risks associated with the digital economy. The ever-changing landscape and patchwork of compliance obligations globally will only continue to grow more complex and costly, and may lead to increased regulatory scrutiny and potential enforcement actions despite best compliance efforts.  In the U.S., without comprehensive federal data privacy legislation, businesses remain subject to numerous state laws with ambiguous and sometimes conflicting legal obligations. Trans-Atlantic and other international data flows will only become increasingly difficult and costly to navigate in light of recent developments, including in China, the UK and the European Union.

In 2022, companies will therefore continue to need to understand and attend to distinct, and at times conflicting, data privacy regimes both within the U.S. and in other jurisdictions. It is thus critical that boards and management ensure compliance with existing legislation and continue to monitor new developments both domestically and internationally.

Increasing State Legislation: Virginia and Colorado

Building upon the framework established by the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA), Virginia and Colorado became the second and third states respectively to adopt comprehensive consumer data protection laws, imposing new obligations on businesses and providing residents with new rights regarding the collection and processing of their personal data. While the Virginia Consumer Data Protection Act (VCDPA) and the Colorado Privacy Act (ColoPA, each an “Act” and collectively the “Acts”) largely track the requirements under the CCPA and CPRA, each Act imposes some significant new requirements on businesses, such as mandatory data security assessments for certain processing activities, and provides consumers with new and enhanced rights and protections, such as correction rights, that companies will need to take note of and begin to work into their existing data protection policies.

Scope.  The VCDPA, effective January 1, 2023, and the ColoPA, effective July 1, 2023, each apply to entities or persons that (1) conduct business in the respective state or produce products or services targeted to its residents and (2) control or process (a) the personal data of 100,000 or more state residents during a calendar year or (b) the personal data of at least 25,000 state residents while deriving gross revenue from the sale of personal data.  Unlike the CCPA, the applicability thresholds in the VCDPA and ColoPA are more narrowly and geographically targeted: an entity must collect data from a large number of local residents to be covered by the Acts, whereas an entity can be covered by the CCPA if it has global annual revenue of $25 million and does business in California, regardless of the number of California residents affected (as long as it collects personal data of at least one California resident).
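The thresholds described above amount to a simple decision rule. As a rough illustration only (the function name and inputs are hypothetical, and a real applicability analysis involves far more nuance than any such sketch can capture), the two-prong test might be expressed as:

```python
def covered_by_vcdpa_or_colopa(
    does_business_or_targets_state: bool,
    residents_data_controlled: int,
    derives_revenue_from_data_sales: bool,
) -> bool:
    """Simplified sketch of the VCDPA/ColoPA applicability thresholds.

    Hypothetical illustration only; not legal advice.
    """
    # Threshold (1): conducts business in the state or targets its residents
    if not does_business_or_targets_state:
        return False
    # Threshold (2)(a): personal data of 100,000+ state residents per calendar year
    if residents_data_controlled >= 100_000:
        return True
    # Threshold (2)(b): 25,000+ state residents AND revenue derived from data sales
    return residents_data_controlled >= 25_000 and derives_revenue_from_data_sales
```

For example, under this sketch an entity doing business in the state with data on 30,000 residents is covered only if it also derives revenue from selling personal data.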

General Consumer Rights.  The VCDPA and ColoPA provide residents with the right to access, correct and delete their data and to obtain a copy of their data in a portable and readily usable format.

Opt-Out and Opt-In Rights.  The VCDPA and ColoPA provide residents with the right to opt out of uses of their data for certain purposes (e.g., for targeted advertising purposes, sales[1] of personal data or profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer).  Most notably, both Acts and the CPRA extend a consumer’s ability to opt out not only of “sales” of their personal data, but also of other instances of sharing of their personal data, an expansion that will likely require significant investment and changes to the current operations of many businesses.  Additionally, like the CPRA, the Acts give consumers additional rights over “sensitive personal data” (including data revealing racial or ethnic origin, religious beliefs, mental or physical health, sexual orientation or biometric data); however, unlike the CPRA,[2] both the ColoPA and VCDPA require affirmative, GDPR-style consent (i.e., opt-in consent) before consumers’ sensitive personal data can be processed.

Additional Compliance Obligations. In addition to the aforementioned rights, the ColoPA and VCDPA place additional compliance obligations on covered entities, including duties of data minimization and transparency, as well as an obligation to post a privacy notice detailing the sources of consumers’ data, the purposes for which it is collected and processed and how it is shared. Furthermore, covered entities are required to:

(1) establish and maintain reasonable administrative, technical and physical security measures to protect data;

(2) conduct data security assessments for certain data processing activities that present a “heightened” risk of harm, such as profiling, selling personal data, processing sensitive personal data and engaging in targeted advertising; and

(3) enter into specific data processing agreements with service providers that set forth instructions and obligations with respect to processing performed by such service providers on behalf of such covered entities.

Enforcement.  Unlike the CCPA and CPRA, neither the VCDPA nor the ColoPA provides consumers with a private right of action; instead, enforcement rests exclusively with each state’s attorney general or district attorneys, who can impose civil penalties of up to $7,500 per violation of the VCDPA or up to $20,000 per violation of the ColoPA.[3]  We will likely continue to see a trend away from the inclusion of a private right of action, as legislation in several states where a privacy law initially had strong support (e.g., Florida and Washington) failed in part due to disagreements among lawmakers on issues of enforcement.

Other U.S. Privacy Developments

  • Other U.S. State Laws on the Horizon. Currently, several states, including Massachusetts, Minnesota, North Carolina and Ohio, have active bills working through their respective state legislatures, and we expect that many other states will introduce proposed legislation. While many of these bills will likely contain common key provisions, such as rights afforded to consumers and required contractual and security standards, it is also likely that many will contain nuanced distinctions and further contribute to an increasingly complex compliance landscape. Accordingly, as states continue to add to the patchwork of privacy law compliance, the need for boards and management to stay abreast of changes in the legal landscape and adapt their budgets for compliance costs will expand significantly.
  • Growing Support for Greater FTC Oversight and Enforcement. With the lack of progress toward federal data privacy legislation, many have focused their attention on the Federal Trade Commission (FTC) and its potential to shape consumer data protection regulation, particularly with Lina Khan assuming her role as Chair of the FTC, growing support for enhanced enforcement authority and an increase in funding for consumer protection agencies.  The Build Back Better Act, which passed the U.S. House of Representatives at the end of 2021 and moved to consideration by the U.S. Senate, would provide the FTC with $1 billion over 10 years to fund a privacy enforcement bureau to address matters related to unfair or deceptive acts or practices relating to privacy, data security, identity theft, data abuses and related matters. Further, if the bill passes, the FTC would have authority to file lawsuits in federal district court seeking monetary penalties of up to approximately $43,000 per violation of the FTC Act. While the Build Back Better Act makes its way through the legislative process, the FTC has signaled its intent to take matters into its own hands, filing an Advance Notice of Proposed Rulemaking on December 10, 2021, to initiate a rulemaking process to curb lax security practices, limit privacy abuses and ensure that algorithmic decision-making does not result in unlawful discrimination.
  • FTC Updates the Gramm-Leach-Bliley Act “Safeguards Rule.” In October 2021, the FTC adopted a new Gramm-Leach-Bliley Act Safeguards Rule (the New Rule) that became effective January 10, 2022 and imposes more stringent data security requirements on regulated financial institutions.  Most notably, the definition of “financial institution” has been expanded to include entities engaged in activities that the Federal Reserve Board determines to be incidental to financial activities. Thus, “finders,” or companies that bring together buyers and sellers of a product or service for transactions that the parties themselves negotiate and consummate, are within the scope of the New Rule.  The New Rule expands upon existing requirements to implement an information security program with compliance obligations that are appropriate to an institution’s size and complexity, the nature and scope of its activities and the sensitivity of any customer information it possesses.[4] Additionally, the New Rule requires the designation of a “qualified individual” responsible for implementing and overseeing the financial institution’s information security program and requires security awareness training.  Finally, the FTC has initiated a supplemental notice of proposed rulemaking to further amend the New Rule to require financial institutions to report to the FTC any security event in which the financial institutions have determined misuse of customer information has occurred or is reasonably likely and at least 1,000 consumers have been affected or reasonably may be affected.
  • Increased Focus on Protection of Children’s Personal Data. As reliance on the internet and digital connectivity continued to grow in the wake of the COVID-19 pandemic, 2021 saw a sharp increase in digital activity by children and growing concern over the lack of comprehensive children’s privacy protections online.  In response, legislators have proposed updates to the Children’s Online Privacy Protection Act (COPPA) to fill in the gaps, including (i) the PROTECT Kids Act (H.R. 1781), which would amend COPPA by expanding its scope to cover children up to the age of 16 (currently 13) and services provided through mobile applications, and (ii) the Protecting the Information of our Vulnerable Children and Youth Act (H.R. 4801), which would cover minors under the age of 18, prohibit targeted advertising to children, require privacy impact assessments for all covered entities and allow a private right of action for parents to bring claims on their children’s behalf.  While no legislation or amendments to COPPA have gained traction, legislators have called upon companies operating in industries that tend to attract children to consider complying with the U.K. Information Commissioner’s Office’s Age Appropriate Design Code, which came into effect in September 2021 and restricts the collection and use of personal data by online or connected products or services that are likely to be accessed by anyone under the age of 18 in the U.K.  In the interim, we expect to see increased regulatory scrutiny in the U.S. by the FTC of companies that collect or process children’s personal data in violation of COPPA.[5]

EU and UK Privacy Developments

  • New Standard Contractual Clauses for Cross-Border Data Transfers Under the GDPR. On June 4, 2021, the European Commission (the Commission) published its new standard contractual clauses (the New SCCs) for transferring personal data under the GDPR from the EU and European Economic Area (the EEA) countries to third countries.  The previous set of standard contractual clauses (the Old SCCs) was subsequently repealed on September 27, 2021. The immediate effect on the contracts governing transfers of personal data out of the EEA depends on when the contract was entered into:
  • Contracts entered into prior to September 27, 2021, which implement the Old SCCs, must be revised in order to implement the New SCCs by December 27, 2022;
  • Contracts entered into, or new processing operations undertaken, on or after September 27, 2021 must implement the New SCCs.

The key differences between the Old SCCs and the New SCCs are:

  • Instead of separate sets of SCCs for different transfer scenarios, the New SCCs consist of one set with certain common clauses and a specific ‘module’ for each of the following four identified transfer scenarios: (1) transfer from controller to controller (C2C); (2) transfer from controller to processor (C2P); (3) transfer from processor to processor (P2P); and (4) transfer from processor to controller (P2C). Provisions relating to P2P and P2C transfers were not available under the Old SCCs, and the introduction of these modules has resulted in greater clarity and legal certainty for organizations that transfer data internationally.
  • In addition, companies using the New SCCs for C2P or P2P transfers no longer have to enter into separate data processing agreements, given that all the requirements of Article 28(3) of the GDPR (provisions to be included in data processing agreements) are covered by the New SCCs.
  • Under the New SCCs, the parties must carry out a “transfer impact assessment,” which must be made available to the competent supervisory authority upon request. This involves the parties considering the specific circumstances of the transfer in order to assess the associated risk. Such considerations include, but are not limited to: the nature of the personal data, the type of recipient, the laws and practices in the third country and the purposes of the processing.
  • Finally, the Old SCCs were bilateral agreements; there was no option for additional parties to join, which posed a particular challenge for global organizations with large-scale intra-group or extra-group data transfers. The New SCCs address this issue through the “docking clause,” which allows additional data exporters or importers to accede to the agreement throughout its term by completing a new data transfer appendix.
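The four-module structure described above is essentially a lookup on the roles of the two parties. As a purely illustrative sketch (the helper name and role labels are hypothetical shorthand, not an official taxonomy), module selection might be modeled as:

```python
# Map (exporter role, importer role) to the New SCC module number.
# Illustrative shorthand only; the actual SCCs must be read in full.
SCC_MODULES = {
    ("controller", "controller"): 1,  # Module One: C2C
    ("controller", "processor"): 2,   # Module Two: C2P
    ("processor", "processor"): 3,    # Module Three: P2P
    ("processor", "controller"): 4,   # Module Four: P2C
}

def scc_module(exporter_role: str, importer_role: str) -> int:
    """Return the New SCC module number for a given pair of roles."""
    return SCC_MODULES[(exporter_role.lower(), importer_role.lower())]
```

For instance, a controller exporting to a processor would look to Module Two; under the Old SCCs, the processor-to-processor and processor-to-controller rows of this table had no counterpart at all.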
  • Transfers of Personal Data from the UK After Brexit. The New SCCs do not apply in the UK following Brexit, which means that, for now, the Old SCCs continue to govern all transfers of personal data out of the UK. In August 2021, the UK Information Commissioner’s Office (UK ICO) launched a consultation on its own set of SCCs, which included a draft international data transfer agreement (IDTA) and a draft transfer risk assessment (TRA) tool. The UK ICO also released a template draft addendum (the Addendum), which amends the New SCCs so that they can be used for UK data transfers and EEA data transfers.  The consultation closed on October 7, 2021 and the finalized forms of the IDTA and TRA are expected in early 2022, as well as a decision on whether the Addendum can be used to amend the New SCCs. In the meantime, companies should continue to monitor developments for further guidance issued by the UK ICO.
  • UK Government’s Consultation to Reform the UK GDPR. Less than a year after the UK GDPR – which is substantially based on the EU GDPR – came into effect on January 1, 2021, the UK Government’s Department for Digital, Culture, Media and Sport (DCMS) launched a consultation in September 2021 detailing its proposals to reform the UK’s data protection regime.  With an aim to create “a more pro-growth and pro-innovation data regime whilst maintaining the UK’s world-leading data protection standards,” the DCMS seeks to remove what it considers to be unnecessarily complex or vague obligations under the UK GDPR and loosen restrictions on the use of data in order to spur growth and innovation.  The DCMS strives to reduce barriers to the use of data by UK businesses, in particular with respect to furthering innovation and research purposes. 

The consultation focuses, for example, on the use of data for AI projects.  To that end, the DCMS proposed making it easier for entities to process personal data for the purposes of monitoring, detecting or correcting bias in relation to developing AI systems by explicitly listing this as a legitimate interest (one of the grounds for lawful processing) and by allowing entities to process sensitive personal data for this purpose.  The consultation invited views on removing Article 22 of the UK GDPR – which affords data subjects the right not to be subject to a decision resulting from “solely automated” processing – and permitting the use of solely automated AI systems on the basis of legitimate interests or public interests.  The consultation also envisaged removing certain existing obligations under the UK GDPR, such as the requirements to appoint a data protection officer and to conduct data protection impact assessments.  Instead, the DCMS proposed a requirement to implement a “privacy management programme” that would be “tailored to [the entity’s] processing activities and ensure data privacy management is embraced holistically” and would allow different entities to “adopt different approaches to identify and minimize data protections risks that better reflect their specific circumstances.”  The consultation also suggested that the UK government is seeking to allow for easier cross-border transfers of personal data out of the UK.

While the aforementioned proposals may lessen certain obligations under the UK GDPR, a departure of the UK GDPR from the EU GDPR may also impact businesses with operations in the EU.

China’s Personal Information Protection Law Takes Effect

On November 1, 2021, China’s new data protection law, the Personal Information Protection Law (PIPL), took effect.  Similar to the GDPR, the PIPL has extraterritorial reach and applies to both (i) the processing of personal information carried out within China and (ii) the processing of personal information of Chinese individuals by a foreign entity outside of China where the purpose of such processing is to provide a product or service to individuals in China, to analyze or assess the activities or behavior of individuals in China, or pursuant to other circumstances provided in laws and administrative regulations.

There are certain similarities between the PIPL and the GDPR.  For instance, the minimum required content of privacy notices under the PIPL is very similar to that required under the GDPR.  Likewise, the data processing agreement required under the PIPL when a third party is engaged to process personal information on behalf of the “Personal Information Handler” (a concept similar to the “data controller” under the GDPR, being the entity that independently determines the purposes and means of processing the personal data) is similar in substance to that required under the GDPR.  As such, companies may be able to adapt existing GDPR compliance programs and policies fairly easily for compliance with certain aspects of the PIPL.

However, in other respects the PIPL introduces more stringent requirements than the GDPR.  For example, “sensitive personal information” under the PIPL, which requires “separate consent” for processing, is defined much more broadly than “special categories of data” under the GDPR, as “personal information that once leaked or illegally used would easily result in harm to the dignity of natural persons or to their personal safety or the safety of their property,” including “financial account information” (which is not a special category of data under the GDPR).  Further, the PIPL appears to favor data localization and attaches additional conditions to the transfer of personal information out of China.  Any entity contemplating cross-border transfers of personal information will need to obtain separate consent from the individual, enter into a data processing addendum with the data importer/receiver (similar to standard contractual clauses under the GDPR) and, in certain circumstances, satisfy additional requirements such as undergoing a security assessment carried out by the Cyberspace Administration of China (CAC).  The exact requirements differ between operators of critical information infrastructure (CII) and non-CII operators, and depend on the type and volume of personal information processed.  Any such security assessment and approval of a cross-border data transfer will be valid for only two years.  Most notably, the CAC would also have broad discretion to deny cross-border data transfers and require data localization by default.

The introduction of the PIPL places additional obligations on businesses when processing personal information in China or in relation to Chinese residents (when providing them with products or services or analyzing their behavior), and may limit the ability of companies to transfer personal information outside of China.  Non-compliance with the PIPL can attract fines of up to CNY 50 million (approximately US$7.85 million) or 5% of the previous year’s annual turnover, as well as orders to suspend the related business operations and/or revoke the entity’s business permit or license.  The PIPL also provides for fines on any person in charge or other directly liable individual.  In early December 2021, it was reported that over 100 smartphone apps had been removed from China’s app stores due to violations of the PIPL and Data Security Law.  Therefore, while the exact requirements for compliance with the PIPL remain unclear pending clarification of certain ancillary rules and passage of draft supplementary measures, it is nonetheless critical to comply with the PIPL’s stringent requirements.
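The fine exposure noted above is simple arithmetic once the statutory caps are known. As an illustrative sketch only (the assumption that the applicable ceiling is whichever of the two caps is higher reflects one common reading of the provision, and regulators retain discretion in practice):

```python
# Illustrative only: PIPL maximum-fine arithmetic under the assumption
# (a common but not authoritative reading) that the applicable ceiling is
# the greater of CNY 50 million or 5% of prior-year annual turnover.
FLAT_CAP_CNY = 50_000_000
TURNOVER_RATE = 0.05  # 5% of the previous year's annual turnover

def pipl_max_fine_cny(prior_year_turnover_cny: float) -> float:
    """Return the sketched maximum fine in CNY for a given turnover."""
    return max(FLAT_CAP_CNY, TURNOVER_RATE * prior_year_turnover_cny)
```

On this reading, a business with CNY 2 billion in prior-year turnover faces exposure of CNY 100 million, while the flat CNY 50 million cap governs for smaller businesses.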

Key Takeaways

  • Though the CPRA, ColoPA and VCDPA do not come into effect until 2023, companies must begin preparing to comply, as certain of these laws have lookback periods commencing as early as January 1, 2022. Further, companies should expect additional states to introduce and enact data protection legislation and thus should keep abreast of legislative activity at both the state and federal levels.
  • Companies must monitor ongoing developments to remain compliant with new and amended privacy laws and regulations in areas relevant to their businesses; this includes, for example, companies that collect children’s data, financial institutions and entities carrying out activities that are incidental to financial institutions’ operations and businesses that transfer data across borders.
  • The New SCCs will require organizations engaged in cross-border data transfer out of the EEA to replace their Old SCCs, assess which of the New SCCs to use and carry out a transfer impact assessment.
  • Although there remains uncertainty around the exact requirements under the PIPL, in particular concerning cross-border transfers, companies must already comply with the PIPL as Chinese regulators have begun to clamp down on businesses found in violation.

More generally, as more and more countries around the world enact comprehensive privacy legislation, management and boards of companies that operate in multiple jurisdictions should pay particular attention to their compliance obligations, with an eye toward how these laws and regulations interact and diverge.

[1] Under the VCDPA, the definition of “sale” is limited to cover only “the exchange of personal data for monetary consideration by the controller to a third party,” as opposed to the CCPA/CPRA and ColoPA, which cover both “monetary or other valuable consideration.”

[2] The CPRA takes a limited opt-out approach, permitting the processing of sensitive personal data as long as notice is provided and consumers are permitted to limit the use of their sensitive personal data where such data is used to infer characteristics about the consumer.

[3] Because a violation of the ColoPA is considered a deceptive trade practice, the penalties are governed by the Colorado Consumer Protection Act; thus, a noncompliant entity may be fined up to $20,000 per violation. 

[4] The New Rule enumerates certain elements that must be included, such as requirements to implement access controls, inventories to manage data, personnel, devices, systems and facilities, encryption for customer information in transit and at rest, secure development practices, multifactor authentication, information disposal procedures, change management, testing and incident response and to test and continuously monitor information systems with periodic penetration testing and vulnerability assessments.

[5] See, for example, a recent $2 million settlement between the FTC and OpenX Technologies Inc. (OpenX), an online advertising platform alleged to have unlawfully collected personal data from children under 13 without their parents’ permission. As part of the settlement, OpenX is also required to delete all data it collected to serve targeted ads, implement a comprehensive privacy program to ensure compliance with COPPA, and cease the collection and retention of personal data of children under 13.