The Year Ahead: Privacy Developments in 2022

Now that 2021 has come to a close, what does our crystal ball predict for privacy developments in 2022? Here’s a quick rundown.

Law and policy developments

In 2022, expect an avalanche of new laws and regulations, attempting to govern and impose order on a dizzying array of tech developments. New regulatory efforts will range from data protection laws in India and China to AI regulation in the EU to automated decision making rules in US states. Add to that a flurry of enforcement activities, and you get a perfect storm of tech regulation.

United States

Federal privacy law

Despite the bipartisan and bicameral consensus around the need for a federal privacy law – and even around the vast majority of its language – don’t expect a legislative breakthrough in 2022. In part, this is because expecting breakthroughs in Congress has long been an exercise in futility, particularly as we head into another fraught and polarizing election cycle. In addition, interest groups on both sides of the issue remain deeply entrenched, with business groups strongly resisting mechanisms of redress beyond regulatory action and advocacy groups pushing to expand the scope of privacy law to address topics such as equity, bias and discrimination.

State privacy laws

In addition to the California Privacy Rights Act (CPRA) and Virginia and Colorado’s new laws, which come into force over the next 18 months, expect a half dozen new states to pass privacy legislation. The Future of Privacy Forum’s state law expert Stacey Gray predicts that new privacy legislation will mature in Maryland, Oklahoma, Ohio, New Jersey, Florida and Alaska. The big question is whether any of these laws will diverge from the existing framework – and, in particular, whether any will include a private right of action (PRA).

A couple of years ago, conventional wisdom held that the more states passed privacy laws, the greater the pressure would build on the business community – and consequently Congress – to pass preemptive federal legislation. Absent a PRA, however, an interesting dynamic may develop: the more state privacy laws there are, the less appetite businesses – which are growing accustomed to complying with the emerging (PRA-less) state framework – have for federal preemption.

FTC rulemaking

Under the leadership of new Chair Lina Khan, the FTC has issued strong statements and strategic plans for broad rulemaking efforts, including rules to curb “abuses stemming from surveillance-based business models” and “lax security practices” and to “ensure that algorithmic decision-making does not result in unlawful discrimination.” The FTC’s Republican Commissioners expressed heated dissent to these plans, warning of regulatory overreach and the agency pursuing what they depict as a legislative agenda.

In part, Khan’s plans hinge on how quickly the appointment of the third Democratic Commissioner, Alvaro Bedoya, proceeds in the Senate. Currently, the appointment process faces partisan deadlock as it heads for a full Senate vote. Even after the Democrats consolidate their majority, the rulemaking process is notoriously complicated and may run into litigation headwinds. In short, don’t hold your breath waiting for privacy and security rules to emerge from the FTC. At the same time, Congress may this year award the FTC an additional budget of up to $500 million, as well as the critical authority to impose penalties for first-time violations of Section 5 of the FTC Act. (Although that legislative process, too, has now stalled.)

The White House joins the fray

Late in 2021, the Biden Administration began to launch policy initiatives around privacy, artificial intelligence and algorithmic decision making. These include an Effort to Create a Bill of Rights for an Automated Society, as well as an NTIA series of Listening Sessions on Personal Data: Privacy, Equity, and Civil Rights. Even short of pushing for federal privacy law, the White House can exert influence and advance important policymaking initiatives in this space.


GDPR enforcement

Following two years of sparse activity, 2021 featured a clear uptick in GDPR enforcement across the EU. This included headliner cases against Amazon (a 746 million euro fine in Luxembourg), Facebook and WhatsApp (a 250-page decision of the Irish DPC on WhatsApp and a pending decision on Facebook, as well as a case sent from an Austrian court to the CJEU) and the IAB (a decision of the Belgian DPA undermining the IAB’s Transparency and Consent Framework).

Importantly, regulators are expanding their lens from an early focus on data breaches to challenging companies’ legal bases for processing data and, notably, crossborder data flows. In 2022, expect an additional step up the enforcement ladder, with regulators focusing on issues such as protecting children’s data, restricting the use of sensitive health and financial information, and curbing the excesses of digital marketing.

A wave of legislation

Even as companies are coming to terms with GDPR, another wave – some would say tsunami – of legislation is building up in Brussels and bound to hit shore this year. The Digital Services Act (DSA), Digital Markets Act (DMA), Data Governance Act (DGA), e-Privacy Regulation (ePR) and Network and Information Security Directive (NIS II) – a veritable alphabet soup of tech regulation affecting digital platforms, digital services, online marketing, data intermediaries and more – are materializing and set to become law. In addition, companies and industry groups are keeping their eyes on the AI Act, which has broad implications for algorithmic decision making across the economy, as well as the Data Act, which extends legal obligations, including crossborder transfer restrictions, to non-personal data.

EDPB guidance

The EDPB issued several important guidance documents last year, notably the highly anticipated draft opinion on the meaning of a “transfer” and the interaction between the GDPR’s crossborder transfer restrictions and its expanded extraterritorial scope. This year, expect the EDPB to opine on the all-important concepts of de-identification and anonymization. The Article 29 Working Party’s 2014 opinion on anonymization techniques set a high – some would say impractical – bar for processing various forms of de-identified information. A central question now is to what extent the EDPB will adopt a more liberal approach that acknowledges the key role of pseudonymization in business models and research settings.
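To see why the bar is so hard to clear, consider that stripping direct identifiers from a dataset often leaves quasi-identifiers behind. The following is a minimal, illustrative sketch of the k-anonymity idea often used to reason about this; the records and the `k_anonymity` helper are made up for illustration and do not reflect any regulator’s formula.

```python
from collections import Counter

# Toy "de-identified" records: direct identifiers (name, email) removed,
# but quasi-identifiers (zip code, birth year) remain. All values are
# fabricated for illustration.
records = [
    {"zip": "60601", "birth_year": 1980, "diagnosis": "flu"},
    {"zip": "60601", "birth_year": 1980, "diagnosis": "asthma"},
    {"zip": "60629", "birth_year": 1975, "diagnosis": "flu"},
]

def k_anonymity(rows, quasi_ids):
    """Return the smallest equivalence-class size over the quasi-identifiers.
    A dataset is k-anonymous if every combination of quasi-identifier
    values appears at least k times."""
    groups = Counter(tuple(row[q] for q in quasi_ids) for row in rows)
    return min(groups.values())

# The third record's (zip, birth_year) pair is unique, so k = 1:
# anyone who knows a 1975-born resident of 60629 can re-identify it.
print(k_anonymity(records, ["zip", "birth_year"]))  # prints 1
```

A k of 1 means at least one individual is uniquely identifiable from the quasi-identifiers alone – the kind of residual risk the 2014 opinion treats as disqualifying for true anonymization.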

In 2022, the EDPB is also likely to tackle the use of data for research purposes. Providing researchers with access to platform data and repurposing data – including health information – for research purposes raise some of the most complicated questions in data protection. This includes analysis of legal bases for processing, the GDPR Article 89 research exemption, and Member State laws governing scientific research. Last but not least, expect regulators to focus on children’s privacy. As 2021 drew to a close, the Irish DPC published its draft Fundamentals for a Child-Oriented Approach to Data Processing. This comes in addition to the UK ICO’s Age appropriate design code of practice for online services. The EDPB may also address children’s privacy this year, though its opinion may be published only in 2023.

Schrems II fallout

While negotiations between the US government and the European Commission continue toward a new Privacy Shield, there is a general sense of weariness among observers. A persistent gap continues to separate the parties on issues such as judicial redress, with US standing doctrine and limitations of Article III courts impeding the acceptance of a deal by the EU. Experts are concerned that even if the parties strike a new deal, such a deal would be vulnerable to judicial scrutiny and could ultimately be struck down for the third time by the CJEU. Some argue that in light of the recent EDPB opinion on data transfers, which negates the need to implement a transfer mechanism in cases of direct data collection by entities outside the EU, the necessity for a new Privacy Shield is greatly reduced.


Implementing China’s PIPL

China’s Personal Information Protection Law (PIPL) came into force on November 1, 2021, less than three months after its passage by the National People’s Congress. Like GDPR, the law has extraterritorial effect, prompting multinational companies in tech, retail, luxury goods, automotive, finance and other sectors to launch comprehensive compliance programs.

It’s important to view PIPL in the broader context of tech regulation in China. Over the past two years, multiple regulatory agencies in China, including the Cyberspace Administration of China (CAC), which is charged with enforcing PIPL, have waded into the area with new rules on data security, data transfers, artificial intelligence, and more. In addition, the government launched enforcement actions against companies ranging from small app developers to major household names in Chinese tech. For global businesses, one of the main challenges is compliance with a slew of rules on crossborder data transfers. Chinese law imposes data localization requirements on certain sectors and categories of data, while other companies can export data from China only under certain conditions, such as conducting a security assessment and filing it with the CAC.

Introducing India’s Personal Data Protection Bill

The last week of 2021 saw India, itself a top-10 global economy, table its long-awaited privacy bill. In addition to its own flavor of data localization and transfer restrictions, India’s Joint Parliamentary Committee introduced several novel concepts, including protections for non-personal data. Experts expect the Lok Sabha, the lower house of India’s Parliament, to pass the law in 2022, starting yet another cycle of implementation for the numerous global businesses, including major call centers and outsourcing operations, with activities in the subcontinent.

Technology and business trends


Web3

In 2022, crypto applications will hit the mainstream. The term Web3 refers to a new generation of the Internet run on blockchains and characterized by decentralization. Unlike Web 2.0, which featured large corporate platforms hosting activities by content producers and consumers, Web3 is decentralized, distributed, token mediated and participant controlled. Applications live on a blockchain, which, taking Ethereum as a model, is composable and interoperable, allowing innovative new layers of products and services. In addition to digital currencies, decentralized finance (DeFi) and decentralized autonomous organizations (DAOs), the white-hot use case is non-fungible tokens (NFTs). The Financial Times reported that in 2021 NFTs became a $40 billion global market. Last March, British auction house Christie’s sold an NFT for a digital work of art for $69 million.

Is Web3 amenable to privacy and data protection compliance? On the one hand, with its decentralized, open source architecture mediated by cryptography, Web3 is music to privacy advocates’ ears. From the early days of the Internet, privacy champions fought against centralized systems, which provide immense power to corporate and government data lords and put massive amounts of personal data at risk of mega breaches. Here, then, after endless frustration about the non-linear march of privacy policy – one step forward, two steps back – a technological trend comes to privacy’s rescue.

On the other hand, alas, Web3 raises privacy challenges of its own. By design, the blockchain is open, transparent, immutable, replicable and provable. If you had concerns about your bank or e-commerce platform “seeing” your data, now the entire world will be able to see it on a public ledger. Moreover, if you ever set up a digital wallet, for all intents and purposes a digital identity for Web3, you know how much personal data is required to clear KYC and other regulatory hurdles. (Some platforms, such as Monero, try to provide greater anonymity, making it harder to link data to a fixed identity, trace funds or observe transaction size.)

Generally, GDPR rights to erase or amend data don’t align well with transactions on a blockchain, which are immutable – that is, they can never be deleted, amended or otherwise tampered with. Of course, information on the blockchain is pseudonymous; indeed, bitcoin visionary Satoshi Nakamoto remains pseudonymous to this day. But according to current EDPB guidance, pseudonymized data is subject to GDPR.

The metaverse

The metaverse struck a chord last year, when Facebook rebranded itself as Meta, highlighting the tech platform’s focus on the virtual landscape envisioned by Neal Stephenson in his 1992 book, Snow Crash. Simply put, the metaverse is for virtual worlds what the Internet is for websites: a series of interconnected, interoperable and immersive ecosystems. Like the Internet, the metaverse means different things to different people and organizations, who view it as a platform for work, business, gaming, entertainment, social interactions, or all of the above.

The metaverse too deploys Web3 tools, such as blockchains, cryptocurrencies and NFTs, to facilitate trading in anything from virtual plots of land to avatars and game outfits. What we already know is that regardless of its nature, the metaverse will trigger privacy questions around AR/VR (see this FPF report), biometrics, computer-brain interfaces (FPF report here), child protection, and more. As in any virtual space, policymakers and engineers will need to draw a delicate line between privacy and accountability. As the New York Times reported recently, “Harassment, assaults, bullying and hate speech already run rampant in virtual reality games.”

Digital marketing

With all due respect to the Web3 hype, online platforms aren’t resting on their laurels just yet. Of course, advertising will remain an important part of the metaverse too, as consumer giants like Nike and Adidas already made inroads into Web3 last year. But the architecture for ad targeting is in flux, with platforms such as Apple and Google phasing out browser- and device-side tracking and measurement tools. Talk about the demise of the third-party cookie feels like old hat; but little by little, marketing teams are adjusting to a new ecosystem anchored in walled gardens and server-side technologies.

In 2022, expect this trend to continue. The ad tech ecosystem will try to solve the puzzle comprising Apple’s ATT and SKAdNetwork and Google’s FLoC and FLEDGE proposals, on the one hand, and digital marketing regulations ranging from Europe’s GDPR, e-Privacy Regulation and DSA to California’s CCPA and CPRA, on the other. To keep up, the industry will have to come up with new models for ad targeting, measurement and attribution.

Artificial Intelligence

Artificial intelligence continues to occupy a central role for policymakers across the globe. As automated decision making systems proliferate, questions abound concerning transparency, privacy, due process, fairness and equity. Last year, the White House Office of Science and Technology Policy (OSTP) and the National Science Foundation (NSF) formed a National Artificial Intelligence (AI) Research Resource Task Force to propose a road map for expanding access to critical resources and educational tools that will spur AI innovation and economic prosperity nationwide. The FTC too has announced it would weigh in with rulemaking intended to “ensure that algorithmic decision-making does not result in unlawful discrimination.”

The states are stepping up legislative efforts around AI. In California, for example, the CPRA addresses “automated decision-making technology, including profiling,” noting that new implementing regulations should specify how CPRA access and opt-out rights apply in this context. Additional legislation, such as the Automated Decision Systems Accountability Act of 2021 (AB 13), has been tabled in the California assembly. Bills pending in states from Washington and Colorado to New Jersey and Vermont feature similar language. The EU, meanwhile, is continuing to advance its vision of an AI Regulation, which is based on concepts from the field of product liability. And in China, the Cyberspace Administration of China (CAC) released draft guidelines seeking to regulate the use of algorithmic recommender systems by internet information services.


Data sharing for research

The COVID pandemic – and the stunningly swift response from the research community in the form of mRNA vaccines – has brought to the fore the importance of data sharing for research purposes across organizations and geographies. We have learned that data sharing can quite literally save lives. This year, the EDPB will likely opine on the complex interaction between research exemptions in the GDPR and Member State laws, as well as some internal tensions within the GDPR itself. In the US too, policymakers seek to advance evidence-based rulemaking and the availability of data for healthcare research.


From tech trends to global policymaking efforts, 2022 promises to be a fascinating – and complicated – year for privacy and data protection. New laws, enforcement actions, litigation and self-regulatory initiatives will keep companies and counsel busy until next Christmas – and beyond.