Privacy, Practice, and Performance

Privacy law is at a crossroads. In the last three years, U.S. policymakers have introduced more than fifty proposals for comprehensive privacy legislation, most of which look roughly the same: they all combine a series of individual rights with internal compliance. The conventional wisdom sees these proposals as groundbreaking progress in privacy law and explains their uniformity by looking to catalyzing precedent like the General Data Protection Regulation in Europe or the California Consumer Privacy Act.

This Article challenges that emerging consensus. Relying on contemporary sociological and critical studies scholarship, this Article analyzes recent privacy proposals in the United States through their social practices and argues that those practices are drawing boundaries that set the terms of privacy law from the ground up. In other words, privacy law’s practices are descriptively and normatively performative: they have socially constructed what we think privacy law is and should be. We have not only become accustomed to conceptualizing privacy law in certain ways; we have come to see a model of individual rights and internal procedural compliance as the normal, ordinary, commonsense modality of privacy law. So constructed, privacy law is flawed, with substantial negative effects for individuals, society, equality, and justice.

This Article provides a full critical account of the latest developments in privacy law, focusing on its practices rather than law on the books. It details and challenges current privacy law’s focus on individual rights and internal compliance. And it explores potential new directions for privacy law based on the performative capacities of privacy law’s practices, including new emancipatory practices and performances.


    Introduction

    In the last four years, there have been eleven proposals for comprehensive privacy legislation introduced in the United States Congress.[2] Two ballot initiatives and thirty-nine privacy bills have been introduced in twenty-eight states during that time.[3] This is in addition to the European Union’s General Data Protection Regulation (GDPR), which took effect in 2018, and the California Consumer Privacy Act (CCPA), which took effect in 2020.[4] That is an unprecedented flurry of legislative activity.

    Remarkably, most of these proposals look roughly similar: they add a combination of individual rights of control and internal compliance structures (the rights/compliance model) to the traditional model of privacy notices and consent buttons.[5] This means that policymakers seem committed to, or stuck on, a single model of privacy governance.

    This uniformity is notable, as is policymakers’ coalescence around the rights/compliance model of privacy law. It is unusual that politically polarized states—the “laborator[ies]” of very different visions of democracy—and a starkly divided Congress would roughly agree on a single framework for new privacy laws.[6] After all, there are other options on the table.[7] The choice of a rights/compliance model is even more surprising given that scholars generally agree that employing this framework in the U.S. regulatory context would be risky.[8]

    Previous scholars have traced these proposals to the GDPR or the CCPA, suggesting that they are the products of legal and norm entrepreneurship by leading regulatory jurisdictions or individual actors.[9] These analyses are illuminating, but incomplete. Some take a law-on-the-books approach, ignoring the ways in which law is also a social practice involving regulators, lawyers, compliance professionals, and individuals.[10] Others take a more nuanced approach, exploring how advocates harness institutional apparatuses to create privacy law.[11] But looking for origins and catalysts misses questions of substance and efficacy. We need to know why policymakers chose these proposals and, more importantly, whether they are up to the task of protecting privacy in an era of data-extractive capitalism.[12]

    This Article answers those questions. My descriptive claim is that privacy law can be understood as a collection of repeated and habituated performances that have normalized themselves among regulators, industry, and individuals as what privacy law is and should be, thereby excluding other options. Privacy law’s performances, including internal compliance programs, privacy impact assessments (PIAs), consent toggles and opt-out buttons, consultations and settlements with industry, and exercises of individual rights have constructed privacy law in ways that entrench themselves from within. When practitioners complete PIAs or internal audits, they become accustomed to thinking that filling out documents is privacy law. When companies hire a chief privacy officer (CPO), they send a message to industry that hiring a CPO is privacy law. And when websites send emails saying they “care about” our privacy and require us to opt out of tracking, we become accustomed to thinking that self-governance and corporate management of our data is privacy law.

    Some of these practices predate the GDPR and the CCPA;[13] others were developed by industry in the wake of the GDPR. And individual rights beyond notice-and-consent are decades old.[14] But all of these practices are performative, and our acculturation to them has entrenched them and defined our relationship to, and assumptions about, privacy law. Habits die hard, and these habits are not just getting stronger; they make it impossible for us to change. No wonder many new proposals look the same.

    As described in more detail in Part I, performances are actions and behaviors that communicate something to the self and others.[15] We dress, speak, and interact in ways that reflect our identities and the identities we choose to share with others.[16] Following sociological and critical theory, these performances can also be performative—that is, repeated, everyday performances constitute, create, and reinforce social or legal categories, including identity, gender, and race.[17] Likewise, I argue that practices associated with individual privacy rights and corporate compliance have performatively constructed the category of privacy law by habituating us into thinking that only these practices are what privacy law is and should be.

    Part II applies the performativity thesis to privacy law’s practices, demonstrating how the information industry has entrenched practices that have influenced and molded the current wave of privacy law proposals.[18] Individuals, regulators, and industry all engage in performative practices of privacy law. Individuals navigate consents, cookie requests, privacy policies, and data request links. Supplementing privacy self-governance are practices in which regulators partner with industry to settle disputes and develop rules, and in which industry creates internal compliance structures for ongoing accountability. These practices have percolated up, constructing a roughly uniform approach to privacy law reflected in almost every recent proposal for comprehensive privacy law in the United States.

    But those practices are not necessarily good for privacy. Privacy law’s performances are constructions of industry. And as a result of internal inconsistencies and commitments to symbols and procedure, the model accustoms us to hollowed-out public institutions and insufficient privacy protection. In Part III, I make a normative claim that the performativity of privacy law’s practices not only explains why current privacy discourse and proposals from U.S. policymakers look roughly the same, but also shows that the rights/compliance model is likely incapable of addressing the privacy and structural harms of informational capitalism.[19]

    In particular, understanding privacy law from the perspective of performance highlights two categories of weaknesses in current proposals. One set of weaknesses stems from the laws’ individual rights approach, which is not only based on faulty assumptions, but also entrenches performances that are inherently mismatched against the structural harms of informational capitalism. The performative nature of rights in privacy law, which has habituated us into thinking that managing our privacy is an individual responsibility, has also allowed industry to weaponize our exercise of those rights to undermine our privacy. A second set of weaknesses is based on privacy law’s reliance on internal corporate processes. As Margot Kaminski has already warned, the social construction of privacy law around industry practices tends to favor the practices of the wealthiest and most dominant actors in the information industry, creating an anti-competitive landscape.[20] But the proposals’ problems run deeper. The law’s use of procedural performances as its regulatory lever also habituates privacy professionals to proceduralism, ignores data-extractive capitalism’s inconsistency with democratic values, and adopts neoliberal assumptions about the law’s place in economic ordering. Perhaps most importantly, the performative use of a managerialized public-private partnership is internally inconsistent: it endogenously creates public institutions that are dependent on industry expertise, efficiency, and nimbleness. Therefore, those public institutions become incapable of acting as the promised “backdrop threat” that guards against capture.[21]

    The inadequacies inherent in recent U.S. privacy proposals require different, opposing performances that can socially construct privacy law and regulatory institutions as counterweights to corporate power. The goal is not to eliminate performances and the performativity of practices; that is not possible.[22] Rather, the goal is to perform privacy law in emancipatory ways—namely, to address the ways in which data-extractive capitalism creates vulnerabilities, power asymmetries, and subordination.[23] Part IV outlines an alternative framework inspired by what André Gorz called “non-reformist reforms,” or reforms that raise our consciousness of our subordination, while taking us closer to the ultimate goal of transformational change.[24] Rebuilding public governance will take work, will, and money, but society already has the tools to start: collective power, penalties, invigorated public institutions, civil rights, worker unionization, and a seat at the table not just for civil society, but for marginalized populations whose voices have been drowned out by a neoliberal focus on what industry wants. Even these changes will not achieve the ultimate goal of a radically reconstituted public regulatory space. But they are the beginnings of a new approach. We must walk before we can run.

    To review, this Article proceeds as follows. Part I brings together two related literatures in sociolegal studies: performativity and the endogeneity of law. This Section adds to the extant legal scholarship on performativity by focusing on how practices can create law. Part II applies the performativity thesis to privacy law’s practices, showing how long-standing practices of regulators, industry, and individuals have become the defining features of recent proposals for comprehensive privacy law in the United States. This Section also contributes to the privacy literature in another way—namely, by focusing less on the specific provisions of the laws, and more on the regulatory, corporate, and individual practices that develop in their wake. Part III makes the Article’s normative argument that the privacy law these performances have constructed is incapable of addressing the privacy and equitable harms of informational capitalism. Finally, Part IV proposes several alternative performances that could rescue privacy law from its rut.

    I. Performativity and Endogenous Law

    This Article brings together three related theories from the social sciences to explain the current status and failures of privacy law: performativity, habit, and normalization. Performativity is the idea that our actions can create social categories, like identity. Habit is one process through which that happens, and normalization is the result. This Article applies those theories, originally developed to understand identity, to understand law.

    Part I.A focuses on performativity and how individual practices construct personal identities, a process that relies on repetition, habituation, and normalization. When legal scholars have relied on performance theory to make arguments about the law, they have traditionally focused on this aspect of performativity. But practices do not only create identities; law itself is a product of people, practices, and discourses that determine what we think the law is and should be. This second literature, which focuses on the endogeneity of law, is discussed in Part I.B.

    A. The Performativity Thesis

    Privacy law practices are what J.L. Austin would call “performatives”: they create the reality of privacy law.[25] Judith Butler famously argued that we construct the category of gender by performing it.[26] We dress, speak, have sex, cut our hair, and adopt physical mannerisms associated with, and constitutive of, our gender identities.[27] Performances have the capacity to create social meaning and social categories—that is, our performances are performative.[28]

    The same is true for race and other identities.[29] Relying on this performativity thesis, legal scholars have argued that antidiscrimination law is underinclusive because it elides the many bases of discrimination that are performative of identity, such as hairstyles, clothing, and recreational activities.[30] The performativity thesis has also allowed scholars to show how the practices that dominant social norms expect of parents have endogenously influenced family law from the ground up.[31]

    B. Repetition and Habit: How Performativity Happens

    One way our performances create social categories is through habit. Butler argues that our performative identities “materialize” gradually both from the top down and the ground up; they are the “processes of being acted on” and “the conditions and possibilities for acting.”[32] A top-down phase involves social norms acting upon us, where society’s strictures try to normalize us to align with its sociocultural histories. Butler suggested that this is the phase in which many of us are influenced to conform to how society says “men” and “women” should speak and act.[33] A bottom-up phase involves the capacity to react to those norms. As Maren Wehrle suggested, we “reproduce . . . norms in ways we might cho[o]se” from the ground up, sometimes in accordance with dominant paradigms and sometimes subverting them directly.[34] For instance, we develop new ways of walking, dressing, thinking, speaking, and behaving. These become typical for us, part of who we are and how we define ourselves to others. Either way, whether we are affirming or disrupting social norms, our performances must be repeated in order to situate ourselves and our actions within society.

    Arguably, that happens through a process of habituation. Theories of habit date back to at least Aristotle, who saw habit as essential for promoting virtue; habitually acting morally—that is, repeating over and over again the moral act—constructs a character that has internalized the norms embodied by those moral acts.[35] Similarly, Butler’s “materialization” happens precisely because we operate through habit.[36] We generate habits within existing power structures. When a future Olympian learns to swim, for example, they have to practice, drill, and repeat. Eventually, their body becomes habituated to the movements of their arms and legs, holding their breath, and turning their head at specific times during those motions. But they accumulate these habits within certain rules and limits, whether imposed endogenously (they can only hold their breath for so long) or exogenously (they have to stay in their lane in the pool).

    We can also acquire new habits that make us excel. In grade school, we learn to write under strict rules: never end a sentence with a preposition; always have a thesis, body, and conclusion; never start a sentence with “and.” And yet, as we read more, write more, and learn more, we develop new “ways of being” that may challenge the rules under which we learned to write in the first place. Those new ways of writing become part of who we are as writers, generating our habitual identities from the ground up.

    Although repeating performances can affirm identity,[37] habituation can also normalize deviant behaviors as ordinary, commonsense, obvious, and objectively good.[38] Political scandals are good examples of this phenomenon. As psychologists Adam Bear and Joshua Knobe have written, when a politician “continues to do things that once would have been regarded as outlandish, [their] actions are not simply coming to be regarded as more typical; they are coming to be seen as more normal. As a result, they will come to be seen as less bad and hence less worthy of outrage.”[39] Similarly, when workers and those around them repeatedly cut corners, break rules, and ignore risks, they stop seeing those behaviors as deviant and come to see them as normal.[40] This happens in our daily lives too: studies show that the more television we watch, the more likely we are to think that watching a lot of television is normal.[41] Therefore, normalization is the confusion of frequency with propriety, nudging us to think that the things we do often are the normal things people do.

    The takeaways from this literature are particularly important for privacy law and policy. First, performances are everywhere.[42] As Butler noted, “to understand identity as a practice . . . is to understand culturally intelligible subjects”—that is, “understandable” as a result of “a rule-bound discourse that inserts itself in the pervasive and mundane signifying acts of . . . life.”[43] It makes sense then to consider privacy law’s practices as performances. Second, performances are expressive. Performances “signify[]” identities by demonstrating to the self and to others how performers understand and occupy their roles.[44] This suggests that to understand privacy law, we should look at how privacy is actually performed, not necessarily what is written in the law.[45] Third, in order to socially construct the law, performances must be widespread, repeated, and pervasive.[46] Butler noted that we are so compelled to engage in and repeat performative acts because that is how we communicate—both to ourselves and to others—that this is who we are.[47] Finally, pervasive practices can have normalizing effects, pushing us to think that our routinized practices are the normal, appropriate, and normatively good practices. Therefore, the practices of privacy law we should study are those that are routinized and repeated among individuals, companies, and regulators.

    C. Performances and Endogenous Law

    A related sociolegal research agenda on performative practices focuses less on personal identity than on how practices can socially construct the law from the ground up. For instance, Lauren Edelman used a case study of Title VII of the Civil Rights Act of 1964 to argue that ideas and practices that emerge endogenously from regulated entities themselves can shape the law.[48] Title VII prohibits employment discrimination on the basis of race, sex, and other protected classifications.[49] But lawyers and compliance professionals working inside industry recast their obligations from substance—race and gender equality in the workplace—to procedure—nondiscrimination policies, diversity officers, appeals processes, and other internal organizational structures.[50] This process of “managerialization” transformed corporate symbols of compliance into weapons against claims of discrimination through which companies were able to point to their policies and organizational structures as evidence of fair treatment.[51] And these performances of accountability ultimately defined the law when federal courts not only accepted corporate procedures and practices as evidence of compliance, but also deferred to them as to what the law actually requires.[52] Although she never used the language of the performativity thesis, Edelman nevertheless argued that the practices of Title VII socially constructed the legal category of antidiscrimination law.[53]

    The practices of privacy law are having a similar impact. But legal scholars have insufficiently conceptualized privacy law’s practices and inadequately understood how those practices have come together to create a raft of new, roughly similar privacy proposals that ultimately benefit data-extractive corporations. The next Section applies the performativity thesis to privacy practices. It demonstrates that the regulatory, corporate, and user practices of privacy law have performatively created what we think privacy law is and should be, a vision reflected in almost every recent proposal for comprehensive privacy law in the United States.

    II. Performativity and the Social Practices of Privacy Law

    Recent proposals for comprehensive privacy law in the United States can be understood as a collection of long-standing practices of regulators, industry, and individuals. Those practices are performative: they construct privacy law by habituating us into thinking those practices are what privacy law is and should be. Departing from the tradition of some other scholars, this Section consciously takes as its starting point the law on the ground—namely, actual practices of regulators, industry, and individuals—rather than the law on the books. Following Butler, who argued that performances predate and construct identity, I argue that privacy law performances construct privacy law, making normative choices along the way.[54] The best evidence of this is that these practices, many of which predate the GDPR and the CCPA, have come together to form the latest proposals for omnibus privacy laws in the United States, both at the federal and state levels. The remarkable uniformity of these statutes—they all propose to codify many of the same practices—speaks directly to the power of performances in shaping the law.

    A. Regulatory Practices

    There are two primary privacy enforcers in the United States—the Federal Trade Commission (FTC, or the Commission) and state attorneys general (AGs).[55] They are empowered to write rules and enforce the law. But in practice, things are different. Rather than holding industry to account, regulators have traditionally positioned themselves as industry partners in order to gain corporate buy-in.[56] They also settle rather than litigate almost all claims. The FTC’s strength as a regulator may ebb and flow with new appointments and new political majorities, but even under the leadership of Chair Lina Khan, the FTC still explicitly relies on industry to perform regulatory tasks. These practices are performative of privacy law: industry input and compromise have been built into most recent privacy proposals.

    1. Regulator as Partner

    The FTC and state AGs have long held meetings with industry representatives to persuade them to self-regulate and to solicit input on how to govern data collection.[57] The FTC meets with industry representatives regularly to “monitor the marketplace.”[58] The Commission has also published reports on online profiling, e-commerce, and consumer debt collection, among other issues, only after meeting with, and receiving significant input from, industry representatives.[59] As Danielle Citron has shown, state AGs have established task forces with representatives from business and advocacy groups to try to reach consensus on best practices.[60] They have also brought companies together to determine what those best practices should be and to hear how companies are approaching compliance, often adopting those compliance measures as recommendations.[61] These consultations are widespread and routine, and they date back to at least 2012, and perhaps earlier, long before the GDPR or the CCPA.[62]

    Consultations with industry have normalized privacy regulators as partners or allies of industry. Jon Leibowitz, former FTC Chair, Davis Polk & Wardwell LLP partner, and co-chair of the industry-funded 21st Century Privacy Coalition, said that “promot[ing] . . . business innovation” is one of the FTC’s goals and it “motivates industry” to achieve it.[63] Another former FTC Commissioner, Julie Brill, recently said that privacy regulators can use compliance safe harbors “to work with companies” and to “help them understand what other companies are doing.”[64] Regulators and their staffs also present themselves as wanting to help facilitate innovation,[65] provide clarity about rules to guarantee predictability,[66] and assure industry that they are committed to a regulatory “light-touch.”[67] Indeed, as one assistant state AG told Professor Citron, “[w]e want companies to tell us how we can be clear about what we expect and how that clarity can help them satisfy the law and innovate.”[68]

    This normalized practice has made its way into new privacy statutes. Recent privacy proposals explicitly require the FTC to consult with industry to develop the rules and regulations industry must follow. For instance, the Setting an American Framework to Ensure Data Access, Transparency, and Accountability Act (SAFE DATA Act) requires the FTC to develop rules in consultation with “a professional standards body” made up of large technology companies to define the terms under which industry can collect children’s data.[69] Elsewhere, the proposal authorizes the FTC to approve voluntary consensus standards and certification programs that companies developed on their own or in consultation with FTC staff.[70] The Privacy Bill of Rights also requires the FTC to reach out to companies and provide compliance guidance in line with “recognized industry practices.”[71] And the Consumer Online Privacy Rights Act (COPRA) authorizes the FTC to accept data security standards issued by the National Institute of Standards and Technology,[72] an arm of the U.S. Department of Commerce, which works with industry to “promote U.S. innovation and competitiveness.”[73] Similarly, in many states, AGs are required by statute to consult with companies before bringing enforcement actions.[74] New state proposals require AGs to write clarifying rules only after consultation with business.[75] And those rules also must “take into account the burden on the business.”[76] In short, industry input has moved from practice to law on the books.

    2. Compromises and Settlements

    In addition to meeting with companies to develop rules and best practices, regulators generally resolve their privacy enforcement actions through consent decrees rather than litigation.[77] Likewise, state AGs sign informal assurances of voluntary compliance (AVCs) with companies under investigation.[78] Consent decrees, like AVCs, are agreed-upon settlements; they function more like contracts than court orders.[79] This practice has become part of privacy law.

    Settlements are routine. Only three of the FTC’s 271 reported privacy and security enforcement actions since 1998—twenty years before the GDPR went into effect—ended in a judicial opinion from a federal court.[80] The rest ended in consent orders and default judgments.[81] These settlements generally follow the same script: companies under investigation do not admit fault or assume responsibility for deceptive business practices, but they nevertheless pay fines, amend privacy notices, and promise to stop deceptive practices or adopt new practices.[82] State AGs frequently enter into individual and multi-state AVCs with companies.[83] Indeed, Professor Citron found no fully litigated AG enforcement actions related to privacy.[84]

    These settlements are also expressive. Consent decrees convey messages to regulated entities and the public about how the FTC understands its role as a regulator. As Daniel Solove and Woodrow Hartzog have noted, privacy lawyers “parse and analyze the FTC’s settlement agreements, reports, and activities as if they were pronouncements by the Chairman of the Federal Reserve.”[85] Like other areas of law, FTC consent decrees have expressive value that influences norms on the ground.[86] They express what kinds of privacy practices the FTC thinks are appropriate and what practices are unfair, deceptive, or misleading. Even dissenting commissioners use their dissents to try to influence corporate practices and the course of FTC actions.[87] The FTC also announces consent decrees with fanfare, press releases, quotations from commissioners about corporate accountability and consumer welfare, and a media blitz about any fines it imposed.[88] AGs, eager to burnish political bona fides, have presented their informal agreements with companies as accomplishments on behalf of the privacy rights of state residents.[89] They also discuss their settlements with the public in order to persuade companies to change their behavior, relying on what Citron called their “privacy-norm entrepreneurship.”[90]

    Scholars have also referred to these orders as a privacy “common law,” constructing law case-by-case like a state court constructing tort, contract, or property law.[91] Even more notable, however, is the way in which settlement practices have become part of recent privacy law proposals. Some states require their AGs to seek settlements with companies under investigation before they may pursue litigation.[92] Other states explicitly envision their AGs relying on AVCs or Assurances of Discontinuance even if the statutes do not require it.[93] The CCPA explicitly envisions its AG settling enforcement actions.[94] Hawaiʻi’s Office of Consumer Protection would enforce that state’s privacy law, but the Office has no history of litigating privacy enforcement actions.[95]

    This is nothing new. The Administrative Conference of the United States long ago found that federal agencies “resolve the great majority of civil money penalty cases without reaching the stage of formal administrative adjudication or court collection proceeding.”[96] The U.S. Department of Justice has also noted that “even where formal proceedings are fully available,” as in the case of most new privacy proposals, “informal procedures constitute the vast bulk of administrative adjudication and are truly the lifeblood of the administrative process.”[97] Scholars have found the same to be true of the FTC and state AGs in the last forty years.[98] Codifying the availability of enforcement actions will do nothing to change this practice and may even entrench it further.

    3. Industry as Self-Regulator

    In addition to tinkering with corporate notices, regulators now require companies to develop internal organizational structures for data governance.[99] This began in the United States in 2011, when Google agreed to establish a “comprehensive privacy program” designed to assess the privacy risks of new products and to protect the privacy of collected information.[100] This requirement then became the norm.[101] State AGs soon followed suit. After Google collected data from unsecured wireless networks through its Street View cars, thirty-eight states and the District of Columbia pushed the company to agree to build a privacy program, designate a privacy coordinator, train employees, and create new internal policies and procedures on privacy practices.[102] In People v. Payday Loan Store of Illinois, the state required the company to provide employee training and adopt new internal privacy protocols.[103] Similarly, in State v. Villareal, Texas required a company to develop comprehensive security programs.[104] And in In re HealthNet, New York settled an investigation by requiring the company to train staff, develop new internal programs, and conduct security audits.[105] The list goes on.[106] Therefore, it has become routine for regulators to shift the burdens of ongoing monitoring and governance to industry itself.

    A central piece of privacy governance is the audit. Indeed, most FTC privacy consent decrees have required companies to conduct biennial “assessments” to ensure they are complying with the order.[107] Companies identify, hire, and verify the qualifications of the assessor themselves.[108] These audits are performances, and cynical ones at that. They use boilerplate language.[109] They follow a standard script: the company hires an outside assessor who comes in every two years to ask the same standard set of questions.[110] All of the questions are answered by executive attestation, meaning that an assessor concludes that a company is complying with an FTC order based solely on the assurances of corporate executives.[111] For instance, Google’s assessor found that the company’s new privacy program met FTC requirements, but only appended the company’s privacy program statement as proof.[112] Uber’s assessors did not complete an independent investigation, either; they relied solely on “data security policies” and interviews with executives to conclude that the company was meeting its requirements.[113] Therefore, assessments are little more than pre-written scripts on the front stage, complete with dialogue from defined actors and repeated over and over again like a long-running show.

    Despite this, assessments are nevertheless performative: they socially construct privacy regulation. David Vladeck, a former Director of the FTC’s Consumer Protection Bureau, has called assessments an “important” part of the FTC’s work.[114] In formal response to public comments about its 2012 settlement with Facebook, the FTC told many commentators that more robust audits were unnecessary: “The Commission believes that the biennial privacy assessments described above will provide an important means to monitor Facebook’s compliance with the order.”[115] Assessments are now routine parts of FTC practice. As such, privacy professionals and privacy lawyers expect assessments as a matter of course.[116]

    These audits have moved from practice to statutes, as well. A Minnesota proposal would require data collectors to audit their own privacy programs and those of their partners and vendors.[117] COPRA directs companies to hire an external auditor to assess their privacy practices.[118] The Mind Your Own Business Act (MYOBA) requires companies to complete annual attestations of compliance with written statements and affirmations from company executives and the CPO.[119] The Privacy Bill of Rights mandates regular audits of internal privacy and security practices, completed either internally or by an independent assessor.[120]

    These practices open doors for industry to bring its experts to the table and to influence its own regulatory context. As the next Section describes, recent proposals would also codify many other organizational practices that predate the GDPR, including internal offices, policies, and programs that document ongoing compliance.

    B. Compliance and Internal Structures

    Traditional privacy law in the United States began with regulators disclaiming any interest in privacy regulation.[121] As a result, data collectors voluntarily posted privacy and data use notices.[122] That practice was performative of privacy law: eventually, the FTC, several federal laws, and state laws like the California Online Privacy Protection Act codified these practices into law.[123] In this way, the routinized performance of writing and posting privacy policies created what policymakers and industry thought privacy law should be. A similar process is happening now, but instead of codifying mere notice, proposals for omnibus U.S. privacy laws would also codify a series of internal governance practices that some industry players developed long before the GDPR. These practices have again constructed the category of privacy law. But this time, instead of constructing a self-regulatory regime, they have built one characterized by managerialized compliance.

    1. Managerialized Compliance

    Managerialism is the “infusion of managerial or business values and ideas into law.”[124] Many recent proposals for comprehensive privacy law in the United States explicitly envision that compliance professionals—privacy professionals, privacy lawyers, and other compliance experts—will bring the law into their organizations, translate its requirements for their bosses, and implement it throughout the company. But along with that shift in responsibility comes the “reconceptualization of law so that it is more consistent with general principles of good management.”[125] Theoretically, managerialism is agnostic as to legal values; good management is not necessarily in conflict with the underlying purposes of social legislation. But managerialism does make regulated entities themselves the intermediaries between the laws on the books and the people those laws are meant to protect. That gives industry the power to define what the law means in practice.

    Managerialism can, therefore, undermine what scholars call collaborative governance. Collaborative governance is an approach to regulation that relies on a partnership between public authorities and private actors to achieve regulatory goals.[126] Collaborative governance, at its best, is “a highly tailored, site-calibrated regulatory system that aims to pull inputs from, obtain buy-in from, and affect the internal institutional structures and decision-making heuristics of the private sector” while maintaining popular legitimacy and achieving better social welfare outcomes.[127] In the privacy space, collaborative governance is meant to supplement privacy’s traditional reliance on transparency, notice, and consent.[128]

    In collaborative governance, the government plays the role of a “backdrop threat” that encourages private sector engagement, convenes regulated entities and civil society together, certifies compliance protocols, and, if necessary, enforces the law when things go awry.[129] Private actors develop the systems of compliance on their own with the government as a top-down regulator.[130] To ensure accountability, collaborative governance relies on negotiated settlements, safe harbors, codes of conduct, audits, informal delegations of interpretive authority to private actors, impact assessments, ongoing self-monitoring, and incentives for private ordering in the public interest.[131] The goal is to keep sufficient flexibility in the legal system so regulated entities will want to participate and to ensure companies do so for the public good.[132]

    Proponents see several benefits to the collaborative model. Public-private partnerships bring private sector expertise to governance, which proponents believe especially necessary in the complex and highly digitized information economy.[133] Technological development also moves fast, so the collaborative governance model offers “an ongoing, iterative system of monitoring and compliance” in place of the long, drawn-out process of administrative rulemaking.[134] The model also enhances industry buy-in and perceived legitimacy by giving regulated entities a seat at the table and enabling them to help regulators craft workable solutions.[135] In short, there are reasons collaborative governance is so popular.

    Proponents also recognize the dangers of the approach. Collaborative governance requires substantive outer limits to prevent everything—including protecting basic human rights—from boiling down to an ongoing negotiation with a profit-seeking corporation.[136] For collaborative governance to work, rights must be clearly defined and judicial review, in addition to large fines, may be necessary to constrain corporate actions at the margins.[137] As the next Section shows, scholars are right to worry. Managerialism has taken hold in practice, undermining privacy law in the process.

    2. Performative Managerial Practices

    Research conducted inside the information industry demonstrates that privacy leaders, privacy lawyers, and other professionals have long built internal corporate structures as part of their privacy compliance work. Based on interviews with several CPOs regarded as leaders in the field, Kenneth Bamberger and Deirdre Mulligan found that privacy professionals created a “company law” of privacy to fill in the gaps left open by privacy law on the books.[138] These professionals drafted internal rules for data processing in accordance with the E.U.’s 1995 Privacy Directive.[139] They also conceptualized privacy in terms of risk management and developed processes for assessing and documenting that risk.[140] Companies created new privacy offices and hired staff.[141] They started training their employees on privacy and security, designated some of them as privacy officers, and put them to work building new procedures and setting new policies.[142] Audits of privacy practices were part of the corporate routine as early as 2009.[143] This suggests that privacy practitioners were building internal organizational structures long before the GDPR.

    Those practices have socially constructed privacy law. FTC consent decrees now require companies to create a “comprehensive privacy program,”[144] which includes hiring staff, situating staff inside organizational hierarchies, completing risk analyses for new products, and developing privacy trainings.[145] Companies also have to conduct biennial assessments of that program.[146] Ten proposals for comprehensive U.S. privacy law would codify some or all of these requirements.[147]

    Indeed, new proposals would codify many of these internal corporate practices, including trainings, recordkeeping, and privacy risk assessments.[148] Four proposals state that companies must hire a CPO or designate a privacy officer.[149] Several laws require companies to create “organizational” measures, like comprehensive privacy programs, to ensure compliance.[150] Other internal governance measures include regular audits of processors, vendors, and the privacy programs themselves.[151]

    Some requirements in these proposals are new. A few proposals ask industry to develop internal processes for ensuring that third-party vendors comply with the law,[152] certify compliance with executive attestations,[153] and develop standard disclosures.[154] A bill in Minnesota would require an internal appeals process, and five other state laws require independent tests and annual impact assessments of automated processing or facial recognition.[155] MYOBA would require companies to develop an internal process to track opt-out requests of consumers with whom they are not in a direct relationship but whose data they nevertheless hold.[156] And the SAFE DATA Act calls on a “professional standards body” to write its own rules that, if followed, would constitute compliance with the law.[157] But these new requirements are in line with the old. For some time, privacy law has relied on internal corporate governance structures for ongoing monitoring and compliance. Bamberger and Mulligan found that those practices have normalized themselves, and now, new privacy statutes are based on them. That is the essence of performativity: legal categories defined by behaviors on the ground that express what privacy law is and should be.

    C. Exercising Rights of Control

    Privacy law has always centered the idea of control: notices and consent privileges help people “make decisions about how to manage their data.”[158] As a result, Daniel Solove characterized traditional privacy law’s notice-and-consent regime as “privacy self-management,” involving “the various decisions people must make about their privacy and the tasks people are given . . . to do regarding their privacy, such as reading privacy policies, opting out, changing privacy settings, and so on.”[159] These tasks are performances: toggling consents, clicking “agree” buttons, and confirming or rejecting cookie requests. As the philosopher Gordon Hull has argued, the routinization of these practices has inured us to the idea that privacy self-management is privacy law.[160] Recent privacy law proposals in the United States reflect as much. They may add additional rights of control, but they follow the same script: we have to navigate our own privacy through clicks and consents on digital platforms themselves.

    1. Discourses of Control

    Industry almost exclusively uses the discourse of control when its representatives talk about their privacy work. Although research into nonexpert visions of privacy suggests that we think about privacy in many different ways, many of those lay conceptions echo notions of control as well.[161] These discourses are pervasive and routinized pieces of the information economy.

    Mark Zuckerberg used the word “control” forty-nine times in one Senate hearing to refer to Facebook’s privacy work.[162] In 2020, Zuckerberg said the company changed its platform “to protect user privacy and give people more control.”[163] At a 2019 hearing before the Senate Commerce Committee, Jon Leibowitz testified that the “framework” for a federal privacy law should give “consumers more control over their data.”[164] His proposals called for giving consumers “statutory rights to control how their personal information is used and shared,” and “promot[ing] consumer control and choice by imposing requirements for obtaining meaningful consent.”[165] Michael Beckerman, the President and CEO of the Big Tech-funded Internet Association, said that people should have access to and control of their data.[166] Beckerman suggested that legislation should “empower[] people to better understand and control how personal information they share is collected, used” and should include “the development of tools to give users more control over their personal information.”[167] In 2018, Bud Tribble, then-Vice President for Software Technology at Apple, and Rachel Welch, Senior Vice President for Policy and External Affairs at Charter Communications, made similar comments.[168]

    Sundar Pichai, CEO of Alphabet, Google’s parent company, has said that he “always believed that privacy is a universal right and . . . Google is committed to keeping your information safe . . . [and] putting you in control.”[169] The Engine Advocacy and Research Foundation—a lobbying group funded by Google, but claiming to be a voice for entrepreneurs—told Congress to pass a “robust” federal privacy law that “provide[s] transparency, control, and user choice.”[170] The National Association of Realtors wants the same.[171] Keith Enright, Google’s then-Chief Privacy Officer, told a Senate committee in 2018 that Google’s “key elements” for any privacy discussion are “transparency, control, portability, and security.”[172] Executives at Twitter repeated the privacy-as-control discourse, noting that “privacy” means the company “should be transparent about, and provide meaningful control over what data is being collected, how it is used, and when it is shared.”[173] All in all, in more than fifteen hearings between 2015 and 2020 before the Senate Commerce Committee alone, information industry executives pushed the discourse of privacy-as-control every single time.

    Control also permeates popular conceptions of privacy. When asked to illustrate their mental frames about privacy through drawing and art, many participants in a Carnegie Mellon study drew images of control levers and wrote captions about the “right to control” or to “choose” what things in a wallet to share with others.[174] And more than half of the individuals included in a study about privacy in densely populated areas defined privacy as either the “ability/power to control access to some thing, place, or piece of information and its dissemination” or “the freedom to do/live/make decisions,” both of which are based on control.[175]

    2. Privacy-as-Control as Performative

    The pervasive and widespread assumption that privacy is about control over data parallels pervasive and widespread practices of privacy-as-control. We read privacy policies, consent to data tracking on a website-by-website basis, click buttons to opt out of certain information processing, and otherwise exercise personal agency over our information.[176] These practices have been around for decades, and their repetition has a habituating effect. Gordon Hull suggested that repeating self-governance practices normalizes surveillance and habituates us into thinking that privacy law’s responsibilities fall to us.[177] Websites and apps deploying rights of control “present[] an information environment in which individuals see themselves as functioning autonomously.”[178] We act as if we are in control when we click “accept” or “agree,” or when we exercise our right to correct or opt out of data collection. And every time we do so, we are inculcated with the belief that these behaviors—the scaled detritus of privacy-as-control—are privacy law.

    Like Austin and Butler, Michel Foucault thought that our actions do not just achieve their immediate effects.[179] That is, clicking “agree” does more than just grant access to a platform. The behavior’s routinization and repetition have normalizing effects, making it seem ordinary and commonsense. Our actions “establish[] . . . a moral conduct that commits an individual, not only to other activities always in conformity with values and rules” associated with those actions, “but to a certain mode of being, a mode of being characteristic of the ethical subject.”[180] Put another way, exercises of rights of control are repeated actions that socially construct our perception of what privacy law is and should be—for example, that self-navigation is the normal, commonsense thing to do. In this way, privacy law’s rights of control are performative because our exercises of those rights create a legal regime of individual rights. It should be no surprise, then, that most recent privacy proposals all guarantee similar individual rights of control.

    These proposals include the right to access data about us,[181] have our data deleted,[182] and opt out of tracking.[183] Some statutes guarantee a right to correct inaccurate or outdated data.[184] Some include the right to move data from one company to another, known as the right to portability.[185] Several proposals guarantee a right to restrict data processing.[186] The proposed New York Privacy Act would give citizens a right against purely algorithmic or automated decisions about their lives.[187] And the Data Accountability and Transparency Act, or DATA Act, guarantees individuals a right to request human review of automated decision-making systems.[188]

    Notably, some new privacy laws build on the notice-and-consent paradigm. Almost all state and federal proposals in the United States are opt-out regimes, which means that data collection and processing are presumed lawful unless individuals affirmatively withdraw their consent. Some proposals go further, doubling down on the power of consent. For instance, two proposals in Arizona would let technology companies sell customer data, avoid all restrictions on processing data about adults, and make decisions based on consumer profiling if they obtain consent.[189] Two proposals introduced in the Illinois Senate would allow companies to skirt limits on processing sensitive data, even processing that posed a significant risk to privacy, if they obtain consent.[190] And Maine’s privacy law, enacted in 2019, lifts all restrictions on the use, disclosure, and sale of, and third-party access to, personal information if companies obtain consent.[191]

    Therefore, when viewed from the perspective of social practice, many recent privacy proposals in the United States reflect long-standing privacy-as-control discourses and practices. Even the rights themselves are not that new. The 1973 federal report from the Department of Health, Education, and Welfare (HEW), which gave rise to early privacy law’s notice-and-consent performances, also called for rights of access and deletion, among other individual rights.[192] The repetition of those discourses and practices has had a performative effect—routinized privacy practices have become privacy law.

    D. The Emergent Law of Privacy

    Scholars trying to understand the evolution of privacy law have elided this point: routinized privacy practices have become privacy law. Anu Bradford suggested that a “Brussels Effect” would make all privacy laws accord with those of the E.U.[193] Bradford predicted that multinational companies would voluntarily adopt E.U. rules, in part, because of the E.U.’s unique combination of market power and regulatory capacity.[194] And since data flows are difficult to constrain within political boundaries, Bradford reasoned that companies in the information industry would be uniquely susceptible to the E.U.’s regulatory power.[195] E.U. law also bans data transfers from the E.U. to other countries if those countries do not have “adequate” data protection laws.[196] Therefore, Bradford predicted that industry and governments would strengthen their practices to meet E.U. demands.[197] However, Anupam Chander, Margot Kaminski, and William McGeveran rightly noted that the E.U. has had a privacy law for decades—the E.U. Privacy Directive, which went into effect in 1995—and that it did not spur Congress or the states to act.[198] They suggested that it was the legal entrepreneurship of leading privacy advocates in California, who took advantage of that state’s unique law-making process, that catalyzed the explosion of recent privacy proposals in the United States.[199]

    Implicit in Bradford’s argument is a formalistic distinction between law and society. Bradford looked to a law-on-the-books catalyst for other laws on the books, conceptualizing law as an autonomous institution off on its own. But sociolegal scholars and the Legal Realists have taught us otherwise.[200] In their view, the relationship between law and society is a reciprocal one, and one famously ignored by the legal formalists of an antiquated age.[201] Law reflects and influences social change, whether it be changes in the family or shifts to an industrial or information economy.[202] To think the law is only influenced by other law is to ignore society’s role.

    There are other limitations to the conventional wisdom’s focus on the GDPR’s or the CCPA’s influence. Some scholars put considerable faith in the norm entrepreneurship of a small group of privacy advocates who forced the California legislature’s hand in 2018, but those accounts neglect what companies were already doing internally by that time.[203] These scholars recognize that the rights/compliance model was not invented by the CCPA, but insufficiently account for how that makes the narrative more complex. Privacy law-as-compliance in the United States dates as far back as 2011, when the FTC first required Google to develop a “comprehensive privacy program.”[204] Individual rights to access data, restrict its processing, and correct it are even older; they were part of the original Code of Fair Information Practice recommended by HEW in 1973.[205] As such, they predate every E.U. privacy law—and even the E.U. itself.[206]

    In addition, neither the formalist nor the realist account explains why policymakers and advocates agreed on these proposals. Chander, Kaminski, and McGeveran did not characterize recent privacy proposals as a mix of rights and compliance, instead seeing them as primarily rights-based.[207] However, as discussed above, compliance is a critical piece of these proposals when viewed from the perspective of practice.[208] Even a proposal that simply allows individuals to access and delete their data requires companies to create internal processes to intake, assess, respond to, and implement those requests. Moreover, E.U. regulators have made it clear that there is no single path to adequacy.[209] And yet, U.S. lawmakers have chosen only one set of practices. They could have gone further and imposed substantive limits on data collection that would also win an adequacy determination. They could have taken Woodrow Hartzog’s advice and used various legal tools to ensure that privacy protections and anti-manipulation guarantees are designed into new technology products.[210] They did not; they all chose individual rights and compliance-based governance.

    Perhaps policymakers are risk averse or lack imagination.[211] Perhaps we are all steeped in the same governing discourses that define how we think about privacy, leading policymakers to adopt similar ideas that do not upset traditional structures of power.[212] Political scientists might explain the similarities by pointing to the Overton Window, or the theory that only a small set of policy options is acceptable in any given political moment.[213]

    But Overton Windows move. Discourses are challenged and replaced. And yet, recent privacy law proposals codify roughly the same social practices: they envision collaborative regulators, internal corporate compliance structures, and a series of rights to privacy self-management. This Section has shown why. Recent privacy proposals follow a rights/compliance approach because long-standing practices—industry input in regulations, settlements and consent decrees, self-audits, PIAs, recordkeeping, codes of conduct, privacy offices, and privacy self-navigation—socially constructed privacy law from the ground up. Most state and federal proposals would codify social practices of privacy that regulators, industry, and individuals have been engaged in for some time, long before the GDPR and the CCPA. The repetition of these performances may have normalized them, acculturating stakeholders to think that this is what privacy law is and should be. Policymakers could not think of other options because performances of privacy on the ground had already created the category of privacy law for them. And that, as the next Section indicates, is a problem.

    III. The Dangers of a Rights/Compliance Approach

    I have suggested that recent privacy proposals in the United States look the way they do because long-standing corporate, regulatory, and self-management performances have socially constructed privacy law in our legal consciousness. In this Section, I make a normative claim: the performativity of rights/compliance practices demonstrates why the approach is unlikely to achieve stronger privacy protections for individuals and is incapable of addressing informational capitalism’s structural asymmetries and discriminatory harms. The following Sections discuss two clusters of reasons for this: one based on the individual rights model and the other based on the compliance model.

    A. The Misplaced Individual Rights Model

    The practices associated with individual rights of control seem empowering: we can click on links to ask that our data be deleted, corrected, and moved. But although more control sounds like a good thing, individual rights will not solve collective privacy problems.[214] Habituating ourselves to the fiction that we, as individual users, are truly capable of managing our privacy online is precisely what the information industry wants. This is in no small part because this fiction allows technology companies to weaponize our exercise of those individual rights to immunize themselves from responsibility and accountability.

    1. Insufficiencies of Individual Rights Discourse

    The discourse of individual rights is dangerous if the law’s goal is to provide substantive privacy protections in the information economy. Granted, early privacy law and scholarship focused on individual rights.[215] But that narrow conception inadequately appreciates the privacy concerns inherent in the advertising-based business models of data-extractive capitalism. Social surveillance, for example, undermines our ability to think independently, eviscerates our autonomy, and turns everyday practices into information-sharing events.[216] Privacy also serves collective ends: protecting community, enhancing democracy, increasing solidarity, and ensuring ongoing social interaction.[217] For instance, as Robert Post argued, privacy is meant to “safeguard[] rules of civility,” rather than any individual right against eavesdropping or snooping.[218] And Julie Cohen has demonstrated that privacy is about establishing the parameters of social space in ways that make continued interaction with others possible.[219]

    Just as privacy is inherently a social construct, the harms of data-extractive capitalism are also social. For example, data processing abets the entrenchment of traditional power structures and social and economic inequality.[220] Data-driven technologies routinely discriminate against persons of color, contributing to both higher rates of incarceration and glaring incidents of unjust deprivations of liberty.[221] And studies have found that information products have been used to take away welfare benefits from the poor, separate immigrant families, and subordinate women as victims of sexploitation.[222] Technology directly shapes collective lives and is deeply embedded in institutions that are structured to reinforce race, gender, and sexual orientation discrimination.[223]

    Learning those lessons, Salomé Viljoen argued that the information economy has a “sociality problem” in which individual rights ostensibly allow us to regulate “vertical” relationships with platforms, but cannot address the “horizontal” relationships among individuals who share the same relevant characteristics.[224] Because the information industry’s business model is dedicated to “deriving [] population-level insights [from] data subjects” that are then applied to individuals who share those characteristics through design nudges, behavioral advertising, and political microtargeting, what we share affects how others who are like us are treated.[225] That is, by merely using technologies that track and extract data from us, we become unwitting accomplices in the process through which industry translates our behavior into designs, technologies, and patterns that shape and manipulate everyone else. Abetting this system is a precondition of participation in the information age. For Viljoen, then, the information economy’s core evil is that it conscripts us all in a project of mass subordination that is—not so incidentally—making a few people very rich.[226] Even at their best, individual rights that only govern vertical relationships are insufficient to address or ameliorate that kind of subordination.[227]

    But discourse is only the beginning. Codifying these practices of privacy self-management directly undermines privacy protections. That may sound counterintuitive. In truth, individual rights to data can be weaponized by industry to erode privacy protections wholesale.

    Privacy law’s individual rights approach is based on the presumption that individuals have sufficient power and agency to exercise those rights autonomously and in accordance with their preferences. We do not.[228] Rather, the problem runs deeper. Reifying that assumption allows industry to weaponize individual rights—particularly the right to consent—against our privacy, undermining everyone’s ability to exercise rights of control in the first place.

    Traditionally, consent was the shibboleth of privacy law.[229] Proponents of the rights/compliance model make much of the fact that laws like the GDPR are not consent-based regimes.[230] They are correct, but only to a point. Consent is not the only justification for processing personal data under the GDPR. Even when processing is pursuant to user consent, the individual rights and compliance requirements are supposed to remain in place.[231] But even these commentators acknowledge that individual consent is one of the two most common justifications for data collection under the GDPR.[232] And yet, as scholars have shown, wherever consent is operable in the information economy, it is both a weapon of data extraction and a shield against accountability.[233]

    On the misleading premise that individuals are capable of making their own informed choices about what they share and with whom they share it, industry weaponizes consent in ways that make other individual rights of control mostly meaningless. In 2019, for instance, while Facebook was trying to dismiss a lawsuit for the company’s failure to stop Cambridge Analytica from unlawfully mining user data, the company’s attorney told Judge Vince Chhabria that “[t]here is no privacy interest” in any information Facebook has.[234] Users “consent[ed]” to the terms of service and engaged in “an affirmative social act to publish,” which “under centuries of common law, . . . negated any reasonable expectation of privacy.”[235] When the judge asked if it would be an invasion of privacy for Facebook to break a promise not to share an individual’s information with third parties, Facebook’s counsel claimed that “Facebook does not consider that to be actionable,” citing user behavior and consents as evidence that users had given up control of their data.[236] In its briefing, the company went even further, arguing that because individuals “can control how” their content is shared, anything they then share is ripe for use by Facebook and third parties.[237]

    In Campbell v. Facebook,[238] a case about Facebook’s practice of scanning users’ private messages to collect data for behavioral advertising, Facebook argued that users lacked standing to challenge any Facebook data practice because they “consented to the uses of . . . data.”[239] In Smith v. Facebook,[240] the company made the same argument, asserting that Facebook should be allowed to track users wherever they go on the Internet because users “are bound by their consent to those policies.”[241] And in In re Google, Inc. Cookie Placement Consumer Privacy Litigation,[242] Google moved to dismiss all claims pertaining to the unauthorized use of cookie tracking and the unlawful interception of user data by arguing that “both Plaintiffs and the websites they communicated with provided their consent for Google . . . when they sent a GET request . . . so that they could browse websites containing Google ads.”[243] In other words, Google claimed that merely browsing websites containing Google ads is tantamount to consenting to all of Google’s data use practices, putting the burden of any consequences on the individual user.

    Similarly, in Patel v. Facebook,[244] which challenged the company’s collection and use of biometric information, Facebook argued that no plaintiff could ever successfully bring a lawsuit against the company for use of any kind of information, let alone biometric data, because “plaintiffs knew exactly what data Facebook was collecting, for what purpose, and how to opt out of Tag Suggestions.”[245] Facebook suggested that this immunity was so broad that it held up even if the company’s notices were not sufficiently specific.[246] Facebook reasoned that since users consented to all data collection practices when they signed up for accounts, and since privacy law only requires choice, consent, and control, users who signed up but never opted out had given up their rights to their data.[247]

    Facebook has even argued that its own privacy promises are meaningless because it has the power to define the rights of its customers. For example, in ongoing litigation, Facebook has argued that its promise to remove cookies that identify a particular user’s account was not a “promise[] not to record the communication[]” and that promises of anonymity do not create expectations of privacy.[248] In the same case, Facebook argued that all user information available to Facebook—including every website users visit—is “voluntarily disclosed.”[249] It is easy to imagine the company invoking individuals’ supposed ability to freely exercise their rights of access, deletion, correction, and opt-out in order to hold users responsible for all data use practices that result.

    Notably, at all times during the five years in which Facebook’s and Google’s lawyers made these arguments, both companies had privacy-focused internal organizational structures in place.[250] Both companies had long been operating under FTC consent decrees that required, among other things, comprehensive privacy programs.[251] Both companies also claimed to be compliant with the GDPR as of 2018,[252] a year before Patel and two years before Facebook argued that the only way its users could expect privacy on the Internet was if they used a Virtual Private Network, or VPN.[253] Therefore, Facebook and Google demonstrate that having compliance systems in place and other rights available does not stop the companies from engaging in legal practices that erode privacy rights for users. Performative rights-based practices allowed these companies’ lawyers to conceptualize privacy law in a way that enables industry to “take[] refuge” in consent’s attendant immunity.[254]

    B. The Problem of Compliance

    A second category of structural weaknesses in recent privacy proposals stems from their codification of performative compliance practices. The following Sections identify three of those weaknesses. First, the law’s endogenous construction from corporate performances on the ground suggests that managerialized compliance will be dominated by the practices of industry leaders, which may conscript the law in favor of monopolists’ anticompetitive behavior. Second, the reliance on procedure elides substantive injustice below the surface and, therefore, leaves in place the inequities of data-extractive capitalism. Finally, and perhaps most importantly, the practical application of compliance-based governance is internally inconsistent, performatively creating public institutions that are incapable of holding industry accountable.

    1. Dominant Practices and Underinclusive Law

    The performativity of privacy law practices means that the law may be constructed by the repeated practices of the most dominant actors—namely, those with money, power, and the risk tolerance that comes with both. There are several reasons for this. These companies’ wealth, status, and market share allow them to take on greater litigation risks than their smaller competitors.[255] As such, dominant companies can afford to act first, and establish new compliance practices without clear guidance from regulators, just as envisioned by compliance-based governance. And perhaps because smaller competitors cannot afford the risks of investigation and litigation that come with improper compliance practices, industry standards and customs will coalesce around the performances of dominant players.[256]

    This coalescing behind the practices of the most dominant actors also happens organically. In many industries, professionals share their experiences and advice through formal outlets—namely, industry conferences, convenings, and publications, where the views of industry leaders are usually of keen interest to the rank-and-file.[257] The privacy industry has several large networking conferences, including those hosted by the International Association of Privacy Professionals (IAPP), which attract thousands of attendees worldwide, and the Privacy+Security Forum, which happens twice a year and brings together hundreds of professionals for panels, networking, and idea exchange.[258] Researchers have also shown that privacy professionals take advantage of their overlapping social networks to learn from colleagues at leading companies.[259] This effectively spreads the compliance performances of a small subset of industry actors across the field, reinforcing privacy law “isomorphism.”[260] Therefore, the most powerful corporations are able to entrench their compliance practices in the same way that a first entrant can claim a monopolistic position in a market.[261]

    Wealthier companies also have the resources to build larger in-house privacy departments that can dedicate time, money, and labor to compliance practices.[262] They can even offer compliance support to their customers.[263] By contrast, smaller companies are forced to outsource more of their compliance to privacy technology vendors, many of which make dubious claims about proprietary automated systems that purport to achieve compliance with pre-filled documents and paper trails.[264] Therefore, a long list of performative compliance practices comes almost exclusively from the internal processes of companies that can afford to develop them.

    Dominant companies also have more influence over regulators and regulations. In addition to their multi-billion-dollar direct lobbying campaigns aimed at weakening privacy law,[265] the wealthiest technology companies have funded several trade organizations to research and publish policy white papers that reflect their interests.[266] Plus, representatives from the most powerful technology companies have been the most common invitees at congressional hearings on privacy.[267] And, given the revolving door between government service and lucrative positions representing technology companies, regulators have a serious incentive to develop stronger relationships with companies like Facebook and Google than with their far smaller competitors.[268]

    This is not merely a theoretical possibility. It is, in fact, precisely how many interactions play out between regulators and industry. The FTC routinely cites the views of the information industry’s largest players in its staff reports. For example, the FTC relied on statements from Google’s Director of Public Policy when it emphasized transparency and control in its mobile privacy guidance.[269] The report followed from the advice of the Retail Industry Leaders Association and the App Association, an industry trade organization funded by wealthy software development interests that calls for “limited government involvement in technology.”[270] The FTC also explicitly endorsed Facebook’s, Apple’s, and Google’s use of icons to communicate privacy information.[271] It adopted industry’s recommendation for self-regulation and an opt-out “Do Not Track” mechanism.[272] And regulators sided with leading technology companies to support self-regulation of the “Internet of Things.”[273] It stands to reason that these powerful interests will also have an advantage when they seek to certify their compliance practices and have their versions of best practices adopted as the industry standard.

    Therefore, wealthy corporations’ performances are more likely to construct the compliance landscape. But what is good for a monopolist is not usually good for society.[274] Entrenched powers have an interest in cementing their market positions, and many have used the law to do so.[275] Performative compliance practices can do the same.

    2. Procedures and Substantive Injustice

    The enforcement toolkit in recent U.S. privacy proposals is largely procedural: impact assessments, privacy officers, and internal policies. That means that laws will rely on internal organizational structures to protect the individual rights guaranteed on the face of the laws.[276]

    But these legitimizing procedures disaggregate legitimacy from substantive justice. Procedures offer “no framework for thinking systematically about the interrelationships between political and economic power.”[277] They replace the “political judgment” of traditional regulation and government intervention with “technical management” of the market, thereby leaving unanswered and unresolved vexing questions of inequality, subordination, manipulation, and asymmetrical power.[278] After all, data can be a tool of oppression, whether it is exploited to train totalitarian facial recognition models, surveil protestors, incarcerate people, or subjugate vulnerable populations.[279] For those people society pushes to the margins, privacy is particularly important and data extraction is particularly dangerous. Disclosures, data breaches, and industry negligence with pornography sites, WiFi-enabled sex toys, and femtech products undermine a core human right of sexual privacy for everyone, but the people who are most hurt by such privacy breaches are also the most marginalized in society.[280] Compliance practices do little to ameliorate or stop these harms other than to encourage companies to put their policies down on paper. There are, however, some practices that no amount of procedural due process can fix.[281]

    Worse yet, focusing on procedural safeguards may discourage policymakers from taking more robust actions. As Paul Butler argued in the context of the right to counsel, guaranteeing a procedural right—in that case, providing lawyers to indigent defendants—obscured the fact that the criminal justice and carceral systems are systemically racist and unjust to the poor.[282] Process, Butler argued, “invest[ed] the criminal justice system with a veneer” of legitimacy and discouraged reformers from digging any deeper.[283] Compliance practices open up privacy law to the same problem. Compliance governance only tries to address certain problems, such as the need to integrate privacy into every step of the design process, the complexity of the technology, and the rapid pace of development. Even at its best, compliance-based governance ignores more structural questions of power, justice, and human flourishing.

    3. Undermining the Public-Private Partnership

    For it to work the way it is supposed to, compliance-based governance assumes that regulators’ toolkits and expertise are insufficient.[284] A traditional regulator might use a command-and-control approach where the state can ban products outright, place limits on behaviors, and hold industry accountable through court orders and litigated claims.[285] But a compliance-based model, where industry is responsible for its own ongoing monitoring, suggests this approach is ineffectual and limited. The private sector, proponents say, has technical expertise that government does not.[286] A command-and-control approach also raises a “pacing problem” where top-down regulation cannot keep up with fast-changing technologies.[287] Therefore, compliance-based governance purports to bring “private sector expertise in[to] governance.”[288] It is also supposed to bring new enforcement mechanisms to regulators’ command-and-control toolkits of rules and government enforcement agents.[289] The compliance model implies that if toolkits were sufficient, there would be no need for the nimbleness, flexibility, and speed—not to mention the input and expertise from private industry—that compliance-based governance brings to the information economy.

    Compliance-based practices—impact assessments, compliance structures, self-audits and self-assessments, codes of conduct, industry self-certifications, settlements, and consultations—are performative because they construct regulatory institutions that require those practices. The expectation that industry will bring its own experts to the table disincentivizes the government from funding the FTC’s own experts. If regulated entities are hiring assessors and conducting audits by executive attestation on their own, the FTC does not need its own army of auditors and monitors to do the same job. And if most cases settle, Congress has an excuse to withhold the funding and staffing the FTC might need to litigate more claims. By making industry a partner in regulation, the compliance model explicitly and intentionally redistributes regulatory duties, relieving government of burdens, but also normalizing the idea that government does not and should not have to perform traditional regulatory responsibilities. Industry is there to help.

    Many other legal institutions are transforming themselves in the image of the compliance model. Industry input is engrained in modern environmental, health, and safety law,[290] with regulators often considering market costs in regulatory decision-making.[291] Financial regulation in the wake of the 2008 Financial Crisis relies on audits, independent committees, and other internal structures that amount to outsourcing regulation to regulated entities themselves.[292] Compliance-based regulation and managerialization have similarly expanded the importance of employer-friendly arbitration and played a crucial role in justifying forced arbitration clauses in employment contracts.[293] And, as Lauren Edelman has shown, the corporate practices associated with Title VII—policy statements, diversity offices, bias training, and internal appeals—have performatively constructed what courts perceive anti-discrimination law to be.[294]

    Scholars have argued that this kind of hollowing out of traditional regulatory functions is the product of neoliberal hegemony.[295] That is undoubtedly true. Procedural governance in environmental, health, and financial regulation may also reflect the performativity of compliance practices on the ground. Put another way, we have come to expect that regulation is a public-private partnership in which industry manages much of its own compliance. Therefore, the compliance model has created legal institutions in its own image.

    But this erosion of public institutional power undermines the very mechanisms that are supposed to help compliance-based governance guard against its own devolution into regulatory capture and self-regulation. As the compliance model’s proponents concede, compliance-based governance is subject to the risk of capture, because regulated companies themselves are creating compliance tools and participating in their own regulation.[296] Accordingly, effective governance presupposes the existence of a robust and effective regulator that is capable and prepared to act as a “backdrop threat” to ensure that industry is an honest partner as it works with public institutions to achieve social goals.[297] But, as noted above, one of the performative aspects of the model is the construction of public regulatory institutions that depend on industry expertise, input, capital, and workers to fulfill regulatory responsibilities. This dependence not only creates managerialized public institutions, but it also weakens the ability of government regulators to adequately function as enforcers ready to bring down the hammer of command-and-control if industry’s compliance programs fail to rein in data-extractive practices.

    Indeed, regulatory agencies tethered to industry in this way are far more likely to emerge than proponents suggest. When scholars describe the compliance model’s diverse toolkit—from impact assessments to trainings and audits—they again make the epistemic error of considering the toolkit in a vacuum, divorced from the social context in which that toolkit is used. But compliance practices are not theories; they operate within organizational bureaucracies created to routinize productivity and profit.[298] Those bureaucracies can subordinate privacy structures to undermine accountability in any number of ways. Many companies push their CPOs down the corporate hierarchy or subordinate them within risk management or compliance departments, forcing privacy to fight within systems focused on achieving substantially different goals.[299] Companies also shift control of privacy budgets to legal, compliance, or technology departments.[300] They also sideline privacy work. In self-reported surveys, privacy leaders report the greatest control over trainings, drafting policies, publications, communications, and travel, but far less responsibility for the practices that really matter in compliance-based governance: audits, data inventory, and technology.[301] Management also creates siloed privacy departments that appear robust, but have little impact on the company’s work.[302] Therefore, privacy law’s reliance on privacy professionals—even those who consider themselves privacy advocates—doing work in the public interest is misplaced. Companies are already exercising their financial and structural power to co-opt internal privacy advocates and turn their efforts away from meaningful privacy work.[303]

    The information industry also routinely fires dissident employees, creating a chilling effect on others trying to push against the data-extractive tide. In August 2020, for example, BuzzFeed reported that Facebook fired a senior engineer for collecting evidence showing the company gave preferential treatment to conservative accounts.[304] Another Facebook employee who gathered evidence that the social network protected right-wing websites from the company’s policies on misinformation had their internal access revoked, as well.[305] Google took the same approach to its employees who blew the whistle on the company’s efforts to suppress unionization, its cozy relationship with outside advisers with long histories of homophobic and racist comments, and its entanglement with Customs and Border Protection.[306] Google even fired the prominent AI researcher Timnit Gebru for trying to publish a paper on language algorithms that threatened the company’s bottom line.[307] This job insecurity has a chilling effect on tech-sector managers, dissuading them from speaking privacy truths to data-extractive power.[308]

    These constraints—weakened privacy offices, precarity of employment, and siloization—weaken privacy law, whether alone or in concert. Privacy departments that are siloed, starved for cash, and organizationally subservient to business units with independent or contrary interests have weaker voices in making policy. When advocates for accountability are fired, others may go silent. As a result, corporate obligations are framed in terms dictated by more powerful organizational actors, whether that is the general counsel, whose job it is to minimize legal risks to the company, or the vice president for technology, whose job it is to define the technical aspects of corporate practice. Neither of these actors is necessarily an active and overt anti-privacy voice. But the perspectives, goals, and metrics on which they are judged by their company are orthogonal to privacy and far more managerial. This makes it more likely that internal compliance practices will be framed and cabined to serve corporate interests rather than social and policy goals.

    This creates a downward spiral. Compliance governance practices hollow out regulatory institutions by normalizing the expectation that industry will fill in gaps left open by underfunded, slow-moving, and untrained public regulators. At the same time, those practices rely on internal corporate structures that are not independent of industry, but rather are entirely controlled by, and subordinate to, industry bureaucracies that can easily game the system. In this world, there are no honest partners and no backdrop threats. There is only self-regulation and symbolic compliance.

    IV. A Framework for Resistance

    Privacy law’s social practices, including many that long predate the GDPR and the CCPA, should be understood as expressive performances that have socially constructed what we think privacy law is and should be.[309] Surfacing the performative aspects of privacy law practices may help explain why so many recent privacy proposals look so similar and why most of them will likely prove ineffective at protecting our privacy. Ever since the FTC started requiring privacy offices and programs alongside notice-and-consent, internal compliance and self-governance have socially constructed the category of privacy law and crowded out other options. But this status quo is insufficient to adequately serve privacy interests. It is too reliant on corporate goodwill and destructive to public governance. It is susceptible to gaming and internally inconsistent. And it perpetuates a misleading vision of the autonomous capacities of individual subjects to protect their privacy in a data-extractive economy. The world it creates is detrimental to privacy.

    This Section suggests a radical alternative. Because it is difficult to escape the normalizing capacities of performative practices,[310] this Section provides a framework for thinking about, and developing, new discursive and behavioral performances that destabilize existing routines and generate democratic institutions of privacy governance.[311] By democratic, I mean that the information economy should be accountable “to those who live” within it.[312] And we should recognize that privacy law is not simply an exogenous institution that sets rules of the game for data use. As the economist Robert Hale noted, “the law confers on each person a wholly unique set of liberties with regard to the use of material goods and imposes on each person a unique set of restrictions with regard thereto.”[313] In other words, the legal constructions of informational capitalism allocate market power, choosing winners and losers along the way.[314] By disclaiming any interest in the substantive rights and burdens of the information economy, the rights/compliance model has chosen industry over individuals, market actors over market subjects, and capital over consumers.

    Following the work of the social philosopher André Gorz, I propose alternative performances focused not on protecting the current “needs, criteria, and rationales” of informational capitalism, but rather on “what should be,” and the “fundamental political and economic changes” needed to turn what ought to be into what is.[315] In other words, I propose a series of “non-reformist reforms,” or non-reformist performances: practices on the ground that aim not at mere tinkering with the rights/compliance model, but at fundamentally transforming the relationship between individuals and technology companies.[316] I also borrow from the law and political economy literature to define this new framework in terms of three overlapping values: power, equality, and democracy.[317] Notably, there is no magic bullet or single set of proposals that will inevitably move us toward a fairer future. Progress is contingent, halting, and uncertain. But we must start somewhere.

    A. Non-Reformist Performances

    Gorz saw non-reformist reforms as a way to build a better world today, while preparing for the world we want tomorrow. Reforms are “non-reformist” when they help bring about radical change.[318] Popular social movements could wait for structures of oppression to collapse under their own contradictions, shying away from incremental reforms within current systems of power for fear of legitimizing the systems and delaying real social transformation. Or, they could build both better lives and greater consciousness for the people along the way to structural change. Non-reformist reforms do the latter.

    Amna Akbar’s three essential characteristics of non-reformist reforms explain how to achieve this better world.[319] First, non-reformist reforms are never end goals; they are means to a transformative future. They are based not on a technocrat’s assessment of what industry, or those in power, think is possible under the current regime. Rather, non-reformist reforms are meant to take us closer to what should be possible.[320] Second, non-reformist reforms are always pathways for “building ever-growing organized popular power.”[321] This is as much about process as it is about substance. Non-reformist reforms come from social movements fighting for them, rather than being meted out by those in power.[322] The latter strengthens the system that disenfranchises social movements and ordinary people, while the former recenters power. Finally, non-reformist reforms are never singular answers to discrete policy questions.[323] They always aim at building popular power and, therefore, are part of a “broader array of strategies . . . for political, economic, [and] social transformation.”[324] Non-reformist reforms are about “deepening consciousness, building independent power and membership, and expanding demands” all at the same time.[325] They are not about targeting a single issue at the expense of other social demands, values, and visions.[326]

    Consider a pay raise for union workers. A reformist reform is a raise granted by management, at their behest and by their largesse; a non-reformist reform is a raise won through struggle, protest, and activism, a process that awakens workers to their own power. A reformist reform is a raise that results when raises, however high, are the workers’ ultimate goal; a non-reformist reform is a raise that opens the door for more demands, more struggles against power, and greater consciousness among workers of the system’s subordination of its workforce. And a reformist reform is a raise that stands on its own; a non-reformist reform, by contrast, is a raise that is part of a larger ecosystem of structural change aimed at empowering workers.

    Gorz saw non-reformist reforms as ways of changing how the disempowered behave both amongst themselves and toward those in power. That focus on behavioral change surfaces a connection between non-reformist reforms and performativity. As discussed, our practices can be performative and habituating, locking the disempowered into non-existent, unsuccessful, or short-lived struggles against power. Therefore, non-reformist performances seek to empower individuals, materially improve the people’s position vis-à-vis technology companies, and raise collective consciousness about data-driven oppression and the rights/compliance model’s complicity in the public’s subordination to informational capitalism. And, like Judith Butler’s theory of performativity itself, non-reformist performances start with discourse.[327]

    B. Privacy Discourse

    The theory of privacy-as-control embedded in the rights/compliance model maintains current structures of power. That is, although it seems empowering to be told that we should have control over when, how, and to whom we disclose our information, the reality is darker. As we have seen, performing individual rights of control habituates us into a false sense of control while technology companies weaponize our exercise of individual rights to immunize themselves from legal accountability.[328] New discursive performances can begin to advance social values over industry interests and raise popular consciousness in the process.

    There is already a rich body of privacy scholarship eschewing the individual-focused discourses of control and choice. For example, some scholars talk about privacy in terms of loyalty.[329] Others argue that privacy is about the flow of information through and among social networks.[330] Helen Nissenbaum has centered privacy on “context-relative informational norms” that “govern[] the flows of personal information” in distinct social contexts, such as education, health care, and politics.[331] Julie Cohen has offered an even more robust conception of privacy. She argues that “[p]rivacy . . . protects the situated practices of boundary management through which the capacity for self-determination develops.”[332] Neil Richards argued that “privacy is about the rules governing the extent to which human information is detected, collected, used, shared, and stored and how those activities can be used to affect our lives.”[333]

    But we can go further. Although these approaches to privacy are not centered solely on the individual and, therefore, do not perpetuate the idea that privacy is something we must govern ourselves, they are still agnostic as to ends. Some privacy scholarship is taking this next step. Danielle Citron has called for giving special weight to, and protection for, sexual privacy, pushing back against corporate surveillance of our sexuality, bodies, and intimate selves.[334] Her work takes an explicitly normative turn by elevating sexual privacy as far more worthy of legal protection than the profit-making whims of a company that thinks extracting data from intimate applications and pornography websites is the path to wealth.[335] For sexual privacy, procedure is not enough. Virginia Eubanks called for special attention to protecting the privacy of those on public assistance, those in the child welfare system, and those who are unhoused.[336] Scott Skinner-Thompson argued that privacy law should adopt an anti-subordination approach that would protect the rights of the most vulnerable.[337] And Khiara Bridges argued that privacy law should address structural socioeconomic inequality.[338] We could also think about privacy as a necessary element of human flourishing, or the realization of the whole person, including our physical well-being, happiness, self-determination, and more.[339] We need to quit thinking and talking about privacy in terms of choice and control, full stop. By leveraging the performative capacities of discourse—a shift already well underway in legal academia—we can change baseline assumptions about what privacy is for.

    What if scholars and advocates started talking about privacy almost exclusively in terms of emancipation? Privacy is more than just a set of rules or a series of processes or even a set of norms. Privacy is a state of freedom from overlapping forms of subordination: corporate, institutional, and social. Privacy’s emancipatory capacities underlie Professor Citron’s call for sexual privacy, which, if fully protected, would liberate women, LGBTQ+ people, and sexual minorities from oppressive social and institutional structures.[340] Emancipation sits at the center of Salomé Viljoen’s call for democratizing data governance to liberate people from a system of datafication that enacts, reifies, and amplifies unjust and unequal social relations.[341] Scholars and advocates should adopt this language when speaking and thinking about privacy. Doing so will contribute to new ways of thinking about the role of privacy law, privacy litigation, and privacy wrongs.

    C. Power and Policy

    New discourses are important, but they only begin a process of countering privacy law’s pro-industry performances. We should think about the kind of privacy performances we want in terms of power: to whom do they allocate power, from whom do they take power, and against whom is the law weaponized? To date, privacy law discourses and behaviors have empowered industry to extract our data for profit with limited accountability. We can change that by redistributing power to the rest of us.[342]

    Redistributing power means regulators will have to undertake different performances. Instead of conceptualizing their regulatory role as one of industry partnership and occasionally requiring companies to pay compensatory fines, regulators must recognize that the data-extractive harms caused by industry are metastatic.[343] For example, Amazon agreed to pay $61.7 million in a settlement with the FTC, a number derived from adding up the precise amounts of tips the company stole from its delivery drivers over two years.[344] At less than 0.015 percent of the company’s revenue in a single year, the fine is unlikely either to have any material effect on Amazon or to deter future mischief.[345] But the harm Amazon caused to workers exceeds the lost compensation. Amazon’s growth and profit stem from a business model that places impossible demands on underpaid workers while maintaining strict surveillance of worker life. Amazon workers cannot leave their posts to use the restroom; the company pays particularly low wages.[346] An investigation into Amazon’s employment practices demonstrated that the company engages in a series of tactics, like siphoning tips, not simply to nickel-and-dime workers, but to encourage employees to leave, keeping wages down.[347] Surveillance keeps employees afraid. Stealing tips is part of a patchwork of strategies subordinating workers.[348]

    Data processing harms also metastasize for users. The FTC fined Facebook $5 billion for its role in the Cambridge Analytica scandal, but it has had little effect.[349] Individual users were subject to manipulation by Cambridge Analytica because of how social networks function, the lack of regulation over what it means to “consent” to terms of service, and the capacity of data processing to create relational harms.[350] Facebook’s fine was accompanied by marginal changes in what third-party apps can do, but the company has not changed the underlying data processing mechanisms that subjected millions of users to Cambridge Analytica’s data misuse.[351]

    Regulators need new performances, ones that regulate business models built on subordinating workers and users rather than targeting individual practices of oppression and data extraction in isolation. Regulators will then be habituated into seeing their role as working on behalf of individuals to counter corporate power. The Department of Justice (DOJ) should be empowered to hold industry executives personally liable when they lie to or mislead regulators in corporate privacy assessments.[352] In terms of new regulatory practices, many privacy advocates, and at least one current FTC Commissioner, have called on the FTC to litigate claims more often.[353] Congress must also empower the FTC to pursue more robust remedies, including disgorgement, to deter wrongful conduct by forcing defendants to give up profits derived from their illegal behavior.[354] Amazon did not just steal $61.7 million from its drivers; it also derived enormous profits from a booming delivery market during the COVID-19 pandemic in which it underpaid its workers while promising otherwise.[355] A percentage of customers likely used Amazon’s services based on that promise.[356]

    Since disgorgement of ill-gotten profits may have a stronger effect on corporate behavior, a similar model could rein in data misuse. Indeed, disgorgement need not apply only to money. Data collection feeds algorithmic processes that target individuals with advertisements; behavioral targeting, in fact, is at the core of the Internet business model. If microtargeting algorithms are the products of improper data collection, then the algorithms themselves are ill-gotten gains and should be similarly disgorged. FTC Commissioner Rebecca Slaughter has already hinted that this would be a welcome shift in regulators’ practices.[357]

    We must also redistribute power away from the information industry by facilitating critical research about data-extractive technologies. Making radical changes in trade secrecy laws is an obvious first step.[358] But given industry’s current monopoly over the raw data necessary to assess technology’s social effects, the mass unionization of technology researchers employed by industry can shift power to those seeking to pull back the veil on corporate misdeeds. Google’s summary firing of Timnit Gebru suggests that corporate-funded information research is not independent.[359] In Gebru’s situation, a union could have acted as a check against retaliation, discrimination, and demands that internal technology researchers “strike a positive tone” in their work.[360] Organized and empowered employees could push back on corporate development of technologies that harm marginalized populations. The rights/compliance model assumes that in-house compliance and privacy professionals will play the role of the privacy advocate. That is unlikely, given ordinary workplace pressures facing in-house compliance professionals.[361] A union for technology workers doing important research on information economy harms may help. In the spirit of non-reformist performances, the activism and struggles of unionization can also awaken technology company workers to their exploitation within organizational structures and their role in designing products explicitly aimed at extracting data and profits from subordinated consumers.[362]

    The rights/compliance model of governance provides “rules of the game” without committing companies or society to any particular ends.[363] A radically different approach would create performances based on the principle of equality, or the basic notion that information systems should not create or entrench “social subordination.”[364] That can start with changing how we make privacy law.

    Today, regulators and policymakers seek industry input.[365] They should instead give advocacy organizations representing marginalized populations, and not corporations, a seat at the table. Groups focused on the cyber civil rights of women, the poor, communities of color, survivors of intimate partner violence and nonconsensual pornography, sex workers, those living with disabilities, HIV+ individuals, and those who identify as LGBTQ+, among many others, may have unique perspectives on data use, its dangers, and its downstream consequences.[366] Those most likely to be subordinated by data practices should be in the room; those most likely to subordinate others should not be.[367] These groups may not always agree or have a single message, but they certainly have claims to the seats at the table that are currently given to industry by default.

    One of the results of decentering the needs of industry in privacy law is an emphasis on cyber civil rights.[368] Senator Sherrod Brown’s bill, the Data Accountability and Transparency Act (DATA) of 2020, comes closest among recent proposals to doing this. Although the draft bill retains some of the rights/compliance framework, it creates an office of civil rights that would ensure data collection and use are “fair, equitable, and nondiscriminatory.”[369] The proposal would prohibit any data aggregation that results in discrimination in housing, employment, credit, insurance, and public accommodations, or that has a disparate impact on marginalized populations.[370] It also makes it easier for victims to prove, and obtain justice for, disparate impact.[371] Of course, DATA is not immune to the problems discussed throughout this Article. But non-reformist reforms are consciously imperfect. DATA nods to the population-level harms that are endemic to business models dependent upon data-driven behavioral targeting. It is worth noting that in drafting his proposal, Senator Brown consulted exclusively with representatives of civil society and not with industry.[372] Senator Brown’s decision to focus on equality, rather than on what corporations would accept, is a welcome model for new privacy performances.

    Frank Pasquale also has a provocative proposal for “ex ante licensing of large-scale data collection . . . in jurisdictions committed to enabling democratic governance of personal data.”[373] Pasquale proposes a stricter version of Senator Brown’s DATA that would require data brokers to obtain a license from the government in order to process large data sets of personal information.[374] This proposal sounds radical, but the notion that some information is too sensitive to use for business purposes is commonplace. For instance, we criminalize the dissemination of a person’s bank account information, and universities require researchers to obtain approval before engaging in any human-subject research.[375] In other words, we place limits on gathering and sharing information about real people all the time because we are concerned about both the downstream effects and the social values that would be lost if we did not. Pasquale argued that an ex ante licensing regime would be the only way to protect the population, particularly the most marginalized, from “systematic efforts to typecast individuals, to keep them monitored in their place, or to ransack databases for ways to manipulate them.”[376] Managerialized compliance cannot do this, nor does it even try. It is content with managing data collection and trying to regulate it ex post, after it is used and after it likely has already had an effect on social life.

    D. Democracy and Protest

    Non-reformist reforms come from the people, through struggle, and via the power of social movements. Therefore, alternative privacy performances must be part of broader social movements for structural change. The rights/compliance model for privacy law is the opposite. As this Article has shown, the model is the product of the practices of industry, designed to keep corporations in power while providing the veneer of accountability to silence and demoralize advocates for change.[377] Successful non-reformist reforms hinge on people power. Real change in our relationship with the information industry can only come if we fight for it, raising consciousness about our collective power in the process.

    Informational capitalism does not make that easy. Our economy is built to hide its horrors. Industrial capitalism left visible scars: soot, illness, death. Informational capitalism leaves few visible scars on the surface, but the wounds are still deep.[378] Privacy and data breach harms are often intangible,[379] and the law’s entanglements with industry are invisible.[380] The closest social movements have come to a galvanizing ground-up campaign for data justice is “Fuck the Algorithm,” a slogan used by a group of high school students in the United Kingdom whose algorithmically determined final grades were lower than their performance suggested they should be.[381] The intangible nature of privacy harms and the entanglements of industry make it harder for privacy to galvanize popular movements for structural reform. That must change. Advocacy groups of all sorts must make privacy a centerpiece of their activist platforms. After all, privacy has always been a matter of gender equality,[382] racial justice,[383] and LGBTQ+ liberation.[384]

    That these ideas—cyber civil rights, data licensing, research funding, unionization, and “egalitarian” free speech—do not seem to fit within privacy law’s traditional purview speaks to the myopia that has characterized privacy law to date. Privacy law is not merely about data. It is also about the effects on society of data and data use. The narrowness of the rights/compliance model has benefited industry at our expense by disaggregating values from data governance. By focusing primarily on data collection and data management, traditional privacy law is siloed from the social contexts affected by data collection and use. New performances can help us find a different way.

    Conclusion

    This Article challenges a growing conventional wisdom in privacy scholarship. That narrative—namely, that the E.U.’s or California’s legal entrepreneurship explains today’s interest in privacy legislation among U.S. policymakers—is fundamentally flawed. It elides the fact that many privacy law practices codified in these new laws predate both the GDPR and the CCPA. It is also based on a misconception of law as autonomous from its social context.

    In place of this top-down narrative, this Article relies on sociological and critical studies literatures to argue that recent proposals for comprehensive privacy law adopt roughly similar rights/compliance approaches because long-standing privacy law practices are performative. The routinized performance of internal privacy offices, impact assessments, audits, record-keeping, regulatory-industry partnerships, and privacy self-management has socially constructed privacy law from the ground up. Put another way, we think privacy should look a certain way because we are accustomed to doing it that way. Unfortunately, we are acculturated to a privacy regime that actually undermines privacy. Following the emerging law and political economy research agenda, the Article proposes a framework based on principles of power, equality, and democracy. The framework offers some alternative performances that can tear down barriers to accountability, break up conventional routines, and destabilize industry’s asymmetrical power.

    This Article has also made independent contributions to legal theory and has implications beyond privacy law. To date, legal scholarship has used performativity exclusively in the traditional sense of performing identity. This Article suggests that we can also perform—and, therefore, socially construct—entire legal regimes through our actions and discourses. That has important implications for a variety of legal fields. It also gives us a path forward. Privacy advocacy may sometimes seem like tilting at windmills, but all we need are new performances. Given the centrality of privacy in today’s political debates, we now have a unique opportunity to shift the course of privacy law from its inadequate past to a new, democratic future.

    © 2022 Ari Ezra Waldman, Professor of Law & Computer Science and Faculty Director, Center for Law, Information, and Creativity, Northeastern University. Ph.D., Columbia University; J.D., Harvard Law School. This project is the third in a series of projects about the social practices of privacy law, culminating in a book: Industry Unbound: The Inside Story of Privacy, Data, and Corporate Power. This Article benefited from comments and suggestions from participants at faculty colloquia at Northeastern University School of Law, Northeastern University College of Social Sciences and Humanities, Georgetown University Law Center, the University of Colorado School of Law, Yale Law School, Washington University in St. Louis, and Harvard University. Special thanks to Sophia Baik, Ryan Calo, Danielle Citron, Julie Cohen, Yan Fang, Andrew Gilden, Jeff Gary, Woodrow Hartzog, Margot Kaminski, Cameron Kerry, Mihir Kshirsagar, Filippo Lancieri, Bill McGeveran, Laura Moy, Frank Pasquale, Jon Penney, Neil Richards, Paul Schwartz, Daniel Solove, Evan Selinger, Salomé Viljoen, and Felix Wu. Margaret Foster provided essential research assistance. I performed all errors on my own. DOI: https://doi.org/10.15779/Z38JD4PQ3D.

    1. See Consumer Data Privacy and Security Act of 2021, S. 1494, 117th Cong.; Data Care Act of 2021, S. 919, 117th Cong. (2021); Information Transparency & Personal Data Control Act, H.R. 1816, 117th Cong. (2021); Data Accountability and Transparency Act of 2020 (DATA), 116th Cong. (2020) [hereinafter DATA] (distributed as discussion draft); Setting an American Framework to Ensure Data Access, Transparency, and Accountability Act (SAFE DATA Act), S. 4626, 116th Cong. (2020) [hereinafter SAFE DATA Act]; American Data Dissemination Act of 2019 (ADD Act), S. 142, 116th Cong. (2019); Consumer Online Privacy Rights Act (COPRA), S. 2968, 116th Cong. (2019) [hereinafter COPRA]; Data Care Act of 2019, S. 2961, 116th Cong. (2019); Mind Your Own Business Act of 2019 (MYOBA), S. 2637, 116th Cong. (2019) [hereinafter MYOBA]; Online Privacy Act of 2019, H.R. 4978, 116th Cong. (2019); Privacy Bill of Rights Act, S. 1214, 116th Cong. (2019) [hereinafter Privacy Bill of Rights]. A discussion draft was introduced recently. Discussion Draft, A Bill to Provide Consumers with Foundational Data Privacy Rights, Create Strong Oversight Mechanisms, and Establish Meaningful Enforcement, 117th Cong., 2d Sess. (2021), https://energycommerce.house.gov/sites/democrats.energycommerce.house.gov/files/documents/Bipartisan_Privacy_Discussion_Draft_Bill_Text.pdf [https://perma.cc/LM9J-9QKV].
    1. See California Privacy Rights Act of 2020 (codified as amended at Cal. Civ. Code § 1798.100–1798.199.100); 52 Nev. Rev. Stat. § 603A (2020); H.R. 216, 2021 Leg., Reg. Sess. (Ala. 2021); S. 21–190, 73d Gen. Assemb., 1st Reg. Sess. (Colo. 2021); S. 893, 2021 Gen. Assemb., Jan. Sess. (Conn. 2021); H.R. 3910, 102d Gen. Assemb., 1st Reg. Sess. (Ill. 2021); S. 46, 192d Gen. Ct., Reg. Sess. (Mass. 2021); S. 567, 2021 Leg., 244th Reg. Sess. (N.Y. 2021); A. 6042, 2021 Leg., 244th Reg Sess. (N.Y. 2021); S. 6701, 2021 Leg., 244th Reg. Sess. (N.Y. 2021); S. 569, 2021 Gen. Assemb., 2021 Sess. (N.C. 2021); H.R. 1126, Gen. Assemb., 2021 Sess. (Pa. 2021); H.R. 3741, 87th Leg., Reg. Sess. (Tex. 2021); S. 1392, 2021 Gen. Assemb., 1st Spec. Sess. (Va. 2021); S. 5062, 67th Leg., 2021 Reg. Sess. (Wash. 2021); S. 1614, 54th Leg., 2d Reg. Sess. (Ariz. 2020); H.R 2729, 54th Leg., 2d Reg. Sess. (Ariz. 2020); H.R. 963, 26th Leg., Reg. Sess. (Fla. 2020); S. 2330, 101st Gen. Assemb., 1st Reg. Sess. (Ill. 2020); H.R. 5603, 101st Gen. Assemb., Reg. Sess. (Ill. 2020); H.R. 784, 2020 Gen. Assemb., 441st Sess. (Md. 2020); H.R. 1656, 2020 Gen. Assemb., 441st Sess. (Md. 2020); H.R. 3936, 91st Leg., 91st Sess. (Minn. 2020); L. 746, 106th Leg., 2d Reg. Sess. (Neb. 2020); H.R. 1236, 2020 Gen. Ct., 166th Sess. (N.H. 2020); Assemb. 3255, 219th Leg., 1st Ann. Sess. (N.J. 2020); H.R. 473, 2020 Gen. Assemb., 2020 Sess. (Va. 2020); S. 418, 30th Leg., Reg. Sess. (Haw. 2019); S. 2263, 101st Gen. Assemb., 1st Reg. Sess. (Ill. 2019); S. 946, 129th Leg., 1st Reg. Sess. (Me. 2019); H.R. 1253, 2019 Leg., 2019 Reg. Sess. (Miss. 2019); S. 176, 54th Leg., 1st Sess. (N.M. 2019); S. 224, 2019 Leg., Reg. Sess. (N.Y. 2019); S. 5642, 2019 Leg., Reg. Sess. (N.Y. 2019); H.R. 1049, 203d Gen. Assemb., 2019 Sess. (Pa. 2019); S. 234, 2019 Gen. Assemb., Jan. Sess. (R.I. 2019); H.R. 4390, 86th Leg., Reg. Sess. (Tex. 2019); H.R. 4518, 86th Leg., Reg. Sess. (Tex. 2019); Assemb. 2188, 219th Leg., 2020 Sess. (N.J. 2020); S. 2834, 218th Leg., 1st Ann. Sess. (N.J. 2018).
    1. .See Council Regulation (EU) 2016/679, of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46, 2016 O.J. (L 119) [hereinafter GDPR]. The GDPR applies to U.S. companies in certain circumstances, so it is relevant for assessing the privacy law landscape even outside the E.U. See id. at art. 3(2)(a)–(b); Cal. Civ. Code §§ 1798.100–1798.199.100 (2018 Cal. Legis. Info.) [hereinafter CCPA].
    1. .Margot E. Kaminski, Binary Governance: Lessons from the GDPR’s Approach to Algorithmic Accountability, 92 S. Cal. L. Rev. 1529, 1559–62 (2019). As described in more detail below, I disagree with other scholars’ descriptive analyses of these privacy proposals. Anupam Chander, Margot E. Kaminski, and William McGeveran do not characterize proposed U.S. state laws as having a rights/compliance model. See Anupam Chander, Margot E. Kaminski & William McGeveran, Catalyzing Privacy Law, 105 Minn. L. Rev. 1733 (2021). Instead, they see them as laws based largely on individual rights alone. That is true from a law-on-the-books perspective: the laws do not require extensive internal compliance like the GDPR. But as described in Part II.B.2, the proposals do require privacy impact assessments, and any regime that guarantees rights also requires a company to build forms, evaluate data requests, and set up appeals processes. These are internal compliance structures. See id. at 1736 (arguing that recent proposals in the United States “differ[] significantly—and consciously—from the European model.”).
    1. .New State Ice Co. v. Liebmann, 285 U.S. 262, 311 (1932) (Brandeis, J., dissenting) (referring to a state as a “laboratory” of policy experimentation); see generally James E. Campbell, Polarized: Making Sense of a Divided America (2016) (analyzing political polarization in the United States).
    1. See, e.g., Woodrow Hartzog, Privacy’s Blueprint 93–157 (2018) (calling for a “design agenda for privacy law” that leverages various legal regimes to ensure that privacy is designed in, and manipulation is designed out of, information technologies); Neil Richards & Woodrow Hartzog, A Duty of Loyalty for Privacy Law, 99 Wash. U. L. Rev. 961, 1003–12 (2021), (detailing what an information fiduciary model of governance would look like in practice); Jack M. Balkin, Information Fiduciaries and the First Amendment, 49 U.C. Davis L. Rev. 1183, 1205–09 (2016) (justifying imposing fiduciary duties on online service providers).
    1. See Woodrow Hartzog & Neil Richards, Privacy’s Constitutional Moment and the Limits of Data Protection, 61 B.C. L. Rev. 1687, 1714, 1721–37 (2020).
    1. See Anu Bradford, The Brussels Effect, 107 Nw. U. L. Rev. 1, 22–26 (2012) (suggesting that the E.U.’s ban on transfers of data to countries without adequate levels of protection would catalyze a race to the top to mimic the GDPR); Chander et al., supra note 4, at 1767 (arguing that new U.S. proposals are different from the GDPR and instead reflect the unique ways in which the CCPA was the product of norm entrepreneurship that harnessed state legislative processes to produce the law).
    1. See, e.g., Bradford, supra note 8. This tranche of scholarship erroneously implies that law is an institution exogenous to society. See Patricia Ewick & Susan S. Silbey, The Common Place of Law: Stories from Everyday Life 15–32 (1998) (suggesting that law is social in nature in that it is present in everyday social experiences); Roger Cotterrell, Why Must Legal Ideas Be Interpreted Sociologically, 25 J.L. & Soc’y 171, 172–73 (1998) (arguing that sociological interpretations of legal institutions can help understand the meaning of law itself); Julie E. Cohen, Between Truth and Power: The Legal Constructions of Informational Capitalism 3–8 (2019); Morton J. Horwitz, The Transformation of American Law 1870-1960 (1992); Karl Polanyi, The Great Transformation: The Political and Economic Origins of Our Time (2d ed. 2001).
    1. See Chander et al., supra note 4, at 1790.
    1. .“Data-extractive capitalism” refers to a particularly oppressive and dominating form of informational capitalism, an economic system in which data is processed to derive insights about individuals for profit. See Cohen, supra note 9, at 3–5.
    1. See Decision and Order, Google, Inc., FTC Docket No. C-4336, at 4 (Oct. 13, 2011) [hereinafter Google Consent Decree], https://www.ftc.gov/sites/default/files/documents/cases/2011/10/111024googlebuzzdo.pdf [https://perma.cc/W594-63YA] (requiring Google to develop a “comprehensive privacy program”).
    1. See U.S. Dep’t of Health, Educ. & Welfare, Records, Computers and the Rights of Citizens: Report of the Secretary’s Advisory Committee on Automated Personal Data Systems (1973) (describing the “Fair Information Practice Principles” (FIPPs), which originally included rights to notice, access, correction, and reasonable security); Woodrow Hartzog, The Inadequate, Invaluable Fair Information Practices, 76 Md. L. Rev. 952, 957–59 (2017) (describing how the original FIPPs included more than just a right to notice).
    1. See Erving Goffman, The Presentation of Self in Everyday Life 15 (1959) (defining performance as “all the activity of a given participant on a given occasion which serves to influence in any way any of the other participants”). Richard Schechner defined performances as “twice-behaved” or “restored” behavior. Richard Schechner, Performance Studies: An Introduction 28–29 (3d ed. 2013). For a more detailed discussion of performance theory, see infra Part I.A.
    1. See Goffman, supra note 14, at 15–23.
    1. See Judith Butler, Bodies that Matter: on the Discursive Limits of “Sex” 2 (1993); Jacques Derrida, Margins of Philosophy 307–30 (Alan Bass trans. 1982) (discussing how repeated expressive acts can create forms of identity); The Laws of the Markets (Michel Callon ed., 1998).
    1. See generally Ari Ezra Waldman, The New Privacy Law, 55 U.C. Davis L. Rev. Online 19 (2021) (classifying the evolution of privacy law in terms of “waves” based on the periodization from feminist literature).
    1. See Cohen, supra note 9, at 5.
    1. .Kaminski, supra note 4, at 1577.
    1. Id. at 1561. Notably, Kaminski argued that the GDPR does not adequately create and sustain this backdrop threat because of a lack of accountability, transparency, and civil society input.
    1. See Butler, supra note 16, at x–xi, 7 (noting that there is no identity before performance).
    1. See Jedediah Britton-Purdy, David Singh Grewal, Amy Kapczynski & K. Sabeel Rahman, Building a Law-and-Political-Economy Framework: Beyond the Twentieth-Century Synthesis, 129 Yale L.J. 1784, 1789–90 (2020).
    1. .André Gorz, Strategy for Labor: A Radical Proposal 7 (Martin A. Nicolaus & Victoria Ortiz trans., 1967).
    1. See J. L. Austin, How to Do Things with Words 3, 10, 12 (1962). When Austin talked about performances, he was talking about speech: he used the example of saying “I do” during wedding ceremonies as a statement that does more than just express a sentiment. Id. at 12–13. Its utterance creates the marriage; the words make the marriage a reality and are, thus, performative. Id. This Article is about the performative capacities of practices, not exclusively speech.
    1. .Judith Butler, Performative Acts and Gender Constitution: An Essay in Phenomenology and Feminist Theory, 40 Theatre J. 519, 524–26 (1988) [hereinafter Performative Acts].
    1. .Judith Butler, Gender Trouble: Feminism and the Subversion of Identity 142–45 (1990).
    1. See Andrew Parker & Eve Kosofsky Sedgwick, Introduction to Performativity and Performance 2 (Andrew Parker & Eve Kosofsky Sedgwick eds., 1995) (noting that performances can create social meaning for the self and others).
    1. See, e.g., Camille Gear Rich, Performing Racial and Ethnic Identity: Discrimination by Proxy and the Future of Title VII, 79 N.Y.U. L. Rev. 1134, 1158–65, 1171–85 (2004).
    1. See Devon W. Carbado & Mitu Gulati, The Fifth Black Woman, 11 J. Contemp. Legal Issues 701, 710–19 (2001); Nancy Leong, Identity Entrepreneurs, 104 Calif. L. Rev. 1333, 1386–87 (2016). The literature on the performance of race is substantial. See, e.g., Ian Haney Lopez, White By Law: The Legal Construction of Race (Richard Delgado & Jean Stefancic eds., 1996); Kenneth W. Mack, Representing the Race: The Creation of the Civil Rights Lawyer (2012); Ariela J. Gross, Litigating Whiteness: Trials of Racial Determination in the Nineteenth-Century South, 108 Yale L.J. 109, 112, 120–22, 132–51 (1998); Anthony V. Alfieri & Angela Onwuachi-Willig, Next-Generation Civil Rights Lawyers: Race and Representation in the Age of Identity Performance, 122 Yale L.J. 1484, 1492–1501 (2013); Susan D. Carle, Conceptions of Agency in Social Movement Scholarship: Mack on African American Civil Rights Lawyers, 39 L. & Soc. Inquiry 522, 522 (2014). Kenji Yoshino has likewise argued that identity-based discrimination law insufficiently captures how heteronormative structures force queer people to engage in “covering” performances. Kenji Yoshino, Covering: The Hidden Assault on our Civil Rights 17–19 (2006).
    1. See Clare Huntington, Staging the Family, 88 N.Y.U. L. Rev. 589, 592, 619–27 (2013).
    1. .Judith Butler, Notes Towards a Performative Theory of Assembly 63 (2015).
    1. .Maren Wehrle, ‘Bodies (That) Matter’: The Role of Habit Formation for Identity, 20 Phenomenology & Cognitive Sci. 365, 366–67 (2020).
    1. Id. at 371.
    1. .Aristotle, Nicomachean Ethics 1103b1–b5, 1094a20, 1094b6, 1098a15 (Roger Crisp trans., ed., 2000).
    1. .Wehrle, supra note 32, at 380.
    1. See, e.g., David I. Kertzer, Ritual, Politics, and Power 10 (1988) (noting that performances that accord with an identity to which we want to subscribe offer us “confidence that the world in which we live today is the same world we lived in before and the same world we will have.”); Butler, Performative Acts, supra note 25, at 523–25.
    1. .Normalization is cognitive slippage from statistical frequency to moral propriety; it is a process through which common things come to be understood as acceptable, ordinary, and, ultimately, good. See, e.g., Adam Bear & Joshua Knobe, Normality: Part Descriptive, Part Prescriptive, 167 Cognition 25, 26 (2017).
    1. .Adam Bear & Joshua Knobe, The Normalization Trap, N.Y. Times (Jan. 28, 2017), https://www.nytimes.com/2017/01/28/opinion/sunday/the-normalization-trap.html [https://perma.cc/JE9J-AD2P].
    1. See Diane Vaughan, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA 77–195 (1997) (demonstrating how routinized decisions that violated rules and norms came to be normalized as part of engineering and testing work).
    1. See Bear & Knobe, supra note 37, at 29 (finding the perception of the normal amount of television is based on frequency and what is perceived to be ideal).
    1. .Butler, Gender Trouble, supra note 26, at 143–45.
    1. Id. at 124–25.
    1. Id.
    1. .This is closely related to actor-network theory (ANT) in sociological research. ANT posits that the development of knowledge should be understood by analyzing how individuals and groups interact, because the social and natural world is a “continuously generated effect of the webs of relations.” John Law, Actor Network Theory and Material Semiotics, in The New Blackwell Companion to Social Theory 141 (Bryan S. Turner ed., 2009); see also Bruno Latour, Reassembling the Social: An Introduction to Actor-Network-Theory 57 (2005) (stating that social scientists wanting to propose alternative metaphysics must “first engage in the world-making activities of those they study”). But see Judy Wajcman, Reflections on Gender and Technology Studies: In What State Is the Art?, 30 Soc. Stud. Sci. 447, 452 (2000) (criticizing ANT and other Science and Technology Studies theories for ignoring the contributions of marginalized populations, particularly women, in the development of new technology).
    1. See Goffman, supra note 14, at 13–19; Butler, supra note 16, at 2.
    1. .Butler, supra note 16, at 15.
    1. .Lauren B. Edelman, Working Law 12, 22 (John M. Conley & Lynn Mather eds., 2016).
    1. See generally Title VII of the Civil Rights Act of 1964, 42 U.S.C. § 2000e–2000e-17.
    1. .Edelman, supra note 47, at 30–33.
    1. Id. at 33–38. But see id. at 6–10 (providing statistical evidence of continued racial and gender inequality in the workplace).
    1. Id. at 38, 173.
    1. .Scholars have also relied on the performativity thesis to broaden our understanding of the value of privacy. See, e.g., Julie E. Cohen, Privacy, Visibility, Transparency, and Exposure, 75 U. Chi. L. Rev. 181, 192–93 (2008) (arguing that privacy captures interests far beyond the unwanted disclosure of personal information because our actions express and define our identities); Scott Skinner-Thompson, Performative Privacy, 50 U.C. Davis L. Rev. 1673, 1697–1708 (2017) (arguing that privacy-enhancing behaviors, such as wearing hoodies in the physical world and using obfuscating technology online, perform expressive resistance to a surveillance society).
    1. .Butler, supra note 26, at 142–45.
    1. See Danielle Keats Citron, The Privacy Policymaking of State Attorneys General, 92 Notre Dame L. Rev. 747, 760 (2016) [hereinafter Citron, Privacy Policymaking] (focusing on state AGs); Daniel J. Solove & Woodrow Hartzog, The FTC and the New Common Law of Privacy, 114 Colum. L. Rev. 583, 583 (2014) (focusing on the FTC). Notably, the Consumer Financial Protection Bureau, particularly under the leadership of Director Rohit Chopra, has taken a greater interest in protecting consumers from predatory data practices in the financial sector.
    1. .There are indications that this is changing, especially under the leadership of FTC Chair Lina Khan, who, after her appointment to the post by President Biden, has arguably taken a more aggressive posture toward corporate accountability. See, e.g., Russell Brandom, Federal Trade Commission Expands Antitrust Powers in Chair Lina Khan’s First Open Proceeding, Verge (July 1, 2021), https://www.theverge.com/2021/7/1/22559131/ftc-open-meeting-antitrust-chair-lina-khan-sherman-act-powers [https://perma.cc/P9JX-4PTW].
    1. See Citron, Privacy Policymaking, supra note 54, at 763–64; Solove & Hartzog, supra note 54, at 598–99.
    1. .Fed. Trade Comm’n, FTC Staff Report: Self-Regulatory Principles for Online Behavioral Advertising 48 (2009), https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-staff-report-self-regulatory-principles-online-behavioral-advertising/p085400behavadreport.pdf [https://perma.cc/436G-JU9W].
    1. E.g., Fed. Trade Comm’n, Online Profiling: A Report to Congress (2000), https://www.ftc.gov/sites/default/files/documents/reports/online-profiling-federal-trade-commission-report-congress-part-2/onlineprofilingreportjune2000.pdf [https://perma.cc/KP3C-PSHN]; Fed. Trade Comm’n, Online Profiling: A Report to Congress, Part 2 Recommendations (2000), https://www.steptoe.com/a/web/564/934.pdf [https://perma.cc/6Y3V-ZN6K]; Fed. Trade Comm’n, Repairing a Broken System: Protecting Consumers in Debt Collection Litigation and Arbitration ii (2010), https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-bureau-consumer-protection-staff-report-repairing-broken-system-protecting/debtcollectionreport.pdf [https://perma.cc/54ZW-78HT]; see also Mozelle W. Thompson, The Challenges of Law in Cyberspace—Fostering the Growth and Safety of E-Commerce, 6 B.U. J. Sci. & Tech. L. 9, ¶¶ 1, 8–10 (1999) (discussing the FTC’s interactions with industry leaders and how it views its role).
    1. .Citron, supra note 54, at 759.
    1. Id. at 760.
    1. Id. at 759–60. Professor Citron’s research makes clear that regulators engaged in these practices by 2012. Given that they were explaining their offices’ practices by 2012, it is reasonable to assume that some of these offices were working directly with industry before then.
    1. .FTC Staff Issues Privacy Report, Offers Framework for Consumers, Businesses, and Policymakers, Fed. Trade Comm’n (Dec. 1, 2010), https://www.ftc.gov/news-events/press-releases/2010/12/ftc-staff-issues-privacy-report-offers-framework-consumers [https://perma.cc/6JEK-KLXH] (quoting FTC Chair Leibowitz as including “promot[ing] . . . business innovation” as one of the goals of the report and the FTC).
    1. .Univ. of Wash., Privacy Redress Options Workshop, Cal. Emp. Laws. Ass’n (Dec. 10, 2020), https://medius.studios.ms/Embed/video-nc/CELAReadress-2020 [https://perma.cc/NU68-TK8T] (featuring comments by Julie Brill).
    1. .Facilitating innovation is built into the FTC’s work. See, e.g., Fed. Trade Comm’n, Protecting Consumer Privacy in an Era of Rapid Change iv, 1 (2010) (noting that the goal of several FTC convenings was to develop a framework for privacy that facilitates technological innovation); Fed. Trade Comm’n, To Promote Innovation: The Proper Balance of Competition and Patent Law and Policy (2003), http://www.ftc.gov/os/2003/10/innovationrpt.pdf [https://perma.cc/QE68-6FN3] (proposing policy decisions based on what would facilitate innovation and the development of new technologies).
    1. See Citron, supra note 54, at 760.
    1. .Ajit Pai, The Future of Internet Freedom, Fed. Commc’ns Comm’n (Apr. 26, 2017), https://transition.fcc.gov/Daily_Releases/Daily_Business/2017/db0427/DOC-344590A1.pdf [https://perma.cc/5WPC-NVJ4]; see also Cohen, supra note 9, at 187 (quoting Jodi L. Short, The Paranoid Style in Regulatory Reform, 63 Hastings L.J. 633, 635 (2012) (referring to the “paranoid” style of regulation)).
    1. .Citron, supra note 54, at 760.
    1. .SAFE DATA Act § 206(d)(3)(D).
    1. See id. at § 404(a).
    1. .Privacy Bill of Rights § 13(a)(3).
    1. .COPRA § 107(c).
    1. .About NIST, Nat’l Inst. Standards Tech.: U.S. Dep’t of Com. (Jan. 11, 2022), https://www.nist.gov/about-nist/our-organization/mission-vision-values [https://perma.cc/K7KX-9XQU].
    1. .Citron, supra note 54, at 761.
    1. .CCPA § 1798.185(a).
    1. Id. at 1798.185(a)(7); see, e.g., H.R. 784, 2020 Gen. Assemb., 441st Sess. § 14-4211(7) (Md. 2020).
    1. See Solove & Hartzog, supra note 54, at 606, 610.
    1. .Citron, supra note 54, at 761 (“States . . . often eschew formal adjudication for informal agreements that close investigations”).
    1. .United States v. Armour & Co., 402 U.S. 673, 681 (1971) (“Consent decrees are entered into by parties to a case after careful negotiation has produced agreement on their precise terms.”); United States v. ITT Cont’l Baking Co., 420 U.S. 223, 238 (1975) (“[A] consent decree . . . is to be construed for enforcement purposes basically as a contract.”).
    1. .These numbers were based on a search on the FTC’s website, which categorizes all of its enforcement actions. See Legal Library: Cases and Proceedings, Fed. Trade Comm’n, https://www.ftc.gov/enforcement/cases-proceedings/advanced-search [https://perma.cc/SB3X-WKXH] (searching for “privacy and security” cases under “Consumer Protection Topics”). The total number of FTC privacy matters is far higher than 271, a figure that does not capture the many investigations that are dropped or end with negotiations before the complaint stage. The three litigated cases are: Fed. Trade Comm’n v. Accusearch, Inc., 570 F.3d 1187, 1193 (10th Cir. 2009) (endorsing the FTC’s power to bring cases under its “unfair or deceptive” practices authority); Fed. Trade Comm’n v. Wyndham Worldwide Corp., 10 F. Supp. 3d 602, 602 (D. N.J. 2014) (finding that the FTC properly pled and had authority to regulate defendants’ failure to maintain reasonable and appropriate data security); LabMD Inc. v. Fed. Trade Comm’n, 894 F.3d 1221, 1229 (11th Cir. 2018) (limiting the FTC’s power to require companies to take “reasonable” security measures).
    1. .Solove & Hartzog, supra note 54, at 606, 610.
    1. Id. at 608–19.
    1. .Citron, supra note 54, at 761.
    1. See id. at 761–63.
    1. .Solove & Hartzog, supra note 54, at 585.
    1. .Law has expressive value that influences public perceptions of what is right and wrong. See, e.g., Danielle Keats Citron, Law’s Expressive Value in Combating Cyber Gender Harassment, 108 Mich. L. Rev. 373 (2009); Elizabeth S. Scott, Social Norms and the Legal Regulation of Marriage, 86 Va. L. Rev. 1901 (2000); Cass R. Sunstein, On the Expressive Function of Law, 144 U. Pa. L. Rev. 2021, 2022 (1996).
    1. See, e.g., Fed. Trade Comm’n, Off. of Comm’r Rohit Chopra, Dissenting Statement by Commissioner Rohit Chopra: In re Facebook, Inc., Commission File No. 1823109 (July 24, 2019), https://www.ftc.gov/system/files/documents/public_statements/1536911/chopra_dissenting_statement_on_facebook_7-24-19.pdf [https://perma.cc/F46U-2C48].
    1. .Consider, for example, the FTC’s media campaign when it sued Facebook for “illegally maintaining its personal social networking monopoly through a years-long course of anticompetitive conduct.” See Fed. Trade Comm’n, FTC Sues Facebook for Illegal Monopolization (Dec. 9, 2020), https://www.ftc.gov/news-events/press-releases/2020/12/ftc-sues-facebook-illegal-monopolization [https://perma.cc/E73G-2JBG]. Press associated with the lawsuit has quoted FTC staff and several of the 48 AGs that joined the lawsuit. See, e.g., Cecilia Kang & Mike Isaac, U.S. and States Say Facebook Illegally Crushed Competition, N.Y. Times (Dec. 9, 2020), https://www.nytimes.com/2020/12/09/technology/facebook-antitrust-monopoly.html [https://perma.cc/7Y3V-DVG3] (quoting New York AG Letitia James and Ian Conner, the head of antitrust enforcement at the FTC); Tony Romm, U.S., States Sue Facebook as an Illegal Monopoly, Setting Stage for Potential Breakup, Wash. Post (Dec. 9, 2020), https://www.washingtonpost.com/technology/2020/12/09/facebook-antitrust-lawsuit/ [https://perma.cc/XNW7-3HND] (quoting FTC Chairperson Joe Simons and AG James).
    1. See Citron, supra note 54, at 750.
    1. Id. at 806.
    1. .Solove & Hartzog, supra note 54, at 619.
    1. .Citron, supra note 54, at 761.
    1. .Minnesota H.F. 1492 § 325O.10 (empowering the AG to bring an enforcement action in accordance with Minn. Stat. Ann. § 8.31, which, in subdivision 2a, explicitly allows the AG to rely on an “assurance of discontinuance of any act or practice the attorney general deems to be in violation of the laws”).
    1. .CCPA § 1798.155(c).
    1. .Hawaii S.B. § 487J-5; see also Office of Consumer Protection Blog, Haw. Dep’t of Com. & Consumer Affs., http://cca.hawaii.gov/blog/category/divisions/ocp/ [https://perma.cc/J6H6-U47G] (reporting only settlements with investigated companies).
    1. .Admin. Conf. of the U.S., Agency Assessment and Mitigation of Civil Money Penalties 2 (1979).
    1. .Dean Acheson, Francis Biddle, Ralph F. Fuchs, Lloyd K. Garrison, D. Lawrence Groner, Henry M. Hart, Carl McFarland, James W. Morris, Harry Shulman, E. Blythe Stason, Arthur T. Vanderbilt & Walter Gellhorn, Final Report of the Attorney General’s Committee on Administrative Procedure 35 (1941).
    1. .Solove & Hartzog, supra note 54, at 619–27; Citron, supra note 54, at 758–63.
    1. See infra Part II.B for a more detailed discussion of the performativity of compliance practices.
    1. .Google Consent Decree, supra note 12, at 4.
    1. .See Solove & Hartzog, supra note 54, at 617–18.
    1. .Press Release, Off. of the Att’y Gen. of Conn., Attorney General Announces $7 Million Multistate Settlement with Google over Street View Collection of WiFi Data (Mar. 12, 2013), https://portal.ct.gov/AG/Press-Releases-Archived/2013-Press-Releases/Attorney-General-Announces-7-Million-Multistate-Settlement-With-Google-Over-Street-View-Collection-o [https://perma.cc/HUL3-L3ZX].
    1. .Notice of Dismissal by Agreement, No. 10CH44962 (Ill. Cir. Ct. Oct. 3, 2012) (requiring employee training and adoption of new internal policies).
    1. .Order Granting Permanent Injunction, No. 2010-CI-13625 (Tex. Dist. Ct. Aug. 26, 2010) (agreeing that the company would adopt comprehensive security program).
    1. .Citron, supra note 54, at 781 (citing Assurance of Voluntary Compliance, In re Health Net, No. 10-040 (Office of the Att’y Gen. N.Y. Aug. 2, 2010)) (agreeing to trainings, audits, and comprehensive programs).
    1. .For a comprehensive discussion of state AGs entering into AVCs with information industry companies, please see Citron, supra note 54, at 761–62, 769–71, 776, 781, 806–09.
    1. .Solove & Hartzog, supra note 54, at 618.
    1. See, e.g., Decision and Order, Facebook, Inc., 154 F.T.C. 1 (2012) (requiring Facebook to hire a third-party auditor).
    1. .Chris Jay Hoofnagle, Assessing the Federal Trade Commission’s Privacy Assessments, 14 IEEE Sec. & Priv. 58, 61 (2016).
    1. .This standard script includes the following questions: “Have they appointed someone responsible for looking at privacy? Are they doing risk assessments? Have they trained employees? Are they doing continual testing to make sure they’re closing loopholes? Do they have service providers that handle consumer data; do they specify privacy protections in the contracts with them?” Kashmir Hill, So, What Are These Privacy Audits that Google and Facebook Have to Do for the Next Twenty Years, Forbes (Nov. 30, 2011), https://www.forbes.com/sites/kashmirhill/2011/11/30/so-what-are-these-privacy-audits-that-google-and-facebook-have-to-do-for-the-next-20-years/#3bbf76805000 [https://perma.cc/Q3XM-RWJP].
    1. .Megan Gray, Understanding and Improving Privacy “Audits” Under FTC Orders, Stan. L. Sch. Ctr. Internet & Soc’y 6 (Apr. 18, 2018), https://cyberlaw.stanford.edu/files/blogs/white%20paper%204.18.18.pdf [https://perma.cc/A4KH-C5P7].
    1. Id. at 6 n.15; see also EPIC FOIA Uncovers Google’s Privacy Assessment, Elec. Priv. Info. Ctr. (Sept. 28, 2012), https://epic.org/2012/09/epic-foia-uncovers-googles-pri.html [https://perma.cc/6BKU-9AU2].
    1. .Evan Schuman, Uber Shows How Not to Do a Privacy Report, Comput. World (Feb. 5, 2015), https://www.computerworld.com/article/2880596/uber-shows-how-not-to-do-a-privacy-report.html [https://perma.cc/EH23-JNMP].
    1. .Jessica Leber, The FTC’s Privacy Cop Cracks Down, MIT Tech. Rev. (June 26, 2012), https://www.technologyreview.com/s/428342/the-ftcs-privacy-cop-cracks-down/ [https://perma.cc/28YZ-GDT6].
    1. E.g., Letters to Commenters, In re Facebook, Inc., FTC File No. 092 3184, https://www.ftc.gov/sites/default/files/documents/cases/2012/08/120810facebookcmbltrs.pdf [https://perma.cc/79GS-YL24] (Argentar letter).
    1. See Solove & Hartzog, supra note 54, at 618.
    1. .H.R. 3936, 91st Leg., 91st Sess. (Minn. 2020) § 325O.04(d)(3).
    1. .COPRA § 108(b)(2).
    1. .MYOBA § 5(a)(1)–(b)(1).
    1. .Privacy Bill of Rights § 13(b)(3)(A)–(B).
    1. See Martha K. Landesberg & Laura Mazzarella, Fed. Trade Comm’n, Self-Regulation and Privacy Online: A Report to Congress 12–14 (1999); Consumer Privacy on the World Wide Web: Hearing Before the H. Subcomm. on Telecom’s, Trade, & Consumer Protection of the H. Comm. on Commerce, 105th Cong., 2d Sess. (1998) (prepared statement of the Fed. Trade Comm’n by Robert Pitofsky, Chairperson) (advocating for self-regulation); Self-Regulation and Privacy Online: Hearing before the S. Subcomm. on Comm’s of the Comm. on Com., Sci., & Transp., 106th Cong., 1st Sess. 4 (1999) (prepared statement of the Fed. Trade Comm’n by Robert Pitofsky, Chairperson) (also advocating for self-regulation).
    1. .Solove & Hartzog, supra note 54, at 590–95.
    1. E.g., Cal. Bus. & Prof. Code § 22575 (West 2020); Children’s Online Privacy Protection Act, 15 U.S.C. § 6502(b)(1)(A)(i) (requiring websites geared toward children to disclose what data they collect—whether obtained actively or passively—how it will be used, whether it will be shared, and how to delete or opt out of data collection); Gramm-Leach-Bliley Act, 15 U.S.C. § 6803(a)(1)–(2); 16 C.F.R. § 313.6(a)(3), (6) (imposing similar requirements on certain financial institutions).
    1. .Edelman, supra note 47, at 25.
    1. Id. at 25–26; see Cohen, supra note 9, at 144–45.
    1. .Kaminski, supra note 4, at 1559. For a more comprehensive definition of collaborative governance, see Jody Freeman, Collaborative Governance in the Administrative State, 45 UCLA L. Rev. 1, 21–33 (1997); Orly Lobel, New Governance as Regulatory Governance, in The Oxford Handbook of Governance 65–67 (David Levi-Faur ed., 2012); Orly Lobel, The Renew Deal: The Fall of Regulation and the Rise of Governance in Contemporary Legal Thought, 89 Minn. L. Rev. 342, 371–76 (2004).
    1. .Kaminski, supra note 4, at 1560. Other scholars have argued in favor of collaborative governance approaches to privacy law. See, e.g., Kenneth A. Bamberger & Deirdre K. Mulligan, Privacy On the Ground 12–13 (2015) (suggesting that the public-private partnerships created by privacy law provide space for CPO innovation); Dennis D. Hirsch, Going Dutch? Collaborative Dutch Privacy Regulation and the Lessons It Holds for U.S. Privacy Law, 2013 Mich. St. L. Rev. 83, 99–102; W. Nicholson Price II, Regulating Black-Box Medicine, 116 Mich. L. Rev. 421, 465–71 (2017).
    1. .Kaminski, supra note 4, at 1557–58. See also Lilian Edwards & Michael Veale, Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking for, 16 Duke L. & Tech. Rev. 18, 74–75 (2017) (noting how individuals lack the technical skill to identify information economy abuses and are limited by cognitive biases that make exercising rights difficult).
    1. .Kaminski, supra note 4, at 1561.
    1. See id. at 1561–62.
    1. Id. at 1564–66.
    1. Id. at 1567.
    1. See Bamberger & Mulligan, supra note 126, at 12–13.
    1. .Kaminski, supra note 4, at 1562.
    1. See id.
    1. Id. at 1577.
    1. Id. at 1579.
    1. .Bamberger & Mulligan, supra note 126, at 65; Kenneth A. Bamberger & Deirdre K. Mulligan, Privacy on the Books and on the Ground, 63 Stan. L. Rev. 247, 269–70 (2011).
    1. .Bamberger & Mulligan, supra note 137, at 265, 270; see Directive 95/46, of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, 1995 O.J. (L 281) 40 [hereinafter Privacy Directive].
    1. .Bamberger & Mulligan, supra note 137, at 271–72; see Kenneth Bamberger & Deirdre Mulligan, Catalyzing Privacy: New Governance, Information Practices, and the Business Organization, 33 L. & Pol’y (2011).
    1. .Bamberger & Mulligan, supra note 137, at 261–63.
    1. Id. at 260–63.
    1. Id. at 263. There is evidence, however, that this work did not have any material effect on data-extractive design. See Ari Ezra Waldman, Designing Without Privacy, 55 Houston L. Rev. 659, 678–701 (2018).
    1. .Google Consent Decree, supra note 12.
    1. .Solove & Hartzog, supra note 54, at 617–18, 673.
    1. Id. at 618–19; Citron, supra note 54, at 761–62.
    1. E.g., CCPA § 1798.135(a)(3) (training); H.R. 5603, 101st Gen. Assemb., Reg. Sess. § 40(6) (Ill. 2020) (training); H.R. 784, 2020 Gen. Assemb., 441st Sess. § 14-4204(E) (Md. 2020) (training); H.R. 3936, 91st Leg., 91st Sess. (Minn. 2020) § 325O.04(b)(1) (organizational measures to assist in compliance); H.R. 3936 § 325O.04(d)(3) (conduct audits of processors); S. 176, 54th Leg., 1st Sess. § 6 (N.M. 2019) (training); H.R. 4390, 86th Leg., Reg. Sess. § 541.053 (Tex. 2019) (data security program); H.R. 4390 § 541.058 (privacy accountability program to assess risk); COPRA § 107(b)(4) (training), § 201 (internal privacy program with certification by executives), § 202(a)–(b)(1) (privacy and security officers), § 202(b) (comprehensive privacy program), § 202(b)(2) (annual assessments of program); MYOBA §§ 6(a)(7) (biennial review), 7(b)(1)(A)–(B) (establish reasonable privacy policies and internal organizational technical measures), § 7(b)(C) (designating privacy coordinators); Privacy Bill of Rights §§ 13(a)(1) (internal practices to ensure confidentiality of information), 13(b)(3) (audits of privacy programs in place), 14 (designate privacy and security officer); SAFE DATA Act §§ 204(a) (establishing “reasonable administrative” measures), 301(a) (designating a CPO and other responsible employees).
    1. .Provisions requiring trainings include the following: S. 1614, 54th Leg., 2d Reg. Sess. § 18-701(L)(5) (Ariz. 2020); CCPA § 1798.135(a)(3); S. 418, 30th Leg., Reg. Sess. § 487(J)–(H)(6) (Haw. 2019); H.R. 5603 § 40(6); H.R. 784 § 14-4204(E); H.R. 1656, 2020 Gen. Assemb., 441st Sess. § 14-4204(E) (Md. 2020); S. 176 § 6; COPRA § 107(b)(4). Statutory provisions requiring record-keeping include the following: S. 418 § 487(J)–(H) (requiring lists); MYOBA § 6(a)(2); Online Privacy Act § 202(b). Statutes requiring PIAs include: S. 2263, 101st Gen. Assemb., 1st Reg. Sess. § 30 (Ill. 2019); S. 2330, 101st Gen. Assemb., 1st Reg. Sess. § 35(l) (Ill. 2020); H.R. 3936 § 325O.08; H.R. 4390 § 541.058 (accountability program to assess risk); H.R. 473, 2020 Gen. Assemb., 2020 Sess. § 59.1-576 (Va. 2020); S. 5062, 67th Leg., 2021 Reg. Sess. § 109 (Wash. 2021); MYOBA § 7(b)(G)–(H); SAFE DATA Act § 107(a)(1), (b); see also Kaminski, supra note 4, at 1603–05 (noting that PIAs are internal documents meant to help balance risks and benefits and intended to keep privacy front of mind during design).
    1. See COPRA § 202(a)(1)–(2); MYOBA § 7(b)(C); Privacy Bill of Rights § 14; SAFE DATA Act § 301(a)–(b).
    1. See, e.g., H.R. 3936 § 325O.04(b)(1) (processor required to have organizational measures to assist data controller with compliance); COPRA §§ 201, 202(b) (comprehensive privacy program and internal reporting structure ensuring that privacy professionals are involved and responsible for compliance); MYOBA §§ 6(a)(7) (biennial review of information provided to consumers for exercising opt out requests), 7(b)(A)–(B); Privacy Bill of Rights § 13(a)(1).
    1. .H.R. 3936 § 325O.04(d)(3); COPRA § 202(b)(2); MYOBA § 5(a)(1); SAFE DATA Act § 204(a).
    1. .H.R. 4390 § 541.059; COPRA § 203(c)(1)(A)–(B); Data Care Act § 3(b)(3)(C); MYOBA § 6(a)(8); Privacy Bill of Rights § 10.
    1. .COPRA § 201; MYOBA § 5(a)(1).
    1. .MYOBA § 6(a)(10).
    1. .H.R. 3936 § 325O.05 subdiv. 3 (requiring internal appeals process); H.R. 3936 § 325O.085(a) (independent tests of facial recognition); COPRA § 108(b); MYOBA § 7(b)(G); Privacy Bill of Rights § 13(b)(3); SAFE DATA Act § 206(b)(4).
    1. .MYOBA § 6(a)(4).
    1. .SAFE DATA Act §§ 206(c)(3), 404(a).
    1. .Daniel J. Solove, Introduction: Privacy Self-Management and the Consent Dilemma, 126 Harv. L. Rev. 1879, 1880 (2013). There is a robust literature demonstrating the centrality of control discourse and practices in privacy law. For summaries of that literature, see, for example, Hartzog, supra note 13, at 959 (explaining how “control” won out as the focus of the FIPS and privacy law); Ari Ezra Waldman, Privacy As Trust 29–33 (2018) (summarizing the privacy scholarly literature on control).
    1. .Daniel J. Solove, The Myth of the Privacy Paradox, 89 Geo. Wash. L. Rev. 1, 3 (2021).
    1. .Gordon Hull, Successful Failure: What Foucault Can Teach Us About Privacy Self-Management in a World of Facebook and Big Data, 17 Ethics & Info. Tech. 89, 89 (2015).
    1. See Maggie Oates, Yama Ahmadullah, Abigail Marsh, Chelse Swoopes, Shikun Zhang, Rebecca Balebako & Lorrie Faith Cranor, Turtles, Locks, and Bathrooms: Understanding Mental Models of Privacy Through Illustration, 2018 Proc. Priv. Enhancing Tech. 5, 5 (2018); Christina Nippert-Eng, Islands of Privacy 7 (2010).
    1. Facebook, Social Media Privacy, and the Use and Abuse of Data: Joint Hearing Before the S. Comm. on Com., Sci., & Transp. & the S. Comm. on the Judiciary, 115th Cong., 2d Sess. (2018) (oral statement of Mark Zuckerberg, CEO, Facebook, Inc.) [hereinafter Facebook Hearing].
    1. Online Platforms and Market Power Part 6: Examining the Dominance of Amazon, Apple, Facebook, and Google: Hearing before the H. Comm. on the Judiciary, Subcomm. on Antitrust, Com., & Admin. L., 116th Cong. (2020) [hereinafter Online Platforms Hearing], https://judiciary.house.gov/calendar/eventsingle.aspx?EventID=3113 [https://perma.cc/7M7S-7BTV] (written testimony of Mark Zuckerberg, CEO, Facebook, Inc.).
    1. Policy Principles for a Federal Data Privacy Framework in the United States: Hearing before S. Comm. on Com., Sci., & Transp., 116th Cong., 2d Sess. (2019) [hereinafter Policy Principles Hearing], https://www.commerce.senate.gov/2019/2/policy-principles-for-a-federal-data-privacy-framework-in-the-united-states [https://perma.cc/MA2H-RHZR] (oral testimony of Jon Leibowitz at 45:00); see also Brendan Sasso & National Journal, The ‘Privacy Coalition’ That Wants to Trim Data Regulation for Telecom Giants, Atlantic (May 11, 2015), https://www.theatlantic.com/politics/archive/2015/05/the-privacy-coalition-that-wants-to-trim-data-regulations-for-telecom-giants/456477/ [https://perma.cc/KH7F-PCZA] (describing Mr. Leibowitz’s positions as reflecting the deregulatory interests of an advocacy group for telecommunications companies).
    1. .Policy Principles Hearing, supra note 163 (written testimony of Jon Leibowitz at 4); id. (oral statement of Jon Leibowitz).
    1. Id. (oral testimony of Michael Beckerman at 47:55); see Internet Ass’n, https://internetassociation.org/our-members/ [https://perma.cc/3PTN-593L].
    1. .Policy Principles Hearing, supra note 163 (written testimony of Michael Beckerman at 1, 4).
    1. Examining Safeguards for Consumer Data Privacy: Hearing before S. Comm. on Com., Sci. & Transp., 115th Cong., 2d Sess. (2018), https://www.commerce.senate.gov/2018/9/examining-safeguards-for-consumer-data-privacy [https://perma.cc/VJM3-RHVW] (oral statement of Bud Tribble at 55:53); id. (oral statement of Rachel Welch at 1:00:07).
    1. .Online Platforms Hearing, supra note 162 (oral statement of Sundar Pichai, CEO of Alphabet, Inc., at 4:45:50); id. (oral statement of Sundar Pichai at 37:58).
    1. Small Business Perspectives on a Federal Data Privacy Framework: Hearing before S. Subcomm. on Mfg., Trade & Consumer Prot. of the Comm. on Com., Sci., & Transp., 116th Cong., 1st Sess. (2019) [hereinafter Small Business Hearing], https://www.commerce.senate.gov/2019/3/small-business-perspectives-on-a-federal-data-privacy-framework [https://perma.cc/7GNC-69PV] (oral testimony of Evan Engstrom, Executive Director of the Engine Advocacy and Research Foundation at 39:49); see David Dayen, An Advocacy Group for Startups Is Funded by Google and Run by Ex-Googlers, Intercept (May 30, 2018), https://theintercept.com/2018/05/30/google-engine-advocacy-tech-startups/ [https://perma.cc/9AR2-5NEH].
    1. .Small Business Hearing, supra note 169 (oral testimony of Nin Dosanjh, Vice Chair, Technology Policy Committee, National Association of Realtors, at 51:28).
    1. Examining Safeguards for Consumer Data Privacy: Hearing before S. Comm. on Com., Sci. & Transp., 115th Cong., 2d Sess. (2018), https://www.commerce.senate.gov/2018/9/examining-safeguards-for-consumer-data-privacy [https://perma.cc/3B3X-M7ZP] (oral testimony of Keith Enright).
    1. Id. (oral testimony of Damien Kieran, Global Data Protection Officer and Associate General Counsel, Twitter, Inc., at 50:45).
    1. See Oates et al., supra note 160, at 5.
    1. .Nippert-Eng, supra note 160, at 7.
    1. .Woodrow Hartzog & Daniel J. Solove, The Scope and Potential of FTC Data Protection, 83 Geo. Wash. L. Rev. 2230, 2235 (2015).
    1. .Hull, supra note 159, at 90.
    1. Id. at 96.
    1. Id. at 97.
    1. .2 Michel Foucault, The History of Sexuality: The Use of Pleasure 28 (Robert Hurley trans., 1985).
    1. .Twenty-five laws guarantee a right of access. CCPA §§ 1798.100(d), 1798.110, 1798.115; S. 1614, 54th Leg., 2d Reg. Sess. § 18-701(A)­(D) (Ariz. 2020); S. 418, 30th Leg., Reg. Sess. § 487(J)–(C) (Haw. 2019); S. 2263, 101st Gen. Assemb., 1st Reg. Sess. § 20(1) (Ill. 2019); S. 2330, 101st Gen. Assemb., 1st Reg. Sess. § 20 (Ill. 2020); H.R. 5603 §§ 20, 25; H.R. 784, 2020 Gen. Assemb., 441st Sess. § 14-4203 (Md. 2020); H.R. 1656 § 14-4203; H.R. 3936, 91st Leg., 91st Sess. § 325O.05, subd. 1(1) (Minn. 2020); H.R. 1253, 2019 Leg., 2019 Reg. Sess. §§ 3(1), 5, 6 (Miss. 2019); L. 746, 106th Leg., 2d Reg. Sess. §§ 6, 8 (Neb. 2020); Assemb. 3255, 219th Leg., 1st Ann. Sess. § 2(I)(e) (N.J. 2020); S. 2834, 218th Leg., 1st Ann. Sess. § 3 (N.J. 2018); S. 176, 54th Leg., 1st Sess. § 3(a) (N.M. 2019); S. 5642, 2019 Leg.,Reg. Sess. § 1103(1), (5) (N.Y. 2019); H.R. 1049, 203d Gen. Assemb., 2019 Sess. § 4(a)(1)–(2), (b) (Pa. 2019); S. 234, 2019 Gen. Assemb., Jan. Sess. §§ 6-48.1 to 6-48.3(a), 6-48.1-6 (R.I. 2019); H.R. 4518, 86th Leg., Reg. Sess. § 541.053 (Tex. 2019); H.R. 473 § 59.1-574, S. 5062, 67th Leg., 2021 Reg. Sess. § 103(1) (Wash. 2021); COPRA § 102(a); Privacy Bill of Rights § 6(a)(1); SAFE DATA Act § 103(a); Online Privacy Act § 101; DATA § 201.
    1. .Twenty-five laws guarantee a right to delete. CCPA § 1798.105; S. 1614 § 18-701(E); S. 418 § 487(J)–(D); S. 2263 § 20(3); S. 2330 § 25(3); H.R. 5603 § 15; S. 2351, 88th Gen. Assemb., 2020 Reg. Sess. § 3 (Iowa 2020); H.R. 784 § 14-4205; H.R. 1656 § 14-4205; H.R. 3936 § 325O.05, subd. 1(3); H.R. 1253 § 4(1); L. 746 § 9; Assemb. 3255 § 3; S. § 176 3(b); S. 5642 § 1103(3); H.R. 1049 § 4(e); S. 234 § 6-48.1-4; H.R. 4518 § 541.052; H.R. 473 § 59.1-574; S. 5062, 67th Leg., 2021 Reg. Sess. § 103(3) (Wash. 2021); COPRA § 103; Privacy Bill of Rights § 6(a)(5)(A); SAFE DATA Act § 103(a); Online Privacy Act § 103; DATA § 204.
    1. .Twenty-three laws include a right to opt out. S. 1614 § 18-701(F)–(G); CCPA §§ 1798.120, 1798.135(a)–(b); H.R. 963, 26th Leg., Reg. Sess. § 501.062(2)(b) (Fla. 2020); S. 418 § 487(J)–-(F); S. 2263 § 20(6); S. 2330 § 25(1); H.R. 5603 § 30; H.R. 784 § 14-4206; H.R. 1656 § 14-4206; H.R. 3936 § 325O.05, subd. 1(5); H.R. 1253 § 7; Assemb. 2188, 219th Leg., 2020 Sess. § 4 (N.J. 2020); Assemb. 3255 § 6; S. 2834 § 4; S. 176 §§ 3(d), 4(f); H.R. 1049 § 4(a)(3); S. 234 § 6-48.1-7; H.R. 4518 § 541.054; H.R. 473 § 59.1-574; id. § 59.1-574; S. 5062 § 103(5); COPRA § 105(b); MYOBA § 6; SAFE DATA Act § 104(d).
    1. .S. 2263 § 20(2); S. 2330 § 25(2); H.R. 3936 § 325O.05, subd. 1(2); S. 5642 § 1103(2); H.R. 473 § 59.1-574; S. 5062 § 103(2); COPRA § 104; Privacy Bill of Rights § 6(a)(4); SAFE DATA Act § 103(a); Online Privacy Act § 102; DATA § 203.
    1. .H.R. 3936 § 325O.05, subd. 1(4); H.R. 473 § 59.1-574; S. 5062 § 103(4); COPRA § 105(a); Privacy Bill of Rights § 6(a)(3); SAFE DATA Act § 103(a); DATA § 201.
    1. .S. 2263 § 20(4); H.R. 473 § 59.1-574.
    1. .S. 5642 § 1103(6).
    1. .DATA §§ 205, 206.
    1. .S. 1614, 54th Leg., 2d Reg. Sess. § 18-701(H) (Ariz. 2020); H.R. 2729, 54th Leg., 2d Reg. Sess. §§ 18-574(B), 18-577(G)(3) (Ariz. 2020).
    1. .S. 2263 § 30(3); S. 2330 § 35(l)(3).
    1. .Maine Rev. Stat. Ann. § 9301(3) (2020). The GDPR also allows companies to rely on user consent to data processing, although consent is only one of six lawful bases for justifying data collection and use. GDPR, supra note 3, at art. 6(1).
    1. .U.S. Dep’t of Health, Educ. & Welfare, Records, Computers, and the Rights of Citizens: Report of the Secretary’s Advisory Committee on Automated Personal Data Systems 59–63 (1973), https://aspe.hhs.gov/report/records-computers-and-rights-citizens [https://perma.cc/U3JY-EJW3].
    1. .Bradford, supra note 8, at 3, 22–26.
    1. Id. at 10–19.
    1. Id. at 17–19, 25–26.
    1. Id. at 24–26; see also GDPR, supra note 3, at art. 45, 61–62; Privacy Directive, supra note 138, at art. 25 & recitals 56–57.
    1. .Bradford, supra note 8, at 24–26.
    1. See Chander et al., supra note 4, at 1737–38.
    1. Id.
    1. See, e.g., Horwitz, supra note 9.
    1. .Roscoe Pound, Mechanical Jurisprudence, 8 Colum. L. Rev. 605, 606–07 (1908) (describing the belief in the law’s neutrality as central to legal formalism).
    1. .See generally Polanyi, supra note 9 (providing a canonical account of the social, economic, and legal shifts from a pre-market to an industrial society); Cohen, supra note 9, at 5–8 (offering a similar canonical account of the role of law in the shift to the information age).
    1. .Chander et al., supra note 4, at 1737–38.
    1. .Google Consent Decree, supra note 12, at 6.
    1. .U.S. Dep’t of Health, Educ., & Welfare, supra note 191, at 8–15.
    1. .The E.U. was created in 1993 by the Maastricht Treaty. The E.U.’s Privacy Directive was passed in 1995. See Privacy Directive, supra note 138, at 31.
    1. .Chander et al., supra note 4, at 1737–38.
    1. See supra Part II.B.2.
    1. .Paul M. Schwartz, Global Data Privacy: The EU Way, 94 N.Y.U. L. Rev. 771, 783–85, 787, 794 (2019) (recognizing various approaches to achieving “adequacy”).
    1. .See generally Hartzog, supra note 6.
    1. See, e.g., Susan Rose-Ackerman, Risk Taking and Reelection: Does Federalism Promote Innovation?, 9 J. Legal Stud. 593, 594, 605 (1980); Christopher Serkin, Big Differences for Small Governments: Local Governments and the Takings Clause, 81 N.Y.U. L. Rev. 1624, 1668 (2006).
    1. See Michel Foucault, The Archaeology of Knowledge and The Discourse on Language 201 (A. M. Sheridan Smith trans.,1972) (explaining the way in which discourses shape the way we think and talk about a subject); Michel Foucault, The Order of Discourse, in Untying the Text: A Post-Structuralist Reader 51–52 (Robert Young ed., 1981) (arguing that discourses are used by those in power to maintain power by sustaining discourses that support their control).
    1. .A Brief Explanation of the Overton Window, Mackinac Ctr. for Pub. Pol’y, https://www.mackinac.org/OvertonWindow [https://perma.cc/7XMK-FBH4].
    1. .I am not the first to recognize this, of course. Some argue that privacy impact assessments can encourage companies to analyze how their products impact not just individuals, but groups. See, e.g., Margot E. Kaminski & Gianclaudio Malgieri, Algorithmic Impact Assessments Under the GDPR: Producing Multi-Layered Explanations, 11 Int’l Data Priv. L. 125, 138 (2021), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3456224 [https://perma.cc/9KYZ-3XKT].
    1. See, e.g., Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193, 196 (1890); Daniel J. Solove, Understanding Privacy 12–59 (2008) (summarizing the literature on different conceptions of privacy).
    1. See Neil Richards, Intellectual Privacy: Rethinking Civil Liberties in the Digital Age 3-9 (2015) (arguing that privacy gives individuals the ability to develop new, inchoate, or dissident ideas); Hartzog, supra note 6, at 161 (describing the ways in which the designs of platform interfaces and other technologies manipulate our choices); Jonathan R. Mayer & John C. Mitchell, Third-Party Web Tracking: Policy and Technology, Proc. 2012 IEEE Symp. on Sec. & Priv. 413, 415 (2012), https://cyberlaw.stanford.edu/files/publication/files/trackingsurvey12.pdf [https://perma.cc/9LAR-NCD3].
    1. .Solove, supra note 214, at 84–88 (arguing that privacy has social value); Waldman, supra note 157, at 49–76 (showing how privacy is necessary for social engagement).
    1. .Robert C. Post, The Social Foundations of Privacy: Community and Self in the Common Law Tort, 77 Calif. L. Rev. 957, 959–68 (1989).
    1. .Julie E. Cohen, Examined Lives: Information Privacy and the Subject as Object, 52 Stan. L. Rev. 1373, 1426–28 (2000).
    1. See, e.g., Frank Pasquale, Two Narratives of Platform Capitalism, 35 Yale L. & Pol’y Rev. 309, 311 (2016) (discussing the legal implications of the ways in which data use entrenches social hierarchies); Frank Pasquale, Paradoxes of Privacy in an Era of Asymmetrical Social Control, in Big Data, Crime and Social Control 31 (Aleš Završnik ed., 2017); Solon Barocas & Andrew D. Selbst, Big Data’s Disparate Impact, 104 Calif. L. Rev. 671, 671 (2016) (demonstrating the capacity for algorithmic decision-making systems to have unequal impact on marginalized populations).
    1. See, e.g., Kashmir Hill, Wrongfully Accused by an Algorithm, N.Y. Times (June 24, 2020), https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html [https://perma.cc/WF8Y-ZTK6]; Julia Angwin, Jeff Larson, Surya Mattu & Lauren Kirchner, Machine Bias, ProPublica (May 23, 2016), https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing [https://perma.cc/QG5S-7596].
    1. See, e.g., Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (2018) (showing the connection between data processing and subordination of the poor); Alvaro M. Bedoya, The Cruel New Era of Data-Driven Deportation, Slate (Sept. 22, 2020), https://slate.com/technology/2020/09/palantir-ice-deportation-immigrant-surveillance-big-data.html [https://perma.cc/CPH8-764Q].
    1. .See Victor Ray, Why So Many Organizations Stay White, Harv. Bus. Rev. (Nov. 19, 2019), https://hbr.org/2019/11/why-so-many-organizations-stay-white?ab=seriesnav-bigidea [https://perma.cc/XNA8-3JGW] (demonstrating that institutions are not “race-neutral” by citing statistics on the scarcity of minority representation in organizational hierarchies).
    1. See Salomé Viljoen, A Relational Theory for Data Governance, 131 Yale L.J. 573, 603–616 (2021).
    1. Id. at 637.
    1. Id. at 573.
    1. Id.
    1. See Daniel Kahneman, Thinking, Fast and Slow (2011); Daniel Kahneman & Amos Tversky, Judgments of and by Representativeness, in Judgment Under Uncertainty: Heuristics and Biases 84–98 (Daniel Kahneman, Paul Slovic & Amos Tversky eds. 1982); Richard H. Thaler & Cass R. Sunstein, Nudge: Improving Decisions About Health, Wealth, and Happiness (2008); Daniel Kahneman, Amos Tversky & Paul Slovic, Judgment Under Uncertainty: Heuristics and Biases, 185 Sci. 1124 (1974). Our disclosure behavior depends on comparative judgments and is skewed by framing biases and hyperbolic discounting, all of which, research shows, make us more likely to disclose information at any given time than exercise our rights to opt out. See Alessandro Acquisti, Leslie K. John & George Loewenstein, The Impact of Relative Standards on the Propensity to Disclose, 49 J. Mktg. Rsch. 160 (2012); Leslie K. John, Alessandro Acquisti & George Loewenstein, Strangers on a Plane: Context-Dependent Willingness to Divulge Sensitive Information, 37 J. Consumer Rsch. 858, 868 (2011). Plus, the platforms through which we must exercise our rights of control are designed for us by the very companies that rely on data-extractive business models. Their websites are laden with manipulative dark patterns and other design tricks that skew and nudge our behaviors in ways that benefit them. Arunesh Mathur, Gunes Acar, Michael J. Friedman, Elena Lucherini, Jonathan Mayer, Marshini Chetty & Arvind Narayanan, Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites, 3 Proc. Ass’n Computing Mach. on Human-Comput. Interaction 1, 2–3 (2019). The rights/compliance model may assume and entrench the idea that we are capable of exercising our rights, but the data tells us otherwise.
    1. See Neil Richards & Woodrow Hartzog, The Pathologies of Digital Consent, 96 Wash. U. L. Rev. 1461, 1463 (2019) (“Consent is the foundation of the relationships we have with search engines, social networks, commercial web sites, and any one of the dozens of other digitally mediated businesses we interact with regularly.”).
    1. E.g., Meg Leta Jones & Margot E. Kaminski, An American’s Guide to the GDPR, 98 Denv. L. Rev. 93, 109 (2021); Gabriela Zanfir-Fortuna, 10 Reasons Why the GDPR Is the Opposite of a ‘Notice and Consent’ Type of Law, Medium (Mar. 13, 2019), https://medium.com/@gzf/10-reasons-why-the-gdpr-is-the-opposite-of-a-notice-and-consent-type-of-law-ba9dd895a0f1 [https://perma.cc/7WJK-2LLY].
    1. .Jones & Kaminski, supra note 229, at 108–10.
    1. Id. at 109.
    1. .Solove, supra note 157, at 1880 (“Privacy self-management takes refuge in consent. [It] . . . legitimizes nearly any form of collection, use, or disclosure of personal data.”).
    1. .Transcript of Record at 7, In re Facebook, Inc. Consumer Privacy User Profile Litigation, No. 18-MD-02843 (N.D. Cal. May 29, 2019).
    1. Id.
    1. Id. at 15.
    1. .Reply in Support of Defendant Facebook, Inc.’s Motion to Dismiss Plaintiffs’ First Amended Consolidated Complaint, In re Facebook, Inc. Consumer Privacy User Profile Litigation, 402 F. Supp. 3d 767, 792 (N.D. Cal. 2019).
    1. .Campbell v. Facebook, Inc., 951 F.3d 1106 (9th Cir. 2020).
    1. Id. at 1119 n.9.
    1. Smith v. Facebook, Inc., 745 F. App'x 8 (9th Cir. 2018).
    1. Appellee’s Brief at 21, Smith v. Facebook, Inc., 745 F. App'x 8 (9th Cir. 2018) (No. 17-16206), https://epic.org/amicus/facebook/smith/Smith-v-Facebook-9th-Cir-Facebook-Brief.pdf [https://perma.cc/ZA8R-SAEC].
    1. In re Google, Inc. Cookie Placement Consumer Privacy Litigation, 806 F.3d 125 (3d Cir. 2015).
    1. Answering Brief of Defendant-Appellee Google Inc. at 36–37, In re Google, Inc. Cookie Placement Consumer Privacy Litigation, 806 F.3d 125 (3d Cir. 2015) (No. 13-4300).
    1. Patel v. Facebook, Inc., 932 F.3d 1264 (9th Cir. 2019).
    1. Appellant’s Brief at 33, Patel v. Facebook, 932 F.3d 1264 (9th Cir. 2019) (No. 18-15982).
    1. Facebook, Inc.’s Motion for Summary Judgment, In re Facebook Biometric Information Privacy Litigation, 326 F.R.D. 535 (N.D. Cal. 2018) (No. 3:15-CV-03747-JD), aff'd sub nom. Patel v. Facebook, Inc., 932 F.3d 1264 (9th Cir. 2019).
    1. Id.
    1. Defendant Facebook, Inc.’s Reply in Support of Motion to Dismiss Plaintiffs’ Second Amended Consolidated Class Action Complaint at 11, In re Facebook, Inc. Internet Tracking Litigation, 263 F. Supp. 3d 836 (N.D. Cal. 2017) (No. 5:12-md-02314 EJD).
    1. Defendant Facebook, Inc.’s Motion to Dismiss Plaintiffs’ Second Amended Consolidated Class Action Complaint at 33, In re Facebook, Inc. Internet Tracking Litigation, 263 F. Supp. 3d 836 (N.D. Cal. 2017) (No. 5:12-md-02314 EJD).
    1. Google developed a privacy department as a result of a 2011 consent decree with the FTC. See Google Consent Decree, supra note 12. Facebook did the same in response to its 2011 agreement with the FTC. Agreement Containing Consent Order at 5–6, In re Facebook, Inc., File No. 092 3184 (Nov. 29, 2011) [hereinafter Facebook Consent Decree], https://www.ftc.gov/sites/default/files/documents/cases/2011/11/111129facebookagree.pdf [https://perma.cc/Z6R7-9XRR].
    1. Michelle Quinn & Tony Romm, Google Tells FTC of Privacy Progress, Politico (Feb. 10, 2012), https://www.politico.com/story/2012/02/google-tells-ftc-of-progress-on-privacy-072731 [https://perma.cc/48GD-FZYA]; see Google Consent Decree, supra note 12; Facebook Consent Decree, supra note 249.
    1. Facebook’s Commitment to Data Protection and Privacy in Compliance with the GDPR, Facebook Bus. (Jan. 29, 2018), https://www.facebook.com/business/news/facebooks-commitment-to-data-protection-and-privacy-in-compliance-with-the-gdpr [https://perma.cc/773N-FP69]; Warwick Ashford, Facebook Is Ready for GDPR, Says Zuckerberg, Comput. Wkly. (May 23, 2018), https://www.computerweekly.com/news/252441730/Facebook-is-ready-for-GDPR-says-Zuckerberg [https://perma.cc/AN9F-X4H2]; Ashley Rodriguez, Google Says It Spent “Hundreds of Years of Human Time” Complying with Europe’s Privacy Rules, Quartz (Sept. 26, 2018), https://qz.com/1403080/google-spent-hundreds-of-years-of-human-time-complying-with-gdpr/ [https://perma.cc/Y8GM-ZPE3].
    1. Appellee’s Brief at 28, Davis v. Facebook, Inc., 956 F.3d 589 (9th Cir. 2020) (No. 17-17486) (arguing that users themselves had to “take[] steps to keep their browsing histories private”).
    1. Solove, supra note 157, at 1880.
    1. E.g., George B. Shepherd & Morgan Cloud, Time and Money: Discovery Leads to Hourly Billing, 1999 U. Ill. L. Rev. 91, 103–04 (1999).
    1. See, e.g., Eric A. Posner, Law, Economics, and Inefficient Norms, 144 U. Pa. L. Rev. 1697, 1727 (1996) (“Highly unequal endowments of group members may be evidence of inefficient norms. The more powerful members may prefer and enforce norms that redistribute wealth to them, even when those norms are inefficient.”); Lloyd L. Weinreb, Custom, Law and Public Policy: The INS Case as an Example for Intellectual Property, 78 Va. L. Rev. 141, 146–47 (1992) (arguing that relying on custom will mean that “the better financed private interest” will win, “rather than a careful, systematic” rule that “will serve the community as a whole”).
    1. See Edelman, supra note 47, at 78–79 (discussing the impact of professional organizations and information resources in the human resources field).
    1. Upcoming IAPP Conferences, IAPP, https://iapp.org/conferences/ [https://perma.cc/TFW9-7DAQ]; Priv. & Sec. Acad., https://www.privacysecurityacademy.com/ [https://perma.cc/795Z-VCHY].
    1. Bamberger & Mulligan, Privacy on the Ground, supra note 126, at 80, 142.
    1. See Paul J. DiMaggio & Walter W. Powell, The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields, 48 Am. Socio. Rev. 147 (1983) (explaining how and why businesses in an industry evolve to look and behave in similar ways); Mark S. Granovetter, The Strength of Weak Ties, 78 Am. J. Soc. 1360, 1363–66 (1973) (discussing how information is spread through the connections that link individuals within their networks and to other networks).
    1. This is called the “first-mover advantage.” See, e.g., Rajshree Agarwal & Michael Gort, First-Mover Advantage and the Speed of Competitive Entry, 1887–1986, 44 J. L. & Econ. 161, 173 (2001); William T. Robinson & Sungwook Min, Is the First to Market the First to Fail? Empirical Evidence for Industrial Goods Businesses, 39 J. Mktg. Rsch. 120, 126–27 (2002).
    1. Indeed, as the IAPP and TrustArc recently found, budgetary constraints likely explain why many companies have not hired anyone to help with data mapping, data inventories, or privacy impact assessments, despite GDPR requirements. IAPP & TrustArc, Getting to GDPR Compliance: Risk Evaluation and Strategies for Mitigation 8–10 (2018), https://iapp.org/media/pdf/resource_center/GDPR-Risks-and-Strategies-FINAL.pdf [https://perma.cc/W4JF-P5QA].
    1. See Liam Tung, Struggling to Comply with GDPR? Microsoft 365 Rolls Out New Privacy Dashboards, ZDNet (Jan. 30, 2019), https://www.zdnet.com/article/struggling-to-comply-with-gdpr-microsoft-365-rolls-out-new-privacy-dashboards/ [https://perma.cc/6H3Z-BLGB].
    1. Ari Ezra Waldman, Outsourcing Privacy, 96 Notre Dame L. Rev. Reflection 194, 196 (2021).
    1. See, e.g., Thomas M. Lenard & Paul H. Rubin, Tech. Pol’y Inst., The Big Data Revolution: Privacy Considerations 3, 26 (2013), https://techpolicyinstitute.org/wp-content/uploads/2013/12/the-big-data-revolution-privac-2007594.pdf [https://perma.cc/275M-V77Y]; April Dembosky & James Fontanella-Khan, US Tech Groups Criticized for EU Lobbying, Fin. Times (Feb. 4, 2013), https://www.ft.com/content/e29a717e-6df0-11e2-983d-00144feab49a [https://perma.cc/95KA-VKA8].
    1. David Dayen, An Advocacy Group for Startups Is Funded by Google and Run by Ex-Googlers, Intercept (May 30, 2018), https://theintercept.com/2018/05/30/google-engine-advocacy-tech-startups/ [https://perma.cc/67EM-R6VA].
    1. See Examining Safeguards for Consumer Data Privacy, Hearing Before the S. Comm. on Com., Sci., & Transp., 115th Cong., 2d Sess. (2018) (including testimony from Keith Enright, Google’s Chief Privacy Officer at the time, and Damien Kieran, Global Data Protection Officer and Associate General Counsel at Twitter, Inc.).
    1. For example, former FTC Commissioner Jon Leibowitz left the FTC to co-chair the 21st Century Privacy Coalition, an industry-funded, anti-regulatory advocacy group, and became counsel at the law firm Davis Polk & Wardwell LLP, where he represented large corporations on antitrust and privacy matters. See Press Release, Brian E. Frosh, Maryland Attorney General, Former FTC Chair Jon Leibowitz to Join Office of Attorney General (Oct. 14, 2021), https://www.marylandattorneygeneral.gov/press/2021/101421.pdf [https://perma.cc/5Y3D-7RCK].
    1. See, e.g., Fed. Trade Comm’n, Mobile Privacy Disclosures: Building Trust Through Transparency 3 n.13 (2013), https://www.ftc.gov/sites/default/files/documents/reports/mobile-privacy-disclosures-building-trust-through-transparency-federal-trade-commission-staff-report/130201mobileprivacyreport.pdf [https://perma.cc/Y3E4-5AVR].
    1. Id. at 13 n.62; see About, App Ass’n, https://actonline.org/about/ [https://perma.cc/U9R9-8UH2].
    1. Fed. Trade Comm’n, supra note 268, at 17 & n.81.
    1. Id. at 21 & n.92.
    1. Fed. Trade Comm’n, Internet of Things: Privacy & Security in a Connected World 48–49 (2015), https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf [https://perma.cc/J7CM-NQFF]. The “Internet of Things” refers to a growing collection of electronic devices connected to the internet over Wi-Fi. Jacob Morgan, A Simple Explanation of ‘The Internet of Things’, Forbes (May 13, 2014), http://www.forbes.com/sites/jacobmorgan/2014/05/13/simple-explanation-internet-things-that-anyone-can-understand/#4a6ee29b6828 [https://perma.cc/22V7-EN2C].
    1. Google and Facebook are being investigated as monopolists. See Plaintiffs’ Complaint, U.S. v. Google, Inc., Case 1:20-cv-03010 (D.D.C. Oct. 20, 2020); Complaint for Injunctive and Other Equitable Relief, F.T.C. v. Facebook, Inc., No. 1:20-cv-03590 (D.D.C. Dec. 9, 2020) (public redacted version of document filed under seal).
    1. Cohen, supra note 9, at 2–10.
    1. This is part of a long research agenda on legitimizing procedures. See, e.g., Tom R. Tyler, Procedural Justice, Legitimacy, and the Effective Rule of Law, 30 Crime & Just. 283, 314–18 (2003); Jason Sunshine & Tom R. Tyler, The Role of Procedural Justice and Legitimacy in Shaping Public Support for Policing, 37 L. & Soc’y Rev. 513, 514 (2003); Danielle Keats Citron, Technological Due Process, 85 Wash. U. L. Rev. 1249, 1305, 1308–13 (2008) (calling for procedural governance mechanisms in administrative agency use of algorithmic systems).
    1. Britton-Purdy, Grewal, Kapczynski & Rahman, supra note 22, at 1790.
    1. Id. at 1793.
    1. See, e.g., Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code 42–52 (2019); Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism 1–2, 29 (2018); Rashida Richardson, Jason M. Schultz & Kate Crawford, Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice, 94 N.Y.U. L. Rev. 192, 218 (2019).
    1. See Danielle Keats Citron, A New Compact for Sexual Privacy, 62 Wm. & Mary L. Rev. 1763, 1770 (2021) (“These risks are not evenly distributed across society. Women and marginalized communities disproportionately bear the burden of private-sector surveillance of intimate life . . . [f]or people with intersecting marginalized identities, the harm is compounded.”).
    1. There is increasing recognition that the procedural safeguards that characterized much of privacy and technology scholarship are insufficient. See, e.g., Ryan Calo & Danielle Keats Citron, The Automated Administrative State: A Crisis of Legitimacy, 70 Emory L.J. 797 (2021) (distinguishing previous scholarship that sought to reimpose due process on administrative processes that use algorithms to make policy from more structural concerns about algorithms’ fundamental legitimacy); Frank Pasquale, The Second Wave of Algorithmic Accountability, L. & Pol. Econ. (Nov. 25, 2019), https://lpeblog.org/2019/11/25/the-second-wave-of-algorithmic-accountability/ (distinguishing between algorithmic accountability scholarship that seeks to ameliorate harms and assign responsibility and structural deficiencies of automated decision-making in government, generally).
    1. Paul D. Butler, Poor People Lose: Gideon and the Critique of Rights, 122 Yale L.J. 2176, 2201 (2013) (“[P]rocedural rights may be especially prone to legitimate the status quo, because ‘fair’ process masks unjust substantive outcomes and makes those outcomes seem more legitimate.”).
    1. Id. at 2178–79.
    1. Kaminski, supra note 4, at 1564 (noting that collaborative governance adds “soft” law mechanisms like negotiated settlements, legal safe harbors, and incorporation of industry standards to traditional regulatory modalities).
    1. See, e.g., David A. Dana, The New “Contractarian” Paradigm in Environmental Regulation, 2000 U. Ill. L. Rev. 35, 44–51 (comparing command-and-control to a site-specific negotiated form of governance).
    1. Kaminski, supra note 4, at 1560.
    1. Id. at 1560–61.
    1. Id. at 1562.
    1. Id.
    1. See Douglas A. Kysar, Regulating from Nowhere: Environmental Law and the Search for Objectivity 100–05 (2010); Martha C. Nussbaum, The Costs of Tragedy: Some Moral Limits of Cost-Benefit Analysis, 29 J. Legal Stud. 1005, 1029–30 (2000); Amartya Sen, The Discipline of Cost-Benefit Analysis, 29 J. Legal Stud. 931, 936 (2000); Thomas O. McGarity, The Goals of Environmental Legislation, 31 B.C. Env’t Aff. L. Rev. 529, 551 (2004) (describing the Risk Assessment and Cost-Benefit Act of 1995, which would have required cost-benefit analyses in all regulatory programs); Cary Coglianese, The Managerial Turn in Environmental Policy, 17 N.Y.U. Env’t L.J. 54, 55–60 (2008) (describing managerialism in environmental law).
    1. See Britton-Purdy, Grewal, Kapczynski & Rahman, supra note 22, at 1811–12.
    1. See Rory Van Loo, The New Gatekeepers: Private Firms as Public Enforcers, 106 Va. L. Rev. 467, 485–86 (2020) (demonstrating how CFPB regulators outsource regulation of third parties to banks); Rory Van Loo, Regulatory Monitors: Policing Firms in the Compliance Era, 119 Colum. L. Rev. 369, 397–98 (2019) (describing the role of internal compliance departments in financial regulation as a form of “collaborative governance”).
    1. See Judith Resnik, Diffusing Disputes: The Public in the Private of Arbitration, the Private in Courts, and the Erasure of Rights, 124 Yale L.J. 2804, 2836–47 (2015).
    1. See Edelman, supra note 47, at 13 (“Thus the meaning of law evolves over time in a way that is fundamentally influenced by the institutions that law is meant to regulate.”).
    1. See Britton-Purdy et al., supra note 22, at 1801–13.
    1. See, e.g., Lobel, Renew Deal, supra note 125, at 385 (conceding that collaborative governance tools may be “used by management merely as mechanisms for monitoring, controlling, and exerting additional pressures on workers”).
    1. Kaminski, supra note 4, at 1561.
    1. Ari Ezra Waldman, Industry Unbound: The Inside Story of Privacy, Data, and Corporate Power 210–31 (2021); see generally Max Weber, Economy and Society: An Outline of Interpretive Sociology (Guenther Roth & Claus Wittich eds., 1922) (describing the corporate bureaucracy as effective at channeling work toward capitalistic ends).
    1. See Waldman, supra note 297, at 144–48; IAPP, Benchmarking Privacy Management and Investments of the Fortune 1000: Report on Findings from 2014 Research 23 (2014), https://iapp.org/media/pdf/resource_center/2014_Benchmarking_Report.pdf [https://perma.cc/99J8-YT7Z] (showing that Fortune 1000 privacy leaders ranked “compliance” as the most important priority for the company).
    1. Id. at 32 (finding that 80 percent of privacy budgets are spent on salaries, legal counsel, software, and overhead, whereas other budget items like incident response, privacy-related monitoring, and privacy-related investigations comprise only 1–2 percent each); see also Andrew C. Inkpen & Eric W. K. Tsang, Social Capital, Networks, and Knowledge Transfer, 30 Acad. Mgmt. Rev. 146, 147–150 (2005) (demonstrating that budget shifting can undermine a corporate department’s authority).
    1. IAPP, supra note 298, at 5, 23.
    1. Ari Ezra Waldman, supra note 142, at 709–19 (2018).
    1. Waldman, supra note 297, at 210–31.
    1. Craig Silverman & Ryan Mac, Facebook Fired an Employee Who Collected Evidence of Right-Wing Pages Getting Preferential Treatment, BuzzFeed News (Aug. 6, 2020), https://www.buzzfeednews.com/article/craigsilverman/facebook-zuckerberg-what-if-trump-disputes-election-results [https://perma.cc/P94C-LX6J].
    1. Id.
    1. Noam Scheiber & Kate Conger, The Great Google Revolt, N.Y. Times (Feb. 18, 2020), https://www.nytimes.com/interactive/2020/02/18/magazine/google-revolt.html [https://perma.cc/8Z9D-YVXN].
    1. Cade Metz & Daisuke Wakabayashi, Google Researcher Says She Was Fired over Paper Highlighting Bias in A.I., N.Y. Times (Dec. 3, 2020), https://www.nytimes.com/2020/12/03/technology/google-researcher-timnit-gebru.html [https://perma.cc/L9F9-ZDFG].
    1. E.g., Robert A. Gorman & Matthew W. Finkin, The Individual and the Requirement of “Concert” Under the National Labor Relations Act, 130 U. Pa. L. Rev. 286, 344 (1981).
    1. See supra Part II.
    1. See, e.g., Butler, supra note 16, at x-xi (suggesting that identity only emerges from performance).
    1. See Christine Overdevest, Toward a More Pragmatic Sociology of Markets, 40 Theory & Soc’y 533, 539 (2011) (discussing reforms to economic models using destabilizing techniques).
    1. Britton-Purdy, Grewal, Kapczynski & Rahman, supra note 22, at 1827.
    1. Robert L. Hale, Freedom Through Law: Public Control of Private Governing Power 15 (1952).
    1. Cohen, supra note 9, at 3–8.
    1. Gorz, supra note 23, at 7–8.
    1. Amna A. Akbar, Demands for a Democratic Political Economy, 134 Harv. L. Rev. F. 90, 112 (2020).
    1. Britton-Purdy, Grewal, Kapczynski & Rahman, supra note 22, at 1821, 1824, 1827.
    1. Mark Engler & Paul Engler, André Gorz’s Non-Reformist Reforms Show How We Can Transform the World Today, Jacobin Mag. (July 22, 2021), https://www.jacobinmag.com/2021/07/andre-gorz-non-reformist-reforms-revolution-political-theory [https://perma.cc/6MP7-C3RC].
    1. Akbar, supra note 315, at 112–17.
    1. Id. at 115–16.
    1. Id. at 116.
    1. Id. at 115–16.
    1. Id. at 118.
    1. Id.
    1. Id. at 106.
    1. Id. at 118.
    1. Kenji Yoshino, Covering, 111 Yale L.J. 769, 868–69 (2002).
    1. See supra Part III.A.2.
    1. Woodrow Hartzog & Neil Richards, The Surprising Virtues of Data Loyalty, 71 Emory L.J. (forthcoming 2022), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3921799. See also DATA § 207 (duty of care); COPRA § 101 (duty of loyalty).
    1. Lior Jacob Strahilevitz, A Social Networks Theory of Privacy, 72 U. Chi. L. Rev. 919–88 (2005).
    1. Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life 141 (2010).
    1. Julie E. Cohen, What Privacy Is For, 126 Harv. L. Rev. 1904, 1905 (2013).
    1. Neil Richards, Why Privacy Matters 3 (2021).
    1. See generally Danielle Keats Citron, Sexual Privacy, 128 Yale L.J. 1870 (2019).
    1. Id. at 1874–75.
    1. See, e.g., Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (2018).
    1. Scott Skinner-Thompson, Privacy at the Margins 139–79 (2021).
    1. See generally Khiara M. Bridges, The Poverty of Privacy Rights (2017).
    1. See Martha C. Nussbaum, Creating Capabilities: The Human Development Approach (2011).
    1. Citron, supra note 333, at 1874 (“We are free only insofar as we can manage the boundaries around our bodies and intimate activities.”).
    1. Viljoen, supra note 223, at 584.
    1. See Britton-Purdy, Grewal, Kapczynski & Rahman, supra note 22, at 1821; see also Akbar, supra note 315, at 112–17 (arguing for a “democratic political economy” where power is returned to “everyday people” and away from elites that “monopolize wealth and power”).
    1. See Paul Ohm, Regulating at Scale, 2 Geo. L. Tech. Rev. 546, 546–47 (2018) (arguing that the exponential scale of some harms requires an approach different from linear regulation).
    1. Agreement Containing Consent Order, In re Amazon.com, Inc., File No. 1923123 (F.T.C. Feb. 2, 2021).
    1. Shelley E. Kohan, Amazon’s Net Profit Soars 84% with Sales Hitting $386 Billion, Forbes (Feb. 2, 2021), https://www.forbes.com/sites/shelleykohan/2021/02/02/amazons-net-profit-soars-84-with-sales-hitting-386-billion/?sh=429340591334 [https://perma.cc/46ED-5NCD].
    1. David Leonhardt, The Amazon Customers Don’t See, N.Y. Times (June 15, 2021), https://www.nytimes.com/2021/06/15/briefing/amazon-warehouse-investigation.html [https://perma.cc/F53M-TQT7].
    1. Id.
    1. Id.
    1. Nilay Patel, Facebook’s $5 Billion FTC Fine Is an Embarrassing Joke, Verge (July 12, 2019), https://www.theverge.com/2019/7/12/20692524/facebook-five-billion-ftc-fine-embarrassing-joke [https://perma.cc/NHV6-MMA4].
    1. Id.
    1. Id.
    1. MYOBA § 1352.
    1. Dissenting Statement of Commissioner Rohit Chopra at 7, Regarding Zoom Video Communications, Inc., Commission File No. 1923167, Fed. Trade Comm’n (Nov. 6, 2020), https://www.ftc.gov/system/files/documents/public_statements/1582914/final_commissioner_chopra_dissenting_statement_on_zoom.pdf [https://perma.cc/5XEH-3W98] (stating that previous FTC litigation contributed to “strong outcomes and important development of the law”).
    1. See AMG Capital Mgmt., LLC v. F.T.C., 141 S. Ct. 1341, 1344 (2021) (holding that the FTC Act does not entitle the FTC to seek disgorgement and other equitable remedies). See also Julie E. Cohen, How (Not) to Write a Privacy Law, Knight First Amend. Inst. (Mar. 23, 2021), https://knightcolumbia.org/content/how-not-to-write-a-privacy-law [https://perma.cc/J2YT-ENDP].
    1. Karen Weise, Amazon’s Profit Soars 220 Percent as Pandemic Drives Shopping Online, N.Y. Times (May 12, 2021), https://www.nytimes.com/2021/04/29/technology/amazons-profits-triple.html [https://perma.cc/RGM5-99SN].
    1. See Coxon v. S.E.C., 137 F. App'x 975, 976 (9th Cir. 2005) (stating the government need show “only a ‘reasonable approximation of profits causally connected to the violation’” and could do that with the help of expert testimony).
    1. Rebecca Kelly Slaughter, Protecting Consumer Privacy in a Time of Crisis, Remarks of Acting Chairwoman Rebecca Kelly Slaughter, 2 Fed. Trade Comm’n (Feb. 10, 2021), https://www.ftc.gov/public-statements/2021/02/remarks-commissioner-rebecca-kelly-slaughter-future-privacy-forum [https://perma.cc/WMD6-7NQS].
    1. See, e.g., Rebecca Wexler, Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System, 70 Stan. L. Rev. 1343 (2018) (arguing for an end to trade secrecy protections for algorithmic systems used in sentencing).
    1. Karen Hao, We Read the Paper that Forced Timnit Gebru out of Google. Here’s What It Says., MIT Tech. Rev. (Dec. 4, 2020), https://www.technologyreview.com/2020/12/04/1013294/google-ai-ethics-research-paper-forced-out-timnit-gebru/ [https://perma.cc/4EHC-PU7A].
    1. Paresh Dave & Jeffrey Dastin, Google Told Its Scientists to ‘Strike a Positive Tone’ in AI Research—Documents, Reuters (Dec. 23, 2020), https://www.reuters.com/article/us-alphabet-google-research-focus/google-told-its-scientists-to-strike-a-positive-tone-in-ai-research-documents-idUSKBN28X1CB [https://perma.cc/5XEN-ZYTM].
    1. See Lauren B. Edelman, Stephen Petterson, Elizabeth Chambliss & Howard S. Erlanger, Legal Ambiguity and the Politics of Compliance: Affirmative Action Officers’ Dilemma, 13 L. & Pol’y 73, 78 (1991) (“Tension . . . is often found in enforcement positions in organizations” where agents are subject to “conflicting pressures.”).
    1. Waldman, supra note 297, at 210–31.
    1. Jones & Kaminski, supra note 229, at 110. Importantly, Kaminski and Jones do not claim that the GDPR is value-neutral or lacks substantive goals. See, e.g., GDPR, supra note 3, at art. 9; rec. 71.
    1. Britton-Purdy, Grewal, Kapczynski & Rahman, supra note 22, at 1824.
    1. See supra Part II.A.1.
    1. See, e.g., Cyber C.R. Initiative, www.cybercivilrights.org [https://perma.cc/R6AN-Q4DR]; Detroit Cmty. Tech. Project, https://www.detroitcommunitytech.org/ [https://perma.cc/XT3J-84LU]; Data 4 Black Lives, https://d4bl.org/ [https://perma.cc/3V76-B5E4]; Nat’l Network to End Domestic Violence, https://nnedv.org/ [https://perma.cc/DC5V-PNYS].
    1. I recognize the complexity in identifying specific groups. Many groups that are subordinated by structures of power are also part of structures of power that oppress others. This is not only true because individuals possess intersectional identities that connect them to different positions in power relations, but also because data processing empowers some and not others. Suffice it to say, policymakers’ goal should not be to do what industry thinks is possible or preferable, but to listen to those who are harmed.
    1. Danielle Citron proposed a cyber civil rights agenda more than a decade ago. See Danielle Keats Citron, Cyber Civil Rights, 89 B.U. L. Rev. 61, 61 (2009).
    1. DATA § 301(b)(1).
    1. Id. at § 104 (shifting the burden of proof to the data aggregator to show absence of discrimination or other alternatives to discrimination).
    1. Id.
    1. Brown Releases New Proposal that Would Protect Consumers’ Privacy from Bad Actors, Sherrod Brown: U.S. Senator of Ohio (June 18, 2020), https://www.brown.senate.gov/newsroom/press/release/brown-proposal-protect-consumers-privacy [https://perma.cc/HG6L-933M]; Statements by Privacy Experts and Civil Rights and Consumer Organizations, U.S. Senate Comm. Banking, Hous. & Urban Affs. (2020), https://www.banking.senate.gov/imo/media/doc/DATA%202020%20-%20statements%20by%20organizations2.pdf [https://perma.cc/8JRS-9AJN].
    1. Frank Pasquale, Licensing Big Data Analytics in an Era of Invasive and Contested AI, at *1 (unpublished manuscript on file with author).
    1. Id.
    1. Id.
    1. Id. at *12.
    1. See supra Part III.B.3.
    1. See Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power 94, 195, 345–47 (2019).
    1. Danielle Keats Citron & Daniel J. Solove, Privacy Harms, 102 B.U. L. Rev. (forthcoming 2022) (manuscript at 3), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3782222 [https://perma.cc/8W7H-TX8E].
    1. Cohen, supra note 9.
    1. Karen Hao, The UK Exam Debacle Reminds Us that Algorithms Can’t Fix Broken Systems, MIT Tech. Rev. (Aug. 20, 2020), https://www.technologyreview.com/2020/08/20/1007502/uk-exam-algorithm-cant-fix-broken-system/ [https://perma.cc/E5FG-ER42].
    1. See, e.g., Anita L. Allen, Uneasy Access: Privacy for Women in a Free Society (1988); Citron, supra note 333, at 1890–97; Citron, supra note 367; Danielle Keats Citron & Mary Anne Franks, Criminalizing Revenge Porn, 49 Wake Forest L. Rev. 345 (2014).
    1. See, e.g., Bridges, supra note 337; Noble, supra note 278, at 1, 27–28; Benjamin, supra note 278.
    1. See, e.g., Skinner-Thompson, supra note 336.