Regulating the “Most Accessible Marketplace of Ideas in History”: Disclosure Requirements in Online Political Advertisements After the 2016 Election
The libertarian regulatory environment of online political advertising has come under renewed scrutiny as news reports continue to detail the extent of Russian interference with the 2016 presidential election. For years, Silicon Valley has resisted Washington, D.C.’s efforts to regulate online political advertising. Tech companies feared regulation would threaten not only their business models, but also the Internet’s status as the “most accessible marketplace of ideas in history.”[2] But can America’s democracy continue to tolerate lax regulation of online political advertising? Overwhelming evidence of Russian operatives spreading divisive messages across online platforms during the 2016 presidential election demands a government response. In fact, Congress is now debating the Honest Ads Act, and the Federal Election Commission is considering regulations to increase the transparency of online political advertisements. Facing the specter of regulation, Facebook, Google, and Twitter have updated their policies governing online political advertising.
This Note argues that Congress should pass the Honest Ads Act, which requires disclosure for online political advertising and makes reasonable efforts to stop foreign interference with elections. Disclosure requirements are important because they provide information to voters, deter corruption, and facilitate enforcement of campaign laws. The Supreme Court has long upheld disclosure requirements, including in its controversial Citizens United v. Federal Election Commission decision. But disclosure requirements are likely only a partial solution. Laws and regulations may struggle to reach the trolls and bots that spread Russian disinformation during the 2016 presidential election without infringing the First Amendment. To rein in these bad actors, American democracy will have to rely on Silicon Valley to police online platforms rather than on Washington, D.C.
This Note first provides an overview of the development of disclosure requirements in the Supreme Court’s jurisprudence. It then describes the libertarian regulatory environment of online political advertising and Silicon Valley’s efforts to self-regulate in the wake of the 2016 presidential election. Finally, the Note advocates for the passage of the Honest Ads Act and discusses the limits of regulating online political advertising.
Introduction
After the election, I made a comment that I thought the idea misinformation on Facebook changed the outcome of the election was a crazy idea. Calling that crazy was dismissive and I regret it. This is too important an issue to be dismissive.
–Mark Zuckerberg, September 27, 2017.[3]
Russian interference with the 2016 presidential election has renewed concerns over the lack of regulation governing online political advertising. As early as 2014, a Kremlin-linked Russian company called the Internet Research Agency started to interfere with the US political system.[4] Using false American personas, the Internet Research Agency and its operatives bought political advertisements from April to November 2016.[5] Its objective was illicit: to promote then-candidate Donald Trump and to denounce Hillary Clinton.[6] Its operatives posed as Americans while operating social media pages and groups, and even organized political rallies across the United States through public posts on their false American persona social media accounts.[7]
Nearly a year after the 2016 presidential election, Facebook, Twitter, and Google executives acknowledged to Congress that Russia’s disinformation campaign exploited their online platforms to influence the election.[8] Several members of Congress expressed frustration and disbelief with these companies’ lackadaisical reactions to Russian interference. At the Senate Intelligence Committee hearing on November 1, 2017, Senator Chris Coons asked, “Why has it taken Facebook 11 months to come forward and help us understand the scope of this problem, see it clearly for the problem it is, and begin to work in a responsible way to address it?”[9] Senator Dianne Feinstein chastised the executives: “I don’t think you get it . . . . What we’re talking about is the beginning of cyberwarfare. What we’re talking about is a major foreign power with sophistication and ability to involve themselves in a presidential election and sow conflict and discontent all over this country.”[10]
But for too long, Congress and the Federal Election Commission (FEC) have adopted a laissez-faire approach to online political advertising. For several years, tech industry lobbyists and lawyers successfully fought off proposed regulations.[11] As a result, “[n]o agency regulates political advertisements on the Internet with any real scrutiny, and disclosure around them severely lacks transparency.”[12] In addition, although federal law prohibits foreign nationals and entities from spending money to attempt to influence elections, it does not expressly ban them from spending money on online political advertising.[13] Professor Nathaniel Persily describes online campaigning as “the political equivalent of the Wild West without sheriffs.”[14]
This lack of regulation has opened a door for foreign spending on online political ads and fake accounts. During the 2016 presidential campaign, Facebook sold over $100,000 worth of ads to the Internet Research Agency, and Google sold more than $4,700 to Kremlin-linked accounts.[15] Facebook estimates that 126 million Facebook users saw political ads purchased by the Internet Research Agency.[16] The content of these Russian ads ranged from depictions of “Buff Bernie” promoting gay rights to Jesus arm-wrestling Satan who exclaims his support for Hillary Clinton.[17] Many seemed to touch on polarizing subjects and were intended to tilt the 2016 presidential election toward Donald Trump and away from Hillary Clinton.[18] In addition, Twitter reports that Russian-linked operatives controlled 2,752 accounts and that at least 36,000 “bots”—human-impersonating robots—tweeted 1.4 million times during the 2016 presidential campaign.[19] Google states that Russian-linked operatives uploaded more than 1,000 videos to its video platform, YouTube.[20]
Some believe the numbers reported by the online platforms are too low. During the Senate hearing, Senator Mark Warner questioned Twitter’s estimate: “Twitter seems to be vastly underestimating the number of fake accounts and bots pushing disinformation. Independent researchers have estimated that up to 15 percent of Twitter accounts—or potentially 48 million accounts—are fake or automated.”[21] The fact that Facebook revised its original estimate of Facebook users who saw Russia-linked content from 10 million to 126 million did not inspire confidence among lawmakers that these companies grasped the extent of the problem.[22]
In response to Russia’s disinformation campaign, Congress and the FEC have taken steps to regulate online political advertising. On October 19, 2017, Senators Amy Klobuchar, Mark Warner, and John McCain introduced the Honest Ads Act to enhance transparency in online political advertising. The Act requires Internet companies to disclose the identity of those buying online political advertisements.[23] Surprisingly, even the frequently divided FEC voted unanimously on November 16, 2017 to initiate the rulemaking process for Internet communication disclaimers.[24] However, the current political climate may stymie efforts to bring about any real reforms: a divided Congress seems unlikely to pass the Act, and an FEC mired in partisanship appears unlikely to reform regulations.[25] Until Congress or the FEC acts, American democracy will have to rely on tech companies to police themselves.
This Note argues that Congress should pass the Honest Ads Act. It would introduce disclosure requirements for online political advertising and make reasonable efforts to thwart foreign interference with American elections. From Buckley v. Valeo to Citizens United v. Federal Election Commission, the Supreme Court has long upheld disclosure requirements.[26] Disclosure remains one of the few tools available to promote the public interest through campaign finance law.[27]
But enacting disclosure requirements is only a first step. Transparency allows online political ads to come under public scrutiny, but it may not entirely prevent foreign actors from exploiting online platforms to spread divisive messages. The fake accounts and bots that Russian operatives created may pose a larger danger than ads. Fake accounts can spread viral content, which can be more persuasive than ads in influencing people online.[28] Moreover, authentic accounts from anywhere in the world can post videos to disseminate propaganda.
This Note proceeds in four parts. Part I provides a historical overview of campaign finance case law with a particular focus on disclosure. Part II describes the “libertarian regulatory environment” for online political communication that enabled Russian-linked operatives to disrupt the 2016 presidential election.[29] It examines FEC efforts to regulate online political communication and efforts by tech industry lobbyists and lawyers to keep online political communication free from government regulation. Part III discusses how online platforms updated their internal policies as government officials threatened regulation. Part IV advocates for the enactment of the Honest Ads Act. It also considers challenges and limits to regulating online political ads.
I. The Development of Disclosure Requirements
The first federal disclosure law went into effect in 1910.[30] Since then, disclosure requirements have withstood First Amendment challenges.[31] In 1976, Buckley upheld the disclosure requirements for campaign expenditures and contributions in the Federal Election Campaign Act of 1971 (FECA).[32] As Buckley identified, disclosure requirements serve three governmental interests: (1) providing information to voters, (2) discouraging corruption, and (3) facilitating enforcement of campaign laws.[33] More recently, in 2010, the Supreme Court voted 8-1 in Citizens United to uphold the disclosure requirements in the Bipartisan Campaign Reform Act of 2002 (BCRA)—despite splitting 5-4 on the controversial independent expenditure question.[34]
The main constitutional challenge to disclosure comes “from its potential chilling effect on speech and association.”[35] If people expect that they must disclose their identities, some may be deterred from participating in an activity, like “voting, joining a political party, signing a petition, circulating a pamphlet, or contributing or spending campaign money.”[36] Disclosure may stir up fears of retaliation or privacy concerns.[37] Below is an overview of the modern development of disclosure requirements in the Supreme Court’s jurisprudence from Buckley to Citizens United.
A. From Buckley to McIntyre
In Buckley, the Supreme Court found FECA’s disclosure provisions constitutional.[38] Generally, FECA required political committees to register with the FEC, maintain records of contributions and expenditures, and file quarterly reports with the FEC listing the name, address, occupation, and principal place of business of any person who contributed over $100.[39] It also required any individual or group, other than a political committee or candidate, to report any contributions or expenditures exceeding $100.[40]
After acknowledging that compelled disclosure “has the potential for substantially infringing the exercise of First Amendment rights,” the Court upheld these disclosure provisions under “exacting scrutiny,” which requires a “substantial relation” between disclosure and a “sufficiently important” governmental interest.[41] The Court identified three governmental interests “sufficiently important to outweigh the possibility of infringement, particularly when the ‘free functioning of our national institutions’ is involved.”[42] First, disclosure gives information to voters that aids them in evaluating federal candidates.[43] Information regarding the origin of campaign money “alert[s] the voter to the interests to which a candidate is most likely to be responsive and thus facilitate[s] predictions of future performance in office.”[44] Second, disclosure discourages corruption and avoids the appearance of corruption.[45] It allows the public to track any connections between campaign contributors and their candidates to “detect any post-election special favors that may be given in return.”[46] Third, disclosure facilitates the data collection needed to provide enforcement of campaign finance laws, such as contribution limits.[47]
Despite these three governmental interests—information, anticorruption, and enforcement—the Court in Buckley acknowledged that disclosure, in some cases, could subject minor parties or independent candidates to retaliation or harassment.[48] Appellants argued that just as the Court shielded the NAACP’s membership records from Alabama’s attempt to compel disclosure, the Court should establish a blanket exemption for minor parties and independent candidates.[49] Buckley rejected a blanket exemption for these actors, but it permitted an exemption if they offered evidence showing “a reasonable probability that the compelled disclosure of a party’s contributors’ names will subject them to threats, harassment, or reprisals from either government officials or private parties.”[50] A later Supreme Court case, Brown v. Socialist Workers ‘74 Campaign Committee, recognized this exemption and found that an Ohio disclosure statute could not apply to the Socialist Workers Party after a showing of harassment and threats against party members.[51] Subsequent Supreme Court cases mirrored Buckley’s acceptance of disclosure requirements and demonstrated the application of this exemption.[52]
But the disclosure jurisprudence shifted to some extent in McIntyre v. Ohio Elections Commission.[53] McIntyre struck down an Ohio statute that mandated the disclosure of the author’s name and address on campaign literature for candidates or ballot propositions.[54] Mrs. McIntyre disseminated unsigned pamphlets expressing her opposition to a ballot referendum, and the Ohio Elections Commission fined her for violating Ohio’s disclosure statute.[55] The Court expounded on the importance of anonymous political speech during a campaign: “[A]n author’s decision to remain anonymous, like other decisions concerning omissions or additions to the content of a publication, is an aspect of the freedom of speech protected by the First Amendment.”[56] Because the disclosure statute burdened core political speech, the Court again applied “exacting scrutiny” to determine whether the disclosure requirement was “narrowly tailored to serve an overriding state interest.”[57] The Court in McIntyre found no state interest justifying the infringement of Mrs. McIntyre’s First Amendment right to engage in anonymous political speech.
The Court held that “anonymous pamphleteering is not a pernicious, fraudulent practice, but an honorable tradition of advocacy and of dissent.”[58] It added that “[a]nonymity is a shield from the tyranny of the majority” and “thus exemplifies the purpose behind the Bill of Rights” and the “First Amendment in particular.”[59] This purpose was “to protect unpopular individuals from retaliation—and their ideas from suppression—at the hand of an intolerant society.”[60]
Significantly, the Court distinguished the Ohio statute in McIntyre from FECA in Buckley.[61] FECA governed solely “candidate elections, not referenda or other issue-based ballot measures.”[62] The McIntyre Court stated that during candidate elections, there is likely “a compelling state interest in avoiding the corruption that might result from campaign expenditures.”[63] In that context, “[d]isclosure of expenditures lessens the risk that individuals will spend money to support a candidate as a quid pro quo for special treatment after the candidate is in office.”[64]
B. From McConnell to Citizens United
In two cases, McConnell v. FEC and Citizens United v. FEC, the Supreme Court continued to uphold disclosure requirements. But before delving into these cases, it is important to provide some background on BCRA, the statute at issue in McConnell and Citizens United.
Before the passage of BCRA, FECA’s disclosure requirements covered “express advocacy” ads but not “issue advocacy” ads.[65] Express advocacy refers to communications that expressly advocate for or against federal candidates.[66] In a famous footnote, Buckley distinguished express advocacy from issue advocacy.[67] It stated that the “use or omission of ‘magic words’ . . . marked a bright statutory line” differentiating the two categories of advertising.[68] “Magic words” that signaled express advocacy included “communications containing express words of advocacy of election or defeat, such as ‘vote for,’ ‘elect,’ ‘support,’ ‘cast your ballot for,’ ‘Smith for Congress,’ ‘vote against,’ ‘defeat,’ ‘reject.’”[69] But the bright line was not always bright. Either express or issue advocacy ads could “advocate the election or defeat of clearly identified federal candidates, even though the so-called issue ads eschewed the use of magic words.”[70]
One consequence of this unclear distinction was the use of “sham issue advocacy.” Sham issue advocacy refers to ads that look like “express advocacy” but do not use the “magic words” of express advocacy.[71] Without these “magic words,” these ads fell outside the scope of FECA’s disclosure requirements, which only covered express advocacy.[72]
One problem BCRA sought to resolve was sham issue advocacy.[73] Title II of BCRA created the term “electioneering communication” to close the issue advocacy loophole.[74] Whereas FECA’s disclosure requirements reached only express advocacy, BCRA defined the term “electioneering communication” to encompass any broadcast, cable, or satellite communication that:
(I) refers to a clearly identified candidate for Federal office;
(II) is made within—
(aa) 60 days before a general, special, or runoff election for the office sought by the candidate; or
(bb) 30 days before a primary or preference election, or a convention or caucus of a political party that has authority to nominate a candidate, for the office sought by the candidate; and
(III) in the case of a communication which refers to a candidate for an office other than President or Vice President, is targeted to the relevant electorate.[75]
That is, BCRA prohibited corporations or unions from spending treasury money on television or radio if they merely mentioned a candidate’s name and targeted the relevant electorate before an election.[76]
Although Citizens United eventually struck down Title II of BCRA, BCRA’s disclosure requirements survived both McConnell and Citizens United.[77] First, BCRA section 311 requires anyone, besides a candidate, who is funding a televised electioneering communication to include a disclaimer that “__ is responsible for the content of this advertising.”[78] The disclaimer must be delivered in a “clearly spoken manner” and shown on the screen in a “clearly readable manner” lasting at least four seconds.[79] It must say that the communication “is not authorized by any candidate or candidate’s committee” and must present “the name and address (or Web site address) of the person or group that funded the advertisement.”[80] Second, BCRA section 201 requires any individual who spends in excess of $10,000 on electioneering communications to file a disclosure statement with the FEC.[81] The disclosure statement must identify the person spending the money, the amount spent, the election the communication targeted, and the names of particular contributors.[82]
McConnell held that BCRA sections 201 and 311 were constitutional, explaining that they would help citizens “make informed choices in the political marketplace,” even if disclosure requirements may infringe on the freedom of speech.[83] In McConnell, the Court concluded that BCRA’s disclosure requirements furthered the government interests of “[1] providing the electorate with information, [2] deterring actual corruption and avoiding any appearance thereof, and [3] gathering the data necessary to enforce more substantive electioneering restrictions.”[84] The factual record demonstrated that independent groups ran campaign advertisements “while hiding behind dubious and misleading names.”[85]
Citizens United upheld BCRA’s disclosure requirements against an as-applied challenge. Citizens United was a nonprofit that distributed a documentary about presidential candidate Hillary Clinton through a video-on-demand channel available free of charge to viewers.[86] To publicize the film, Citizens United created three ads for broadcast and cable television, with each ad including a brief statement about then-Senator Clinton, the title of the film, and the film’s website.[87] Citizens United argued that BCRA’s disclaimer and disclosure requirements were unconstitutional as applied to the film and the three ads.[88]
The Court disagreed, holding that BCRA’s disclosure requirements were constitutional and justified by the governmental interest in providing information.[89] The ads constituted “electioneering communications” because they identified Clinton by name shortly preceding a primary election and had “pejorative references to her candidacy.”[90] The required disclaimers gave voters information and ensured that they were “fully informed” of the individual or group speaking.[91] The Court stated, “At the very least, the disclaimers avoid confusion by making clear that the ads are not funded by a candidate or political party.”[92] Similarly, disclosure helped inform the public about “who is speaking about a candidate shortly before an election.”[93]
The Court also rejected the as-applied challenge because Citizens United failed to show that its members faced threats or retaliation.[94] McConnell had reaffirmed that compelled disclosure would be unconstitutional as applied to a group “if there were a reasonable probability that the group’s members would face threats, harassment, or reprisals if their names were disclosed.”[95] Citizens United argued that disclosure requirements “can chill donations to an organization by exposing donors to retaliation.”[96] But Citizens United had disclosed its contributors for years and never identified an “instance of harassment or retaliation.”[97]
As these cases illustrate, the Supreme Court has long accepted disclosure requirements in campaign finance laws and regulations. Even Citizens United—frequently invoked as a divisive decision—considered disclosure a safe regulation. Looking forward, the FEC or Congress should introduce similar disclosure requirements to online political advertising. As the 2016 election demonstrated, unregulated online political communication can attract disruptors.
II. The Libertarian Regulatory Environment
The FEC and Congress have generally taken a laissez-faire approach to online political communication with only a few exceptions.[98] FECA and FEC regulations require disclaimers for a political committee’s public communications that expressly advocate for a federal candidate or solicit contributions.[99] Moreover, FEC regulations require disclaimers for emails that political committees send to over 500 people, for websites that political committees make available to the public, and for Internet advertising that political committees pay for on other people’s websites.[100] Aside from these exceptions, “online communications are uniquely unregulated.”[101]
Currently, under FECA and FEC regulations, a political committee that makes a public communication must include a disclaimer.[102] A “public communication” encompasses “any broadcast, cable, or satellite communication, newspaper, magazine, outdoor advertising facility, mass mailing, or telephone bank to the general public, or any other form of general public political advertising.”[103] Notably absent from this list is online communication. FEC regulations provide that “general public political advertising” does not “include communications over the Internet, except for communications placed for a fee on another person’s Web site.”[104] That is, paid Internet advertising, such as paying to “place a banner, video, or pop-up advertisement on another person’s website,” requires a disclaimer.[105]
Disclaimer rules vary depending on the purchaser and authorizer of the ad. If “a candidate, an authorized committee of a candidate, or an agent of either” purchases and authorizes the public communication, the disclaimer must declare “that the communication has been paid for by the authorized political committee.”[106] If “a candidate, an authorized committee of a candidate, or an agent of either” authorizes the public communication, but someone else purchases it, the disclaimer must name the purchaser and state who authorized the public communication.[107] If “a candidate, an authorized committee of a candidate, or an agent of either” does not authorize a public communication, “the disclaimer must clearly state the full name and permanent street address, telephone number, or World Wide Web address of the person who paid for the communication, and that the communication is not authorized by any candidate or candidate’s committee.”[108]
In opposing regulation of online political ads, tech companies typically invoked two exceptions to these general disclaimer requirements. First, FEC regulations do not mandate disclaimers for “small items,” such as “bumper stickers, pins, buttons, pens, and similar small items upon which the disclaimer cannot be conveniently printed.”[109] Second, FEC regulations provide an exception for “impracticable” items: “Skywriting, water towers, wearing apparel, or other means of displaying an advertisement of such a nature that the inclusion of a disclaimer would be impracticable.”[110]
Below is a brief historical overview of how the FEC largely left online political communication unregulated.[111] In particular, it details how some of the FEC’s limited efforts to regulate ultimately failed. While some FEC Commissioners have advocated for modernizing campaign finance regulations to adapt to the rise of online advertising, others have resisted.[112]
A. The FEC’s Initial Efforts to Regulate Online Political Communication
In 2002, when the FEC issued regulations to implement BCRA provisions, it excluded online political communication from the definition of “public communication” and thus from the disclaimer rules governing “public communication.”[113] BCRA did not explicitly specify Internet communication as a form of “public communication” that came within the scope of disclaimer rules.[114] Still, the FEC applied disclaimer rules to two types of online political communication: (1) unsolicited emails that political committees send to more than 500 people and (2) websites that political committees make available to the public.[115] In 2004, in Shays v. Federal Election Commission, a US District Court found the FEC’s blanket exclusion of online political communication from “public communication” impermissible.[116] Thereafter, the FEC initiated rulemaking to respond to the court decision.[117]
In 2006, the FEC promulgated the so-called “internet exemption rule.”[118] The rule amended the definition of “public communication” to include paid Internet advertising on someone else’s website:
Public communication means a communication by means of any broadcast, cable, or satellite communication, newspaper, magazine, outdoor advertising facility, mass mailing, or telephone bank to the general public, or any other form of general public political advertising. The term general public political advertising shall not include communications over the Internet, except for communications placed for a fee on another person’s Web site.[119]
The FEC explained that paid Internet advertising resembled mass mailing, especially given the growing popularity of Internet advertising.[120] As the public relied increasingly on “the Internet for information and entertainment,” advertisers took advantage of the Internet’s “new marketing opportunities.”[121] BCRA did not specify the Internet as a form of “public communication,” but the FEC concluded that an Internet communication is a “public communication” solely if “it is a form of advertising and therefore falls within the catch-all category of ‘general public political advertising.’”[122] The FEC looked to dictionaries to explain that the word “advertising” suggests “a communication for which a payment is required, particularly in the context of campaign messages.”[123] It then distinguished “blogs and other websites” people use for free to communicate with the public from “[c]ommunications placed for a fee on another person’s website.”[124] Therefore, because paid Internet advertising on another’s website is “general public political advertising” and thus “public communication,” such advertising fell within the scope of the disclaimer rules governing “public communication.”[125]
The FEC justified its Internet exemption rule by describing the Internet “as a unique and evolving mode of mass communication and political speech that is distinct from other media in a manner that warrants a restrained regulatory approach.”[126] It elaborated:
The Internet’s accessibility, low cost, and interactive features make it a popular choice for sending and receiving information. Unlike other forms of mass communication, the Internet has minimal barriers to entry, including its low cost and widespread accessibility. Whereas the general public can communicate through television or radio broadcasts and most other forms of mass communication only by paying substantial advertising fees, the vast majority of the general public who choose to communicate through the Internet can afford to do so.[127]
In addition, mindful of “the important purpose of BCRA in preventing actual and apparent corruption,” the FEC recognized that no evidence exists that Internet activities pose “any significant danger of corruption or the appearance of corruption.”[128]
In 2010 and 2011, the FEC took two noteworthy actions that continued the FEC’s “restrained regulatory approach” toward online political communication.[129] First, in a 2010 Advisory Opinion (AO), the FEC found that Google need not include disclaimers on text ads generated when users performed searches on Google’s search engine.[130] Google’s AdWords program created text ads based on keywords an advertiser selected.[131] Text ads included “a headline which can consist of up to 25 characters, and two lines of text and a display Uniform Resource Locator (‘URL’) which can consist of up to 70 characters.”[132] When a user inputted search terms that corresponded with the selected keywords in the search engine, AdWords created text ads that appeared next to the search results.[133] Google sought to sell text ads to candidates and political committees without displaying a disclaimer identifying “who authorized or paid for the ad.”[134] Instead, a disclaimer appeared on the advertiser’s website after a user clicked on the ad.[135]
In a 4-2 vote, the FEC issued an AO concluding that Google’s conduct did not violate FECA or FEC regulations.[136] However, the majority split in its rationale.[137] Three commissioners found Google in compliance because the text ads show “the URL of the political committee’s website and the [advertiser’s website] contains a full disclaimer as required by 11 CFR 110.11.”[138] These commissioners recognized that “[i]ncluding the full name of the political committee could require more characters for the disclaimer than are allowed for the text ad itself.”[139] In a separate opinion, the fourth member of the majority saw no violation because text ads fell under the “impracticable” exception from disclaimer requirements.[140] When it adopted this AO, the FEC invited other online ad providers to request AOs to determine whether they needed to include a disclaimer in ads.[141]
The FEC’s second action came in response to Facebook’s reply to this invitation. In 2011, Facebook requested an AO to confirm whether its “small, character-limited ads qualif[ied] for the ‘small items’ and ‘impracticable’ exceptions” to disclaimer requirements.[142] Facebook sold two types of ads on its platform: standard ads and “Sponsored Stories.”[143] A standard ad included a miniature image, but was limited to “25 characters to utilize in the ad’s title and 135 characters in the ad’s body.”[144] Facebook users could “like” the ad purchaser’s Facebook “Page” to broadcast their endorsement of the ad purchaser’s Page on their “News Feed.”[145] To indicate that someone purchased the ad, Facebook placed the word “Sponsored” in the upper left-hand corner of the ad.[146]
Sponsored Stories appropriated “free” content from the ad purchaser’s Page and showed it to targeted Facebook users as an ad.[147] For example, when a purchaser bought a “Page Like” ad, users would see a Sponsored Story showing that their Facebook Friends “like” a Page.[148] Sponsored Stories were smaller than standard ads.[149] They were also character-limited, displaying only up to 100 characters.[150] As with standard ads, Facebook placed the word “Sponsored” in the upper left-hand corner to indicate that someone had purchased the ad.[151]
The FEC deadlocked 3-3 on Facebook’s AO request and, as a result, the FEC did not issue an AO.[152] Three commissioners concluded that these ads did not fall under the “small items” or “impracticable” exceptions to disclaimer requirements.[153] With respect to the “small items” exception, these commissioners thought no “physical limitations of the display medium or Internet technology” mandated character-limited Facebook ads.[154] It was “physically and technologically possible” to increase the ad size and character limit.[155] These commissioners distinguished the technological capabilities of Internet ads from the “small items” enumerated in the regulation: “Internet ads are not similar to bumper stickers, pins, buttons, or pens, nor do they involve ‘printed’ disclaimers. . . . Internet ads may include rollover displays, links, or other technological means of providing additional information, such as statutorily mandated disclaimers.”[156] Regarding the “impracticable” exception, these commissioners stated again that no “physical or technological limitations” of the “display medium or Internet technology” made it impracticable to include a disclaimer in a Facebook ad.[157]
The other three commissioners came to a different conclusion, finding that Facebook ads fell within the “impracticable” exception and thus did not need to include disclaimers.[158] With the character limit of standard ads, disclaimers would take up a significant number of characters available to ad purchasers.[159] Due to their smaller size, Sponsored Stories would not be able to “accommodate any type of additional disclaimer,” because an ad purchaser could not choose to include any additional text in the ad.[160] These commissioners recognized “it may be technologically possible for Facebook to modify the character limitations,” but noted that the FEC’s disclaimer exceptions “take an entity’s existing advertising model as it is.”[161]
As the FEC published no AO on the matter and could not answer Facebook’s request, Facebook “proceeded as if it was exempt from the disclaimer requirement.”[162] However, the FEC still attempted to resolve whether online ads should include disclaimers and considered initiating rulemaking. When Google and Facebook requested AOs in 2010 and 2011, the FEC received one public comment asking the FEC to initiate the rulemaking process for Internet disclaimer requirements “in light of technological developments in Internet advertising.”[163] So, in 2011, the FEC approved an Advance Notice of Proposed Rulemaking (ANPRM), seeking comment on revising the rules governing Internet disclaimers in 11 C.F.R. 110.11.[164] In response to the ANPRM, the FEC received only seven comments, and did not promulgate a rule.[165] Notably, Facebook commented on the ANPRM and resisted rulemaking, exhorting the FEC “not to stand in the way of innovation.”[166]
B. Sounding the Alarm
In late 2014, then-Vice Chair Ann Ravel began to alert the public of the dangers of an unregulated Internet advertising industry.[167] She cautioned that regulations have not kept pace with “modern technological phenomena like social media, YouTube and bots.”[168] In October 2015, she even “warned that Vladimir Putin could meddle in our elections.”[169] Under the existing, “antiquated” regulatory regime, regulators and citizens could not “determine if the funding for a political advertisement online came from a domestic source or an enemy abroad.”[170]
In October 2014, a kerfuffle erupted between Ravel and other FEC commissioners. The FEC deadlocked again 3-3 on a complaint filed against a dark money group that distributed two political attack ads on YouTube.[171] The group, Checks and Balances, did not dispute that it created and released the videos, but denied that the videos violated FECA because they “were run only on the Internet [i.e., YouTube]” and therefore “required no disclaimer and no reporting to the FEC.”[172] In response to the complaint, then-Vice Chair Ravel issued a Statement of Reasons, urging a reappraisal of the FEC’s regulatory approach toward online political advertising.[173] She warned that although online political advertising had become prevalent, the FEC had failed to adapt to this change and needed to recognize the significance of transparency “no matter what the medium of political communication.”[174] As the FEC’s earlier efforts to regulate online political communication received “only limited feedback” from the tech community, Ravel called on the FEC to reach out to tech companies to ensure the commission developed sound policy.[175]
The Republican commissioners claimed that “increased transparency in internet political advertising was censorship.”[176] They argued that mandating financial disclosure “could threaten the continued development of the internet’s virtual free marketplace of political ideas and democratic debate.”[177] The day after Ravel issued her statement, Commissioner Lee Goodman appeared on Fox News to criticize it.[178] Goodman ignored Ravel’s emphasis on online political advertising, which by definition is a paid communication, and implied that Ravel wanted unpaid writings on the Internet to come under the FEC’s scrutiny: “If we start regulating free YouTube posts, I want you to see what we’d be doing . . . . We would be regulating the speech itself and not the expenditure for speech. And so I don’t think we have the regulatory authority to do that.”[179]
On October 18, 2016, the FEC reopened the comment period for the 2011 ANPRM “in light of legal and technological developments since the notice was published.”[180] Since the 2011 ANPRM, the FEC had considered the disclaimer issue in two “new factual contexts,” but had failed to reach a majority decision.[181] First, in 2014, the FEC deadlocked 3-3 and was unable to issue an AO on whether mobile phone banner ads were exempt from disclaimer requirements.[182] Second, in 2016, the FEC again deadlocked 3-3 and was unable to resolve whether the Twitter profiles and tweets of political candidates and parties required disclaimers.[183] During this reopened comment period in late 2016, the FEC received only six comments and did not promulgate a rule.[184]
On February 19, 2017, Ravel announced her intention to resign from the FEC due to its dysfunction.[185] In a radio interview, Ravel identified several causes of this dysfunction.[186] One was the structure of the FEC.[187] In an agency where six commissioners sit, no more than three commissioners can come from the same political party, and four affirmative votes are required to act on a matter.[188] Usually, the FEC seats three Republican commissioners and three Democratic commissioners.[189] Historically, this division worked despite partisan affiliations.[190] According to Ravel, “[R]ecently, the commissioners—mostly on the Republican side—have operated as a bloc to ensure that you can never get four votes to either investigate matters, to do regulations, to explain to the public how to comply with the law—even to appoint a general counsel.”[191] On her way out, Ravel published a report highlighting the gridlock at the FEC.[192] Entitled “Dysfunction and Deadlock: The Enforcement Crisis at the Federal Election Commission Reveals the Unlikelihood of Draining the Swamp,” her analysis showed that “the rate of deadlocked votes blocking ‘substantive’ enforcement actions against possible campaign violations has reached a new high of 37.5 percent.”[193]
III. Responding to Election Interference
Despite this dysfunction, the prospect of government regulation has not disappeared. Prior to the 2016 election, Professor Nathaniel Persily predicted “the principal regulator of [online] political communication will not be a government agency but rather the internet portals themselves.”[194] However, he noted, “Regulation will and does happen.”[195] Indeed, the increased news coverage of Russian-linked entities purchasing online political ads to influence the 2016 election has increased the prospect of the government stepping in to regulate online political communication. Facebook, Google, and Twitter have come under pressure to act. In response, these companies have expressed a willingness to cooperate with government efforts to regulate and have developed internal policies to bring more transparency to online political communication on their platforms.
A. Self-Regulation: Online Platforms Create Their Own Policies
At first, major online platforms, like Facebook, largely ignored their role in enabling Russian operatives to spread disinformation in the 2016 election.[196] Mark Zuckerberg responded to criticism of Facebook’s role in the election by noting that people on all sides are upset because “that’s what running a platform for all ideas looks like.”[197] He also contended that Facebook positively impacted the 2016 election by giving more people a “voice” while deemphasizing the role disinformation played.[198] But as Senator Chris Coons noted, it took Facebook eleven months to help Congress understand the scope of Russian interference.[199] Mindful of public pressure and their scheduled testimony before Congress, Facebook, Google, and Twitter began to announce new policies designed to strengthen transparency on their platforms.
1. Facebook
A few days prior to its appearance before Congress in 2017, Facebook announced an update on its efforts to improve transparency in all advertising on its platform.[200] According to Facebook, by the 2018 midterm elections, it would require ad purchasers to associate their ads with a Facebook Page, where users would be able to see the advertiser’s ads on that Page.[201] In addition, Facebook intended to establish “an archive of federal-election related ads so that [it could] show both current and historical federal-election related ads.”[202] For each political ad, Facebook would record the ad in a searchable archive, include the amount of money spent, and provide information on the audience (e.g., demographic information).[203]
Facebook planned to demand additional documentation from political advertisers.[204] Part of that documentation process might require advertisers to “identify that they are running election-related advertising and verify both their entity and location.”[205] Facebook would then require advertisers to incorporate a disclaimer in political ads that says, “Paid for by.”[206] Clicking on the disclaimer would direct users to details about the advertiser and explain “why [the user] saw that particular ad.”[207] In April 2018, Facebook clarified that it would extend these requirements to anyone seeking to run not only electoral ads but also “issue ads.”[208] It also announced that it would verify the identity and location of anyone who runs Pages with a large following.[209]
2. Google
Like Facebook, Google announced its intent to strengthen its transparency in 2017. Google indicated it would require all advertisers creating election-related ads on Search, YouTube, and Display to identify themselves on Google’s “Why This Ad” icon.[210] Google users would be able to click the “Why This Ad” icon, which is found on all ads, to understand why they saw an ad.[211] Google would require ad purchasers to give information about the ad sponsor and would include that information in the “Why This Ad” information screen for election-related ads.[212] Google noted that “because the icon and its self-identifying information can appear as part of any ad of any size, this type of solution promotes accountability and ensures foreign nationals and other bad actors will have less ability to go unnoticed when interfering in US elections or disseminating false information.”[213]
3. Twitter
Twitter also publicized its efforts to improve transparency on its platform in 2017. It announced the launch of its “Transparency Center,” where users could see: “[1] All ads that are currently running on Twitter, including Promoted-Only ads; [2] How long ads have been running; [3] Ad creative associated with those campaigns; [4] Ads targeted to [particular users], as well as personalized information on which ads [users] are eligible to receive based on targeting.”[214] Twitter would mark electioneering ads with a visual indicator and would designate a section of the Transparency Center for electioneering ads.[215] This section would show all ads on Twitter, the amount spent on each ad, identifying information of the organization paying for the ad, “targeting demographics,” and the “historical data” of each advertiser.[216] Moreover, Twitter announced its intention to work with its partners to improve transparency around issue ads.[217]
B. Changing Their Tune: Tech Welcomes Regulation
Ultimately, Facebook, Google, and Twitter’s new policies failed to deflect the government’s focus on online political communication. As these tech companies began to increase their self-policing efforts, the FEC also began to initiate regulatory efforts. But politics may hinder the FEC from enacting meaningful change any time soon.
On October 10, 2017, due to ongoing news coverage of Russian interference in the 2016 election, the FEC reopened the comment period again for the 2011 ANPRM.[218] Despite lobbying against regulation of online political communication in the past, Facebook, Google, and Twitter submitted comments to the notice indicating general support for greater regulation.[219] One reason for this change of heart was a desire to hold all tech companies to the same standards. As Facebook pointed out, its self-policing efforts “could have the unintended consequence of pushing purchasers who wish to avoid disclosure to use other, less transparent platforms.”[220] In their comments, the tech companies emphasized a need for the FEC to keep the nature of their websites in mind as the FEC began to draft regulations.[221] Soon after, in a surprising unanimous decision for a highly partisan body, the FEC moved to start the rulemaking process to require disclaimers for small, character-limited online political ads.[222]
1. Facebook
In its comment during the 2017 reopened comment period, Facebook welcomed further guidance from the FEC concerning disclaimers in online political communication.[223] Facebook emphasized its firm commitment to transparency, but encouraged the FEC to adopt regulations that would give advertisers flexibility to meet their disclaimer obligations.[224] Flexibility would allow FEC regulations to remain relevant in a “dynamic environment.”[225] For example, Facebook ads have significantly evolved since 2011, when Facebook requested an AO concerning its small, character-limited ads; now some ads feature videos or “scrolling carousels of images.”[226] Moreover, Facebook supported expanding the disclaimer requirement for “electioneering communication” to include “digital or online communications that mention federal candidates and are run during the 30- or 60-day pre-election periods.”[227]
2. Google
Like Facebook, Google also “strongly support[ed]” the FEC’s proposal to initiate the rulemaking process.[228] It encouraged the FEC (1) to “provide clarity” to political advertisers about whether disclaimers are necessary for digital ads they purchase; (2) to maintain the role the Internet plays in the marketplace of ideas; and (3) to “promote transparency and accountability” to produce an informed citizenry and prevent groups from hiding behind disingenuous names.[229] Similar to Facebook, Google stressed how online ads have significantly evolved since 2010. For example, advertisers can now order Google “smart” ads that can be “automatically assembled out of the advertiser-produced creative components” so they can fit in a variety of online spaces.[230] This means that “digital ads can be dynamic” in a way that “static” broadcast advertising cannot.[231] Google also argued for flexible solutions, such as mandating that all digital advertisements include a “notice of who is responsible for the ad.”[232]
Furthermore, Google endorsed strengthening laws to prevent foreign interference in elections by amending the Foreign National Ban in 52 U.S.C. § 30121 (2012).[233] Google suggested that Congress expand the term “electioneering communication” to include “communications placed for a fee on another person’s web site.”[234] This revision would ensure that the Foreign National Ban applies not only to broadcast, cable, and satellite, but also to the Internet.[235] Google also proposed that Congress clarify the meaning of “expenditure” in the Foreign National Ban to guarantee that no foreign national can purchase a communication intended to influence an election, “even if the communication does not contain express advocacy for or against a particular candidate.”[236]
In addition, Google recommended amending the Foreign Agents Registration Act to stop foreign interference in elections.[237] It suggested adding a disclaimer requirement for all informational material that a foreign agent distributes over the Internet.[238] In effect, foreign agents who purchased digital issue ads would need to include a disclaimer identifying themselves in the ad.[239] Moreover, Google advised that Congress could “require any foreign principal, whether [or] not acting through a registered agent in the US, to include a disclaimer identifying that the ad was distributed by or on behalf of the foreign principal to influence the US public.”[240]
3. Twitter
In its comment, Twitter emphasized that any new regulation requiring disclaimers should account for the 280-character limit of tweets.[241] According to Twitter, the typical disclaimer would constitute 35 percent of a tweet, and would “significantly alter the way Users engage with the platform.”[242] Thus, Twitter urged “the FEC to consider ways for character-constrained platforms to fulfill the public disclosure in ways that reflect such constraints while providing disclosure consistent with the product, and elsewhere through links.”[243] Twitter also stressed its own efforts to bring transparency to its platform with the introduction of the Transparency Center.[244]
C. Some Signs of FEC Action
After the comment period, the FEC voted unanimously 5-0 (Ravel’s seat remained vacant at the time) to initiate the rulemaking process for “disclaimers on paid internet and digital communications.”[245] In a motion favoring rulemaking, the Republican commissioners stated, “Foreign interference in U.S. elections is inimical to our nation’s interests and democratic values. The need to prevent such interference is an issue that transcends partisan politics, and on which all Americans can agree.”[246] But despite agreeing to move forward in the rulemaking process, the commissioners disagreed over when tech companies and experts should be brought into the process. Democrats urged a hearing sooner rather than later, while Republicans insisted on first taking time to review the 100,000 public comments submitted on the matter.[247] Ultimately, the FEC concluded it would draft a proposal and then invite the tech companies to comment.[248]
On October 31, 2017, the FEC published an opinion finding that paid Facebook image and video ads must adhere to FECA’s disclaimer requirements.[249] This opinion responded to an inquiry on “whether paid image and video ads on Facebook ‘must . . . include all, some, or none of the disclaimer information specified by 52 U.S.C. 30120(a).’”[250] But in reaching this conclusion, the commissioners split in their rationales, none of which received the required four affirmative votes.[251]
Then, at the March 14, 2018 FEC meeting, the FEC announced a draft Notice of Proposed Rulemaking regarding disclaimer rules, voted to receive public comment on the proposal, and scheduled a public hearing on the proposal for June 27, 2018.[252] The March 26, 2018 Notice of Proposed Rulemaking requested comment on proposed revisions to the definition of “public communication” in 11 C.F.R. § 100.26 and on two alternative proposals regarding disclaimers.[253] As to the definition of “public communication,” the FEC proposed expanding the existing phrase “communications placed for a fee on another person’s website” to also cover “communications placed for a fee on another person’s ‘internet-enabled device or application.’”[254] The FEC hoped this revision would capture the shift in Internet activity “from blogging, websites, and listservs to social media networks (Facebook, Twitter, and LinkedIn), media sharing networks (YouTube, Instagram, and Snapchat), streaming applications (Netflix, Hulu),” “mobile devices and applications,” “augmented and virtual reality,” and the “Internet of Things.”[255]
For the disclaimer proposals, the FEC presented two alternatives: Alternatives A and B.[256] Alternative A suggested adopting current disclaimer requirements for radio and television communications and applying them to “public communications distributed over the internet with audio or video components.”[257] It also recommended applying current printed publication disclosure requirements to “text and graphic public communications distributed over the internet.”[258] Certain small text or graphic public communications circulated over the Internet could fulfill the disclaimer requirements through an “adapted disclaimer.”[259] An adapted disclaimer would include an “abbreviated disclaimer on the face of the communication in conjunction with a technological mechanism that leads to a full disclaimer, rather than by providing a full disclaimer on the face of the communication itself.”[260]
Alternative B recommended creating a set of disclaimer requirements for Internet communications distinct from those governing traditional media.[261] Internet communications would have “clear and conspicuous” disclaimers that “meet the same general content requirement as other disclaimers, without imposing the additional disclaimer requirements that apply to print, radio, and television communications.”[262] Certain paid Internet advertisements would be able to fulfill disclaimer requirements with an adapted disclaimer, “depending on the amount of space or time necessary for a clear and conspicuous disclaimer as a percentage of the overall advertisement.”[263] If a paid Internet advertisement could not provide a disclaimer even through a technological mechanism, an exception to the disclaimer requirement could apply.[264]
In June 2018, the FEC held a two-day hearing on the March 26, 2018 Notice of Proposed Rulemaking after receiving over 165,000 public comments.[265] At the conclusion of the hearing, it remained unclear how the FEC would revise its disclaimer requirements or whether it would at all.[266] The FEC commissioners disagreed on how a disclaimer would appear on an advertisement on social media, how a disclaimer would be formatted, and whether the FEC could implement rules before the 2018 midterm elections.[267] Democratic Vice Chairwoman Ellen Weintraub, who has pushed for more transparency with online political ads, still believed that the FEC would be able to implement a new rule in time for the 2018 midterm elections.[268] But Republican Chairwoman Caroline Hunter considered it unlikely.[269] She indicated that the FEC need not rush to write a new rule: “No one wants to change the rules of a game when a game has already started.”[270]
The FEC has agreed in principle to introduce disclaimer requirements for online political advertisements, but disagreements over the details and timing continue to mire the partisan agency. The FEC’s failure to act means that Silicon Valley tech companies will continue to set the rules of the game—at least in the short term—for online political advertisements. Congress should step in and pass legislation, like the Honest Ads Act, to address this issue. But until it does, Silicon Valley will play a primary role in policing online political ads given the limits of current FEC regulations.
IV. Regulation and Its Limits
On October 19, 2017, a bipartisan trio of senators—Senators Amy Klobuchar, Mark Warner, and John McCain—introduced the Honest Ads Act (Act). Essentially, the Act amends and expands BCRA’s disclosure requirements for broadcast television and radio to cover online political ads. The Act’s disclosure requirements enhance the transparency of online political ads and help prevent hostile foreign interference with elections.[271] Although some critics of campaign finance regulation may raise constitutional challenges to disclosure, courts have long regarded disclosure as an appropriate form of regulation. Nevertheless, regulation has its limits. Regulation may not be able to reach some of the most nefarious actors in the 2016 election, like bots and trolls disseminating divisive, viral online content that does not constitute ads. Moreover, many of the ads circulated during the 2016 election, like issue ads, would fall outside the scope of such regulation. Therefore, Congress should enact the Honest Ads Act, but recognize that the country will still need to rely on Silicon Valley to protect the integrity of American elections.
A. A Legal Framework: Disclosure for Online Political Ads
The Act lays a foundation for a future legal framework governing online political ads. The Act can be broken down into four general elements. First, it limits the scope of the regulation to paid communication.[272] Second, it mandates disclaimers in online political ads to indicate who paid for the ad.[273] Third, it requires platforms to maintain a record of online political ads.[274] Fourth, it aims to prevent foreign nationals from interfering with elections.[275] The legislation answers many of the questions that the FEC has struggled with for years.
1. Elements of the Honest Ads Act
First, the Act focuses on regulating paid Internet communication by expanding the definitions of two key terms. It broadens the term “public communication” to encompass “paid Internet” or “paid digital communication.”[276] This amendment brings online political ads under the current regulatory regime. It also broadens the term “electioneering communication” to cover “qualified Internet or digital communication,” which means “any communication which is placed or promoted for a fee on an online platform.”[277]
Second, the Act mandates the inclusion of disclaimers in online political ads. It requires online political ads to have a disclaimer that states the name of the person who paid for the ad.[278] Ads must also provide platform users the ability to access legally required information “without receiving or viewing any additional material other than such required information.”[279]
Third, the Act requires the collection of records for online political ads. It directs online platforms with more than 50 million visitors or users to maintain a publicly accessible database of all ads placed by a person whose political ad purchases on the platform exceed $500 a year.[280] The records kept “would include a copy of the ad, the audience targeted, the views, and the time of first and last display, as well as the name and contact information of the purchaser.”[281]
Finally, the Act seeks to preempt foreign meddling in elections. It requires online platforms to “make reasonable efforts” to prevent a foreign national from purchasing an online advertisement directly or indirectly.[282]
2. The Necessity of the Act
The Act’s disclosure requirements are key to bringing transparency to online political advertising, especially if the FEC fails to overcome its partisan divide and is unable to promulgate a rule regulating online political ads. Since Buckley, the Supreme Court has upheld such disclosure requirements to ensure the integrity of elections. Even though Silicon Valley tech companies have voluntarily adopted the basic tenets of the Act, Congress should still pass the Act. Silicon Valley has implemented policies that bring more transparency to online political advertising, but these policies remain imperfect.
The Act’s required disclaimers play a significant role in promoting transparency and protecting the democratic process. Professor Yochai Benkler argues that disclaimers are necessary because people “assess the credibility of any statement in the context of what [they] think the agenda of the speaker is.”[283] Disclaimers are especially important as online ads increasingly target individual users.[284] Unlike TV and newspaper ads, which are highly visible, targeted ads may not receive the publicity or news coverage needed to correct misinformation or disinformation.[285] Disclaimers can provide “a baseline defense against messaging that is highly tailored” to manipulate people.[286]
However, as Professor Nathaniel Persily points out, disclaimers on a political ad can occasionally take the dubious form of “Paid for by Americans for America.”[287] The purchaser is able to hide its true identity “behind a pleasant sounding, patriotic name, and the main donors to such organizations are often difficult to discover.”[288] The expenditure “is, in a literal sense, unaccountable,” and the public cannot “hold the speakers to account for substance or tone.”[289] Still, disclaimers are a starting point: at a minimum, they signal that an ad is paid political content and prompt users to stay vigilant. As the Court stated in Citizens United, “At the very least, [disclaimer and disclosure requirements can] avoid confusion by making clear that the ads are not funded by a candidate or political party.”[290]
Mandating that online platforms establish an archive of online political ads allows the public to oversee campaigns and hold them accountable. According to Benkler, online platforms already collect the data the Act requires, and the cost of establishing a public database “is incrementally trivial by comparison to the investments these companies have made in developing their advertising base and their capacities to deliver viewers to advertisers.”[291] A publicly accessible database “would allow campaigns to be each other’s watchdogs—keeping each other somewhat more honest and constrained.”[292] Moreover, the general public, journalists, and nonprofits would be able to monitor campaigns, report on their practices, recognize foreign interference, and help uncover manipulators of public opinion.[293] In short, this significant provision would provide “near-real time accountability for lies and manipulation.”[294]
Some critics believe archiving every promoted tweet or Facebook ad for public search would require a massive amount of work that may not lead to greater transparency.[295] As one journalist commented, “It’s difficult to imagine Joe Public digging through this vast archive of at least tens of thousands of messages, analyzing targeting patterns.”[296] Nevertheless, journalists and the public can still rely on archives to locate suspicious ads. Leading up to the 2018 midterm elections, journalists called on the public to report suspicious ads to their news organizations and highlighted Facebook’s searchable archive.[297]
The Supreme Court has long upheld such disclosure requirements, from Buckley in 1976 to Citizens United in 2010. While mandatory disclosure can substantially infringe the exercise of First Amendment rights, disclosure requirements serve three governmental interests: (1) providing information to voters, (2) discouraging corruption, and (3) facilitating enforcement of campaign laws.[298] In 2010, the Court upheld BCRA’s disclosure requirements for broadcast television and radio, the same requirements that the Honest Ads Act would extend to online ads, stating that such requirements “provide the electorate with information and ‘insure that the voters are fully informed’ about the person or group speaking.”[299]
In May 2018, Facebook, Google, and Twitter began to implement their own proposed disclosure rules to promote transparency on their online platforms despite FEC and congressional inaction.[300] All three voluntarily adopted the basic tenets of the Honest Ads Act. They now require political ads to include “Paid for by” disclaimers, host searchable archives of political ads, and require advertisers to verify their identity and location.[301]
Even though Silicon Valley tech companies have voluntarily adopted the basic tenets of the Act, their early efforts at self-policing have been flawed, underscoring the need for regulation. Facebook is a case in point. In late October 2018, Vice News and Business Insider tested Facebook’s political advertising rules and exposed flaws in its system.[302] In one test, Vice News posed as 100 senators to buy and run political ads on Facebook, “including ads ‘Paid for by’ by Mitch McConnell and Chuck Schumer.”[303] Facebook approved every ad, demonstrating that “anyone can buy an ad identified as ‘Paid for by’ by a major U.S. politician.”[304] Furthermore, Facebook approved these ads “to be shared from pages for fake political groups . . . .”[305] These vulnerabilities in Facebook’s political advertising system show that self-regulation alone is unlikely to bring greater transparency and protect elections from foreign meddling. Facebook may not be equipped to police itself.
Thus, in the absence of effective FEC regulations, the Act is necessary to bring greater transparency to these online platforms and deter foreign activity in elections. With the Act, the government can enforce these disclosure requirements by taking legal action and punishing violators.[306] Furthermore, “not all Internet platforms are likely to agree to a common set of reforms” or voluntarily adopt the basic tenets of the Act.[307] Regulation rather than self-regulation is needed to maintain a level playing field for all online platforms and ensure that they adhere to disclosure requirements. However, as noted below, regulating online political ads alone is likely insufficient to uphold the integrity of elections. In some contexts, Washington, D.C. may still have to rely on Silicon Valley to provide accurate information to the electorate.
B. Limits to Regulation
Adopting the principles of the Honest Ads Act would better protect the integrity of future elections, but doing so is unlikely to serve as a complete defense. During the 2016 election, online political ads were merely one source of disinformation. Russia deployed troll armies—teams of people who would use social media accounts to harass people online and disseminate disinformation—to divide the electorate and undermine the 2016 election.[308] Bots spread “fake news” across social media websites. The Act may reach bots and trolls in some respects, but may not limit their impact entirely. Moreover, many political ads fall outside the scope of the Act.
1. Trolls and Bots
An inherent problem of social media networks is their vulnerability “to coordinated efforts,” which trolls, bots, paid influencers, or others can orchestrate to spread information or disinformation online.[309] The Internet Research Agency, a Kremlin-linked troll farm in Russia, employed trolls to create fake accounts on Facebook and Twitter to disseminate disinformation during the presidential campaign and agitate users online with their comments.[310] At a press conference in early 2017, Senator Warner stated that Russia hired 1,000 trolls to generate anti-Clinton fake news in swing states during the 2016 presidential campaign.[311] In late 2017, Facebook revealed that the Internet Research Agency purchased over $100,000 worth of ads, which were linked to 470 fake accounts and Pages.[312]
Bots can serve beneficial roles like news delivery, but they can also engage in nefarious activity like harassment.[313] On social media, they can “rapidly deploy messages, replicate themselves, and pass as human users.”[314] Bots can form social media networks known as “botnets,” which frequently consist of hundreds of “automated accounts built to follow and re-message one another.”[315] One person with a computer can control these botnets without revealing their identity or geographic location.[316] Increasingly, political actors and governments employ people and use bots to manipulate and influence public debate.[317]
During the 2016 election, political bot activity reached new heights.[318] One study found that “[n]ot only did the pace of highly automated pro-Trump activity increase over time, but the gap between highly automated pro-Trump and pro-Clinton activity widened from 4:1 during the first debate to 5:1 by election day.”[319] It concluded that the deployment of bots on Twitter was “deliberate and strategic . . . most clearly with pro-Trump campaigners and programmers who carefully adjusted the timing of content production during the debates, strategically colonized pro-Clinton hashtags, and then disabled activities after Election Day.”[320]
The dynamics of group behavior amplify the influence bots can have on elections. Philip Howard, who leads the Computational Propaganda Project at Oxford University, has observed that neighbors strongly influence a voter’s thinking.[321] So, when bots flood Twitter with fake news at key moments, as happened during the 2016 election, the image of fake crowds can generate a false sense of solidarity.[322] In one of Howard’s studies examining news sharing on Twitter the week before the presidential election, he found that “[j]unk news . . . was just as, if not more, prevalent than the amount of information produced by professional news organizations.”[323] Howard believes bots created this flood of fake news.[324]
Silicon Valley tech companies have started to serve as more stringent censors, removing or suspending dubious accounts suspected of engaging in disinformation campaigns.[325] In summer 2018, Facebook identified and purged hundreds of Pages and fake accounts engaged in spreading disinformation originating from Russia and Iran.[326] Google removed YouTube videos affiliated with an Iranian influence campaign.[327] In September and October 2018, Twitter removed 10,000 accounts posing as Democrats that tweeted messages to deter voting.[328] Days before the 2018 midterm elections, Facebook “blocked more than 100 Facebook and Instagram accounts,” suspecting they were associated with the Internet Research Agency.[329]
The Honest Ads Act could help crack down on trolls and bots. Indeed, one could interpret the provisions of the Act to govern trolls and bots. The Act’s definition of “qualified Internet or digital communication” is “any communication which is placed or promoted for a fee on an online platform.”[330] This definition could cover any troll that someone employed or bot that someone purchased to spread a viral political message, and the message would have to include a disclaimer showing the source of the payment.[331] Given that the purpose of coordinated campaigns with trolls and bots is to give “the false impression that the views expressed are expressed authentically in the target Facebook or Twitter community, the burden on expression is no greater than the burden on any political advertiser who would have preferred to communicate without being clearly labeled as political advertising.”[332] The disclaimer would simply correct a misrepresentation.[333]
However, even with the Honest Ads Act, it seems feasible that a foreign agent employing a troll could still purchase an online ad. Leonid Bershidsky explains that a foreign agent could enter the US, purchase US SIM cards from a store, establish servers in the US, and use virtual private networks.[334] Then the agent could use a PayPal account to pay for a Facebook ad.[335] To verify residency, the agent could submit “electronic copies of a store loyalty card and a piece of mail.”[336] Unlike Facebook, Twitter permits a single person to hold multiple accounts and to use pseudonyms, making it even easier for trolls.[337] Even if Congress passed the Honest Ads Act, “a troll cleverly disguised as Jane Doe or John Smith, and ostensibly based in Random Location on Google Maps, U.S.A.,” could purchase and run a political ad from Russia.[338] Bershidsky writes, “The transaction will be clearly recorded under the fake name and stored in a vast archive in which no one but a dedicated investigator will be able to find anything of value.”[339] Ultimately, with the Honest Ads Act, dedicated journalists, communication researchers, government officials, and citizens may have to track and investigate ads carefully to connect them to trolls and bots, even if the ads carry disclaimers.
2. Identifying and Defining an Ad
Many ads could fall outside the ambit of the Honest Ads Act. Prior to the introduction of the Honest Ads Act, Facebook informed congressional aides that it would be too difficult to identify a political ad.[340] Given that candidates frequently change their messaging, and given the high volume of ads on Facebook, its engineers would struggle to develop a process to recognize political ads and distinguish them from commercial ads.[341]
Furthermore, the FEC does not regulate issue ads, but only ads that support or oppose a specific candidate and ads that refer to a candidate leading up to an election.[342] When Facebook disclosed that the Internet Research Agency purchased $100,000 worth of ads, it stated that most of the 3,000 ads purchased did not refer to candidates but presented “divisive social issues.”[343] Thus, these issue ads would not fall under the purview of the Honest Ads Act or current FEC regulations because they did not clearly mention a candidate or the election.[344]
Conclusion
A 2017 study from media tracker Borrell Associates found that while political advertising expenditures increased by roughly 4 percent between 2012 and 2016, online advertising jumped 789 percent in the same time frame.[345] Lawmakers and regulators can no longer ignore the growing presence and danger of online political advertising. As the 2016 presidential election demonstrated, the libertarian regulatory environment for online political communication is ill equipped to defend online platforms against foreign actors, trolls, and bots that seek to spread divisive messages. Now, Facebook, Google, and Twitter have revised their policies to cope with the aftermath, but much more still needs to be done.
As the FEC begins the rulemaking process for Internet communication disclaimers and Congress deliberates on the Honest Ads Act, regulation may ensue. Enacting regulation would be an important step in increasing transparency and protecting the integrity of future elections. But lawmakers and regulators likely will encounter problems drafting laws and regulations to reach elusive trolls and bots and to cover content outside the traditional scope of campaign finance regulation without infringing on the First Amendment. To root out these trolls and bots and to protect the integrity of future elections, the public will likely still need to rely on Silicon Valley, rather than Washington, D.C. alone.[346]
DOI: https://doi.org/10.15779/Z38C53F20R.
Copyright © 2019 California Law Review, Inc. California Law Review, Inc. (CLR) is a California nonprofit corporation. CLR and the authors are solely responsible for the content of their publications.
Brian Beyersdorf, J.D., University of California, Berkeley, School of Law, 2018; B.A., University of Notre Dame, 2009. I thank my colleagues in Professor Bertrall Ross’s 2017 Election Law Seminar and the editors of the California Law Review for their perceptive comments and guidance. To Jessica Beyersdorf: thank you always for your unwavering support.
- Internet Communications, 71 Fed. Reg. 18,589, 18,590 (Apr. 12, 2006) (to be codified at 11 C.F.R. pts. 100, 110, 114). ↑
- Mark Zuckerberg, Facebook (Sept. 27, 2017), https://www.facebook.com/zuck/posts/10104067130714241?pnref=story [https://perma.cc/XG4G-8ECV]. ↑
- Indictment at 3–4, United States v. Internet Research Agency, LLC, No. 1:18-cr-00032 (D.D.C. Feb. 16, 2018). ↑
- Id. at 19. ↑
- Id. at 4. ↑
- See id. at 20–23; Adrian Chen, What Mueller’s Indictment Reveals About Russia’s Internet Research Agency, New Yorker (Feb. 16, 2018), https://www.newyorker.com/news/news-desk/what-muellers-indictment-reveals-about-russias-internet-research-agency [https://perma.cc/T6J2-A93S]. ↑
- Cecilia Kang et al., Tech Executives Are Contrite About Election Meddling, but Make Few Promises on Capitol Hill, N.Y. Times (Oct. 31, 2017), https://www.nytimes.com/2017/10/31/us/politics/facebook-twitter-google-hearings-congress.html [https://perma.cc/AY3D-8AZ7]. ↑
- Nicholas Fandos et al., House Intelligence Committee Releases Incendiary Russian Social Media Ads, N.Y. Times (Nov. 1, 2017), https://www.nytimes.com/2017/11/01/us/politics/russia-technology-facebook.html [https://perma.cc/7E7A-DJRB]. ↑
- Craig Timberg et al., Russian Ads, Now Publicly Released, Show Sophistication of Influence Campaign, Wash. Post (Nov. 1, 2017), https://www.washingtonpost.com/business/technology/russian-ads-now-publicly-released-show-sophistication-of-influence-campaign/2017/11/01/d26aead2-bf1b-11e7-8444-a0d4f04b89eb_story.html?utm_term=.904e2e2418b8 [https://perma.cc/9DM7-JL7Q]. ↑
- Kenneth P. Vogel & Cecilia Kang, Senators Demand Online Ad Disclosures as Tech Lobby Mobilizes, N.Y. Times (Oct. 19, 2017), https://www.nytimes.com/2017/10/19/us/politics/facebook-google-russia-meddling-disclosure.html [https://perma.cc/F2VK-KL9Y]. ↑
- Jenn Topper, Everything You Need to Know About Political Ads, Sunlight Foundation (Nov. 10, 2015), https://sunlightfoundation.com/2015/11/10/everything-you-need-to-know-about-political-ads [https://perma.cc/MKS6-NRLR]. ↑
- Lawrence Norden & Ian Vandewalker, This Bill Would Help Stop Russia From Buying Online Election Ads, Slate (Oct. 19, 2017), https://slate.com/technology/2017/10/the-honest-ads-act-would-help-stop-online-election-meddling-from-foreign-governments.html [https://perma.cc/7PZR-5MCD]. ↑
- Nathaniel Persily, The Coming Revolution in Campaign Communication, Sacramento Bee (May 30, 2015), http://www.sacbee.com/opinion/california-forum/article22581321.html [https://perma.cc/7H4U-9MKM]. ↑
- See Vogel & Kang, supra note 10. ↑
- Fandos et al., supra note 8. ↑
- Scott Shane, These Are the Ads Russia Bought on Facebook in 2016, N.Y. Times (Nov. 1, 2017), https://www.nytimes.com/2017/11/01/us/politics/russia-2016-election-facebook.html [https://perma.cc/4XJQ-J8LF]. ↑
- Id. ↑
- Fandos et al., supra note 8. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Vogel & Kang, supra note 10. ↑
- Hamza Shaban, Election Officials Move Closer to Placing New Rules on Facebook and Google, Wash. Post (Nov. 16, 2017), https://www.washingtonpost.com/news/the-switch/wp/2017/11/16/election-officials-move-closer-to-placing-new-rules-on-facebook-and-google/?utm_term=.96151e8ed8d3 [https://perma.cc/YBR3-34K8]. ↑
- See, e.g., Michelle Ye Hee Lee, FEC Struggles to Craft New Rules for Political Ads in the Digital Space, Wash. Post (June 28, 2018), https://www.washingtonpost.com/politics/fec-struggles-to-craft-new-rules-for-political-ads-in-the-digital-space/2018/06/28/c749a234-7af9-11e8-aeee-4d04c8ac6158_story.html [https://perma.cc/H387-NXNR] (discussing the partisan divide among FEC Commissioners regarding online political advertisements); Tony Romm, Senate Majority Leader Mitch McConnell Said Tech Should Cooperate With Law Enforcement—and Help the U.S. Fight Russia, Recode (Nov. 4, 2017), https://www.recode.net/2017/11/4/16606364/senate-majority-leader-mitch-mcconnell-facebook-google-twitter-russia [https://perma.cc/YBR3-34K8] (noting that Senate Majority Leader Mitch McConnell is dubious of reforming any laws governing online political advertisements). ↑
- See Citizens United v. FEC, 558 U.S. 310, 367 (2010) (upholding the Bipartisan Campaign Reform Act of 2002’s disclosure requirements for televised electioneering communications); Buckley v. Valeo, 424 U.S. 1, 84 (1976) (upholding the Federal Election Campaign Act of 1971’s disclosure requirements for contributions and expenditures). ↑
- Richard L. Hasen, Chill Out: A Qualified Defense of Campaign Finance Disclosure Laws in the Internet Age, 27 J.L. & Pol. 557, 559 (2012). ↑
- Issie Lapowsky, Congress’s New Bill Can’t Eliminate Russian Influence Online, Wired (Oct. 19, 2017), https://www.wired.com/story/congresss-new-bill-cant-eliminate-russian-influence-online [https://perma.cc/ZH89-JQMW]. ↑
- Nathaniel Persily, The Campaign Revolution Will Not Be Televised, American Interest (Oct. 10, 2015), https://www.the-american-interest.com/2015/10/10/the-campaign-revolution-will-not-be-televised [https://perma.cc/6YSU-C7MB]. ↑
- Buckley, 424 U.S. at 61 (citing the Act of June 25, 1910). ↑
- Hasen, supra note 26, at 560. ↑
- Buckley, 424 U.S. at 84. ↑
- Id. at 66–68. ↑
- Citizens United v. FEC, 558 U.S. 310, 319 (2010). ↑
- Samuel Issacharoff et al., The Law of Democracy: Legal Structure of the Political Process 531 (5th ed. 2016). ↑
- Id. ↑
- Id. ↑
- Buckley, 424 U.S. at 84. ↑
- Id. at 63. ↑
- Id. at 63–64. ↑
- Id. at 64–66. ↑
- Id. at 66. ↑
- Id. at 66–67. ↑
- Id. at 67. ↑
- Id. ↑
- Id. ↑
- Id. at 67–68. ↑
- Id. at 73–74. ↑
- Id. at 69. ↑
- Id. at 74. ↑
- Brown v. Socialist Workers ‘74 Campaign Comm., 459 U.S. 87, 89–98 (1982); see Issacharoff et al., supra note 34, at 533. ↑
- Socialist Workers ‘74 Campaign Comm., 459 U.S. at 89–98; Citizens Against Rent Control v. City of Berkeley, 454 U.S. 290, 303 (1981); First Nat’l Bank of Bos. v. Bellotti, 435 U.S. 765, 792 n.32 (1978); Richard L. Hasen, The Surprisingly Complex Case for Disclosure of Contributions and Expenditures Funding Sham Issue Advocacy, 48 UCLA L. Rev. 265, 272 (2000) (citing FEC v. Mass. Citizens for Life, 479 U.S. 238, 251–56 (1986)). ↑
- 514 U.S. 334 (1995). ↑
- Id. at 355–56. ↑
- Id. at 337. ↑
- Id. at 342. ↑
- Id. at 347. ↑
- Id. at 357. ↑
- Id. ↑
- Id. ↑
- Id. at 356. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- McConnell v. FEC, 540 U.S. 93, 126–27 (2003). ↑
- Id. at 126. ↑
- Id. ↑
- Id. ↑
- Buckley v. Valeo, 424 U.S. 1, 44 n.52 (1976). ↑
- McConnell, 540 U.S. at 126. ↑
- Id. ↑
- Id. at 126–27. ↑
- Issacharoff et al., supra note 34, at 484–85. ↑
- McConnell, 540 U.S. at 189; see also Issacharoff et al., supra note 34, at 484–85. ↑
- McConnell, 540 U.S. at 189–90 (quoting 2 U.S.C. § 434(f)(3)(A)(i)). ↑
- Issacharoff et al., supra note 35, at 487. ↑
- Id. at 487, 533. ↑
- Citizens United v. FEC, 558 U.S. 310, 366 (2010) (quoting 2 U.S.C. § 441d(d)(2) (current version at 52 U.S.C. § 30120(d)(2) (2012))). ↑
- Id. (quoting 2 U.S.C. § 441d(d)(2) (current version at 52 U.S.C. § 30120(d)(2) (2012))). ↑
- Id. (quoting 2 U.S.C. § 441d(a)(3) (current version at 52 U.S.C. § 30120(a)(3) (2012))). ↑
- Id. (citing 2 U.S.C. § 434(f)(1) (current version at 52 U.S.C. § 30104 (2012))). ↑
- Id. (citing 2 U.S.C. § 434(f)(2) (current version at 52 U.S.C. § 30104 (2012))). ↑
- McConnell v. FEC, 540 U.S. 93, 197 (2003) (quoting McConnell v. FEC, 251 F. Supp. 2d 176, 237 (D.C. Cir. 2003)). ↑
- Id. at 196. ↑
- Id. at 197 (quoting McConnell, 251 F. Supp. 2d at 237). ↑
- Citizens United v. FEC, 558 U.S. 310, 319–20 (2010). ↑
- Id. at 320. ↑
- Id. at 321. ↑
- Id. at 368. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. at 369. ↑
- Id. at 370. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Persily, supra note 28. ↑
- 52 U.S.C. § 30120(a) (2012); Communications; Advertising; Disclaimers, 11 C.F.R. § 110.11(a) (2018). ↑
- 11 C.F.R. § 110.11(a)(1); Public Communication, 11 C.F.R. § 100.26 (2018). ↑
- Persily, supra note 28. ↑
- 52 U.S.C. § 30120(a); 11 C.F.R. § 110.11(a)(1). ↑
- 52 U.S.C. § 30101(22); 11 C.F.R. § 100.26. ↑
- 11 C.F.R. § 100.26. ↑
- Internet Communications, 71 Fed. Reg. 18,589, 18,593–94 (Apr. 12, 2006) (to be codified at 11 C.F.R. pts. 100, 110, 114). ↑
- 11 C.F.R. § 110.11(b)(1); 52 U.S.C. § 30120(a)(1). ↑
- 11 C.F.R. § 110.11(b)(2); 52 U.S.C. § 30120(a)(2). ↑
- 11 C.F.R. § 110.11(b)(3); 52 U.S.C. § 30120(a)(3). ↑
- 11 C.F.R. § 110.11(f)(1)(i). ↑
- 11 C.F.R. § 110.11(f)(1)(ii). ↑
- For a historical summary of relevant rulemakings and advisory opinions, see Internet Communication Disclaimers and Definition of “Public Communication,” 83 Fed. Reg. 12,864, 12,866–68 (proposed Mar. 26, 2018) (to be codified at 11 C.F.R. pts. 100, 110). ↑
- Persily, supra note 28. ↑
- Internet Communications, 71 Fed. Reg. 18,589, 18,591 (Apr. 12, 2006) (to be codified at 11 C.F.R. pts. 100, 110, 114). ↑
- See 52 U.S.C. § 30101(22) (2012). ↑
- Internet Communications, 71 Fed. Reg. at 18,600. ↑
- Id. at 18,589 (citing Shays v. FEC, 337 F. Supp. 2d 28 (D.D.C. 2004)). ↑
- Id. ↑
- Vogel & Kang, supra note 10. ↑
- Public Communication, 11 C.F.R. § 100.26 (2018). ↑
- Internet Communications, 71 Fed. Reg. at 18,594. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- See Communications; Advertising; Disclaimers, 11 C.F.R. § 110.11(a)(1) (2018) (defining the scope of disclaimers for public communications). ↑
- Internet Communications, 71 Fed. Reg. at 18,589. ↑
- Id. at 18,589–90. ↑
- Id. at 18,593. ↑
- Id. at 18,589. ↑
- FEC, Certification of Google Advisory Opinion Letter, Advisory Op. No. 2010-19, at 2 (Oct. 8, 2010), https://www.fec.gov/files/legal/aos/76083.pdf [https://perma.cc/UJL3-9YMH] [hereinafter Google Advisory Opinion Letter]. ↑
- Id. at 1. ↑
- Id. ↑
- Id. ↑
- Id. at 2. ↑
- Id. ↑
- Id. The FEC has six members. 52 U.S.C. § 30106(a)(1) (2012). No more than three members may belong to the same political party. Id. Four affirmative votes are necessary to issue an advisory opinion. Id. § 30106(c); see also Issuance of Advisory Opinions, 11 C.F.R. § 112.4(a). At the moment, the FEC only has four commissioners, the bare minimum needed for a quorum. Michelle Ye Hee Lee, FEC Commissioner’s Departure Leaves Panel with Bare-Minimum Quorum, Wash. Post (Feb. 7, 2018), https://www.washingtonpost.com/politics/fec-commissioners-departure-leaves-panel-with-bare-minimum-quorum/2018/02/07/03fb24a0-0c28-11e8-8890-372e2047c935_story.html [https://perma.cc/C2EX-TUDC]. ↑
- See Google Advisory Opinion Letter, supra note 129; FEC, Concurring Statement of Vice Chair Cynthia L. Bauerly, Commissioner Steven T. Walther, and Commissioner Ellen L. Weintraub, Certification of Google Advisory Opinion Letter, Advisory Op. No. 2010-19 (2010), https://www.fec.gov/files/legal/aos/76087.pdf [https://perma.cc/V5K9-9SZV] [hereinafter Bauerly et al., Concurring Statement]. ↑
- Bauerly et al., Concurring Statement, supra note 136, at 3. ↑
- Id. ↑
- FEC, Concurring Statement of Chairman Matthew S. Petersen, Certification of Google Advisory Opinion Letter, Advisory Op. No. 2010-19 (2010), http://saos.fec.gov/aodocs/1160122.pdf [https://perma.cc/4A6G-NV6S]. ↑
- FEC, Request by Facebook, Advisory Op. No. 2011-09, at 8 (Apr. 26, 2011), https://www.fec.gov/files/legal/aos/77149.pdf [https://perma.cc/MRA9-FQU7]. ↑
- Id. at 1. ↑
- Id. at 6–7. ↑
- Id. at 6. ↑
- Id. ↑
- Id. ↑
- Id. at 7. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Letter from Rosemary C. Smith, FEC Associate General Counsel, to Perkins Coie LLP (June 15, 2011), https://www.fec.gov/files/legal/aos/77163.pdf [https://perma.cc/STR4-CYGP] (regarding Advisory Op. No. 2011-09). ↑
- FEC, Draft C of Advisory Op. No. 2011-09 (June 15, 2011), https://www.fec.gov/files/legal/aos/77162.pdf [https://perma.cc/AC7U-77AY]. ↑
- Id. at 6. ↑
- Id. ↑
- Id. at 7. ↑
- Id. at 8. ↑
- FEC, Draft B of Advisory Op. No. 2011-09, at 1 (June 15, 2011), https://www.fec.gov/files/legal/aos/77152.pdf [https://perma.cc/2QUU-Q6T2]. ↑
- Id. at 4. ↑
- Id. at 5. ↑
- Id. ↑
- Vogel & Kang, supra note 10. ↑
- Internet Communication Disclaimers, 76 Fed. Reg. 63,567, 63,568 (proposed Oct. 13, 2011) (to be codified at 11 C.F.R. pt. 110). ↑
- Id. at 63,567. ↑
- Internet Communication Disclaimers; Reopening of Comment Period and Notice of Hearing, 81 Fed. Reg. 71,647 (proposed Oct. 18, 2016) (to be codified at 11 C.F.R. pt. 110). ↑
- Letter from Colin S. Stretch, Facebook Deputy General Counsel, to Amy L. Rothstein, FEC Assistant General Counsel (Nov. 14, 2011), http://sers.fec.gov/fosers/showpdf.htm?docid=98769 [https://perma.cc/2WBR-WBLQ]. ↑
- See Ann Ravel, How the FEC Turned a Blind Eye to Foreign Meddling, Politico Magazine (Sept. 18, 2017), http://www.politico.com/magazine/story/2017/09/18/fec-foreign-meddling-russia-facebook-215619 [https://perma.cc/2BWP-VT9K]. ↑
- Id. ↑
- Id.; see John Diaz, Russians Exploited Our Weak Laws on Online Political Ads, S.F. Chron. (Sept. 22, 2017), https://www.sfchronicle.com/opinion/diaz/article/Russians-exploited-our-weak-laws-on-online-12222423.php [https://perma.cc/LE25-ECLD]. ↑
- Ravel, supra note 166. ↑
- See Alex Richardson, The Comment Was Sponsored By . . . , Slate (June 20, 2016), http://www.slate.com/articles/technology/future_tense/2016/06/the_fec_can_t_figure_out_what_to_do_about_paid_speech_online.html [https://perma.cc/XX5P-K9PP]. ↑
- Checks & Balances for Econ. Growth, FEC, MUR 6729, at 2 (Oct. 24, 2014), https://www.fec.gov/files/legal/murs/current/104790.pdf [https://perma.cc/6K2Q-RRLF] (Statement of Reasons of Chairman Lee E. Goodman and Comm’rs Caroline C. Hunter and Matthew S. Petersen). ↑
- See id., http://eqs.fec.gov/eqsdocsMUR/14044363872.pdf [https://perma.cc/BC7X-27SQ] (Statement of Reasons of Vice Chair Ann M. Ravel). ↑
- Id. at 1. ↑
- Id. at 2. ↑
- Ravel, supra note 166. ↑
- Id. ↑
- See Richardson, supra note 170. ↑
- Id. ↑
- Internet Communication Disclaimers; Reopening of Comment Period and Notice of Hearing, 81 Fed. Reg. 71,647, 71,647 (proposed Oct. 18, 2016) (to be codified at 11 C.F.R. pt. 110). ↑
- See id. ↑
- Id.; Letter from Adav Noti, FEC Acting Associate General Counsel, to Joseph Sandler et al., Sandler, Reiff, Young & Lamb, P.C. (Feb. 27, 2014), https://www.fec.gov/files/legal/aos/2013-18/2013-18.pdf [https://perma.cc/ZKF7-K2V8]. ↑
- Internet Communication Disclaimers; Reopening of Comment Period and Notice of Hearing, 81 Fed. Reg. at 71,647; Letter from Mark Allen, FEC Assistant General Counsel, to Raymond Schamis (Mar. 3, 2016), https://fec-dev-proxy.app.cloud.gov/files/legal/murs/6911/16044390442.pdf [https://perma.cc/HK28-2A77]. ↑
- Internet Communication Disclaimers; Reopening of Comment Period, 82 Fed. Reg. 46,937, 46,938 (proposed Oct. 10, 2017) (to be codified at 11 C.F.R. pt. 110). ↑
- Eric Lichtblau, Democratic Member to Quit Election Commission, Setting up Political Fight, N.Y. Times (Feb. 19, 2017), https://www.nytimes.com/2017/02/19/us/politics/fec-elections-ann-ravel-campaign-finance.html [https://perma.cc/9HFU-NEA4]. ↑
- Ann Ravel Resigns Her Seat on the Federal Election Commission, NPR (Feb. 21, 2017), https://www.npr.org/2017/02/21/516375469/ann-ravel-resigns-her-seat-on-the-federal-election-commission [https://perma.cc/56S9-WUEN]. ↑
- See id. ↑
- Id. ↑
- See id. ↑
- Id. ↑
- Id. ↑
- See Lichtblau, supra note 184. ↑
- Id. ↑
- See Persily, supra note 28. ↑
- Id. ↑
- See Zuckerberg, supra note 2. ↑
- Id. ↑
- Id. ↑
- Fandos et al., supra note 8. ↑
- See Rob Goldman, Update on Our Advertising Transparency and Authenticity Efforts, Facebook Newsroom (Oct. 27, 2017), https://newsroom.fb.com/news/2017/10/update-on-our-advertising-transparency-and-authenticity-efforts [https://perma.cc/YXZ5-TXZQ]. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Rob Goldman, Making Ads and Pages More Transparent, Facebook Newsroom (Apr. 6, 2018), https://newsroom.fb.com/news/2018/04/transparent-ads-and-pages [https://perma.cc/H7CW-G48T]. ↑
- Id. ↑
- Google, Comments of Google LLC re: Internet Communication Disclaimers (Nov. 9, 2017), http://sers.fec.gov/fosers/showpdf.htm?docid=358482 [https://perma.cc/5NVY-2WYZ]. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Bruce Falck, New Transparency for Ads on Twitter, Twitter Blog (Oct. 24, 2017), https://blog.twitter.com/official/en_us/topics/product/2017/New-Transparency-For-Ads-on-Twitter.html [https://perma.cc/4ATC-NXR6]. ↑
- Id. ↑
- Id. ↑
- See id. ↑
- See Internet Communication Disclaimers; Reopening of Comment Period, 82 Fed. Reg. 46,937, 46,938 (proposed Oct. 10, 2017) (to be codified at 11 C.F.R. pt. 110). ↑
- Hamza Shaban, After Russian Meddling, Google and Facebook Shift Their Stance on a Crucial Issue for Voters, Wash. Post (Nov. 14, 2017), https://www.washingtonpost.com/news/the-switch/wp/2017/11/14/after-russian-meddling-google-and-facebook-shift-their-stance-on-a-crucial-issue-for-voters [https://perma.cc/P84L-7BG3]. ↑
- Facebook, Comment Letter on ANPRM, Internet Disclaimers, at 1 (Nov. 13, 2017), http://sers.fec.gov/fosers/showpdf.htm?docid=358468 [https://perma.cc/ZJH5-6J87]. ↑
- See, e.g., id. ↑
- Shaban, supra note 23. ↑
- Facebook, supra note 219, at 1. ↑
- Id. at 2. ↑
- Id. at 2–3. ↑
- Id. at 3. ↑
- Id. at 4–5. ↑
- See Google, supra note 209. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- See id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Twitter, Comment Letter on ANPRM, Internet Communication Disclaimers, at 3 (Nov. 9, 2017), http://sers.fec.gov/fosers/showpdf.htm?docid=358496 [https://perma.cc/8C7W-XP65]. ↑
- Id. ↑
- Id. ↑
- Id. at 3–4. ↑
- Vote Certification, Motion for Notice of Proposed Rulemaking on Disclaimers on Paid Internet and Digital Communications (Nov. 16, 2017), http://sers.fec.gov/fosers/showpdf.htm?docid=364653 [https://perma.cc/MRE2-VNWU]. ↑
- Memorandum from Vice Chair Hunter and Commissioners Goodman and Petersen, FEC Motion for Notice of Proposed Rulemaking on Disclaimers on Paid Internet and Digital Communications (Nov. 15, 2017), http://sers.fec.gov/fosers/showpdf.htm?docid=359616 [https://perma.cc/5GBK-2692]. ↑
- Shaban, supra note 23. ↑
- Id. ↑
- Internet Communication Disclaimers and Definition of “Public Communication,” 83 Fed. Reg. 12,864, 12,868 (proposed Mar. 26, 2018) (to be codified at 11 C.F.R. pts. 100 and 110). ↑
- Id. (quoting Advisory Opinion Request at 4, Advisory Opinion 2017-12 (Take Back Action Fund) (Oct. 31, 2017)). ↑
- Id. ↑
- Michelle Ye Hee Lee & Tony Romm, FEC Considers Expanding Political Ad Disclaimers to Mobile Apps, Wash. Post (Mar. 14, 2018), https://www.washingtonpost.com/politics/fec-considers-expanding-political-ad-disclaimers-to-mobile-apps/2018/03/14/0df362ae-27a5-11e8-bc72-077aa4dab9ef [https://perma.cc/2PB4-9VQ6]. ↑
- Internet Communication Disclaimers and Definition of “Public Communication,” 83 Fed. Reg. at 12,864. ↑
- Id. at 12,868. ↑
- Id. ↑
- Id. at 12,869. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. at 12,874. ↑
- Id. at 12,869. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Michelle Ye Hee Lee, FEC Struggles to Craft New Rules for Political Ads in the Digital Space, Wash. Post (June 28, 2018), https://www.washingtonpost.com/politics/fec-struggles-to-craft-new-rules-for-political-ads-in-the-digital-space/2018/06/28/c749a234-7af9-11e8-aeee-4d04c8ac6158 [https://perma.cc/H387-NXNR]. ↑
- Suhauna Hussain, Will Federal Officials Further Regulate Online Political Ads? Too Soon to Tell, Center for Public Integrity (June 28, 2018), https://www.publicintegrity.org/2018/06/28/21908/fec-political-ads-online-russia-regulation-google-twitter-facebook [https://perma.cc/89XD-BYJ4]. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- See Honest Ads Act of 2017, S. 1989, 115th Cong. § 4 (2017). ↑
- Yochai Benkler, Election Advertising Disclosure: Part 1, Harv. L. Rev. Blog (Oct. 31, 2017), https://blog.harvardlawreview.org/election-advertising-disclosure-part-1 [https://perma.cc/H8MZ-UNAL] [hereinafter Benkler, Election Advertising Disclosure: Part 1]. ↑
- Id. ↑
- Id. ↑
- S. 1989 § 9(c). ↑
- Id. § 5(a). ↑
- Id. § 6(a)(1)(A), 6(a)(1)(D). ↑
- Id. § 7(b)(1). ↑
- Id. ↑
- Id. § 8(a). ↑
- Benkler, Election Advertising Disclosure: Part 1, supra note 271. ↑
- S. 1989 § 9. ↑
- Benkler, Election Advertising Disclosure: Part 1, supra note 271. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Persily, supra note 28. ↑
- Id. ↑
- Id. ↑
- Citizens United v. FEC, 558 U.S. 310, 368 (2010); see also Young Mie Kim et al., The Stealth Media? Groups and Targets Behind Divisive Campaigns on Facebook, 35 Political Commc’n 515 (2018), https://www.tandfonline.com/doi/full/10.1080/10584609.2018.1476425?scroll=top&needAccess=true [https://perma.cc/4U2M-JX9B] (“[A]dequate regulatory policies including disclosure and disclaimer requirements would, at the very least, provide the basis for public monitoring, research, and investigation.”). ↑
- Benkler, Election Advertising Disclosure: Part 1, supra note 271. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Leonid Bershidsky, Russian Trolls Would Love the ‘Honest Ads Act’, Bloomberg View (Oct. 20, 2017), https://www.bloomberg.com/view/articles/2017-10-20/russian-trolls-would-love-the-honest-ads-act [https://perma.cc/V6J9-Z8ZY]. ↑
- Id. ↑
- See, e.g., If You See Disinformation Ahead of the Midterms, We Want to Hear from You, N.Y. Times (Sept. 17, 2018), https://www.nytimes.com/2018/09/17/technology/disinformation-tipsheet.html [https://perma.cc/7WJ3-N2WN]; Caitlin Kelly, Help Wired Track How Political Ads Target You on Facebook, Wired (Oct. 12, 2018), https://www.wired.com/story/facebook-political-ad-collector-wired-propublica [https://perma.cc/U93Y-CYNE]. ↑
- Buckley v. Valeo, 424 U.S. 1, 66–68 (1976). ↑
- Honest Ads Act of 2017, S. 1989, 115th Cong. § 3(5) (2017) (quoting Citizens United v. FEC, 558 U.S. 310, 368 (2010) (citations omitted)). ↑
- Vijaya Gadde & Bruce Falck, Increasing Transparency for Political Campaigning Ads on Twitter, Twitter Blog (May 24, 2018), https://blog.twitter.com/official/en_us/topics/company/2018/Increasing-Transparency-for-Political-Campaigning-Ads-on-Twitter.html [https://perma.cc/PU3M-Z29S]; Rob Leathern, Shining a Light on Ads with Political Content, Facebook Newsroom (May 24, 2018), https://newsroom.fb.com/news/2018/05/ads-with-political-content [https://perma.cc/AJ96-RCPQ]; Kent Walker, Supporting Election Integrity Through Greater Advertising Transparency, Google Blog (May 4, 2018), https://blog.google/outreach-initiatives/public-policy/supporting-election-integrity-through-greater-advertising-transparency [https://perma.cc/FJD7-3J7U]. ↑
- Supra note 299; Michee Smith, Introducing a New Transparency Report for Political Ads, Google Blog (Aug. 15, 2018), https://blog.google/technology/ads/introducing-new-transparency-report-political-ads [https://perma.cc/W3CC-GTA4]. ↑
- Shona Ghosh, We Ran 2 Fake Ads Pretending to be Cambridge Analytica—and Facebook Failed to Catch that They Were Frauds, Business Insider (Oct. 31, 2018), https://www.businessinsider.com/facebook-approved-political-ads-paid-for-by-cambridge-analytica-2018-10 [https://perma.cc/J726-BA9Q]; Makena Kelly, Without New Laws, Facebook Has No Reason to Fix Its Broken Ad System, Verge (Oct. 31, 2018), https://www.theverge.com/2018/10/31/18048236/facebook-mark-warner-fec-congress-honest-ads-act [https://perma.cc/ME8N-C8VS]; William Turton, Facebook’s Political Ad Tool Let Us Buy Ads ‘Paid For’ by Mike Pence and ISIS, Vice News (Oct. 25, 2018), https://news.vice.com/en_us/article/wj9mny/facebooks-political-ad-tool-let-us-buy-ads-paid-for-by-mike-pence-and-isis [https://perma.cc/N6SZ-CMGC]; William Turton, We Posed as 100 Senators to Run Ads on Facebook. Facebook Approved All of Them., Vice News (Oct. 30, 2018), https://news.vice.com/en_us/article/xw9n3q/we-posed-as-100-senators-to-run-ads-on-facebook-facebook-approved-all-of-them [https://perma.cc/PBM9-KEVS] [hereinafter Turton, We Posed as 100 Senators to Run Ads on Facebook]. ↑
- Turton, We Posed as 100 Senators to Run Ads on Facebook, supra note 301. ↑
- Id. ↑
- Id. ↑
- Dipayan Ghosh & Robert C. Pozen, Americans Deserve to Know Who’s Behind Online Political Ads, Fortune (Nov. 2, 2018), http://fortune.com/2018/11/02/russia-facebook-twitter-political-ads [https://perma.cc/A6XP-P3CD]. ↑
- Id. ↑
- See Indictment, supra note 3, at 3–4; Adrian Chen, The Agency, N.Y. Times Mag. (June 2, 2015), https://www.nytimes.com/2015/06/07/magazine/the-agency.html [https://perma.cc/9SEJ-UW4J]. ↑
- Yochai Benkler, Election Advertising Disclosure: Part 2, Harv. L. Rev. Blog (Nov. 3, 2017), https://blog.harvardlawreview.org/election-advertising-disclosure-part-2 [https://perma.cc/P9XL-KV7Y] [hereinafter Benkler, Election Advertising Disclosure: Part 2]. ↑
- Scott Shane & Vindu Goel, Fake Russian Facebook Accounts Bought $100,000 in Political Ads, N.Y. Times (Sept. 6, 2017), https://www.nytimes.com/2017/09/06/technology/facebook-russian-political-ads.html [https://perma.cc/G4VF-PKUT]. ↑
- Rachel Roberts, Russia Hired 1,000 People to Create Anti-Clinton ‘Fake News’ in Key US States During Election, Trump-Russia Hearings Leader Reveals, Independent (Mar. 20, 2017), http://www.independent.co.uk/news/world/americas/us-politics/russian-trolls-hilary-clinton-fake-news-election-democrat-mark-warner-intelligence-committee-a7657641.html [https://perma.cc/DSS4-XEPT]. ↑
- Shane & Goel, supra note 309. ↑
- Bence Kollanyi, Philip N. Howard & Samuel C. Woolley, Computational Propaganda Project, Bots and Automation Over Twitter During the U.S. Election (2016), http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2016/11/Data-Memo-US-Election.pdf [https://perma.cc/LHX3-EULD]. ↑
- Id. ↑
- Id. ↑
- Nathaniel Persily, The 2016 U.S. Election: Can Democracy Survive the Internet?, 28 J. of Democracy 63, 70 (Apr. 2017). ↑
- Kollanyi, Howard & Woolley, supra note 312. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- See Tim Wu, Is the First Amendment Obsolete?, Knight First Amendment Institute (Sept. 2017), https://knightcolumbia.org/content/tim-wu-first-amendment-obsolete [https://perma.cc/K9MS-K64G]. ↑
- Id. ↑
- Jessica Kwong, Twitter Users Got More Fake News than Real News Before Trump Won Election, Newsweek (Sept. 28, 2017), https://www.newsweek.com/twitter-users-got-more-fake-news-real-news-trump-won-election-673720 [https://perma.cc/E3CK-TQNZ] (quoting Philip N. Howard, et al., Computational Propaganda Project, Social Media, News and Political Information during the US Election: Was Polarizing Content Concentrated in Swing States? (Sept. 28, 2017), http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/09/Polarizing-Content-and-Swing-States.pdf) [https://perma.cc/SP23-BQ97]. ↑
- Wu, supra note 320. ↑
- See, e.g., Removing Bad Actors on Facebook, Facebook Newsroom (July 31, 2018), https://newsroom.fb.com/news/2018/07/removing-bad-actors-on-facebook [https://perma.cc/F6UN-PENX]; Taking Down More Coordinated Inauthentic Behavior, Facebook Newsroom (Aug. 21, 2018), https://newsroom.fb.com/news/2018/08/more-coordinated-inauthentic-behavior [https://perma.cc/WU63-8TMU]. ↑
- Nicholas Fandos & Kevin Roose, Facebook Identifies an Active Political Influence Campaign Using Fake Accounts, N.Y. Times (July 31, 2018), https://www.nytimes.com/2018/07/31/us/politics/facebook-political-campaign-midterms.html [https://perma.cc/4Y4D-8QJE]; Sheera Frenkel & Nicholas Fandos, Facebook Identifies New Influence Operations Spanning Globe, N.Y. Times (Aug. 21, 2018), https://www.nytimes.com/2018/08/21/technology/facebook-political-influence-midterms.html?action=click&module=inline&pgtype=Homepage [https://perma.cc/2E8B-CZUL]. ↑
- Daisuke Wakabayashi, Google Deletes 39 YouTube Channels Linked to Iranian Influence Operation, N.Y. Times (Aug. 23, 2018), https://www.nytimes.com/2018/08/23/technology/google-youtube-iranian-influence.html?smid=nytcore-ios-share [https://perma.cc/JJY9-UFQD]. ↑
- Kate Conger & Adam Satariano, Twitter Says It Is Ready for the Midterms, but Rogue Accounts Aren’t Letting Up, N.Y. Times (Nov. 5, 2018), https://www.nytimes.com/2018/11/05/technology/twitter-fake-news-midterm-elections.html [https://perma.cc/PW2K-YMTK]. ↑
- Sheera Frenkel & Mike Isaac, Russian Trolls Were at It Again Before Midterms, Facebook Says, N.Y. Times (Nov. 7, 2018), https://www.nytimes.com/2018/11/07/technology/facebook-russia-midterms.html [https://perma.cc/76NF-KXHL]. ↑
- Benkler, Election Advertising Disclosure: Part 2, supra note 308. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Bershidsky, supra note 294. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Id. ↑
- Vogel & Kang, supra note 10. ↑
- Id. ↑
- See, e.g., April Glaser, The FEC is Basically Useless, Slate (Oct. 4, 2017), http://www.slate.com/articles/news_and_politics/future_tense/2017/10/congress_needs_to_do_something_about_facebook_s_russian_political_ad_problem.html [https://perma.cc/56MQ-U3JF]; Emily Stewart, Banks Have to Know Their Customers. Shouldn’t Facebook and Twitter?, Vox (Mar. 19, 2018), https://www.vox.com/policy-and-politics/2018/3/19/17038130/facebook-twitter-russia-regulations-laws [https://perma.cc/8RLL-P6RH]. ↑
- Shane & Goel, supra note 309. ↑
- See id.; see generally Shane, supra note 16 (showing a sample of ads Russia purchased). ↑
- Facebook, supra note 219 (citing Kate Kaye, Data-Driven Targeting Creates Huge 2016 Political Ad Shift: Broadcast TV Down 20%, Cable and Digital Way Up, Ad Age (Jan. 3, 2017), http://adage.com/article/media/2016-political-broadcast-tv-spend-20-cable-52/307346 [https://perma.cc/5MM6-VBV5]). ↑
- See Persily, supra note 97. ↑