A few comments about the SAFE TECH Act, which aims to reform Section 230

U.S. Senators Mark R. Warner (D-VA), Mazie Hirono (D-HI), and Amy Klobuchar (D-MN) introduced the Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act on February 5, 2021. The bill aims to reform Section 230 of the Communications Decency Act (CDA) to, according to the press release, “allow social media companies to be held accountable for enabling cyber-stalking, targeted harassment, and discrimination on their platforms.”

Section 230 immunity would extend to speech, including commercial speech

The bill modifies the “26 words that created the internet,” currently Section 230(c)(1) of the CDA, which would become Section 230(c)(1)(A), striking out “information” and replacing it with “speech.” The reason for this change is not clear, since information is necessarily speech and “speech” is a broader concept than “information,” as it may encompass expressive conduct such as wearing an armband or burning one’s draft card in public. What matters under Section 230, at least in its current version, is that a “provider or user of an interactive computer service” cannot be held accountable for third-party content. This is why Twitter cannot be sued for defamatory speech posted by its users. The move is probably an attempt to take “conduct” out of Section 230 immunity, but it can be argued that, by doing so, the bill expands the scope of the immunity rather than reducing it.

In a case often mentioned by advocates of Section 230 reform, Daniel v. Armslist, the Supreme Court of Wisconsin held in 2019 that a complaint against a website through which a gun had been sold, later used to perpetrate a mass shooting that killed four persons, including the plaintiff’s mother, was barred. The Wisconsin Supreme Court reversed the decision of the court of appeals and affirmed the circuit court’s dismissal of the plaintiff’s complaint as barred by Section 230(c)(1), because the plaintiff’s claim for relief required the arms-selling platform to be treated as the publisher or speaker of information posted by third parties on its site. The court considered Armslist to be an interactive computer service provider under Section 230, but also described it as “a firearm advertising website.” Advertising is commercial speech, and it thus seems that the change made by the bill would not affect similar cases. Professors Danielle Keats Citron and Mary Anne Franks wrote in 2020 that Section 230 defenders “presum[e] that the internet is primarily, if not exclusively, a medium of speech,” and that this “presumption… should be interrogated” because the Internet is now used for all sorts of activities, from making dinner reservations to looking for a soul mate, not merely as a “speech-machine.” So why did the drafters change “information” to “speech”? Time (and upcoming hearings) may tell…

The bill would also create a Section 230(c)(1)(B), which would place on defendants the burden of proving, by a preponderance of the evidence, that they are not the publisher or speaker of third-party speech. As such, the immunity would become an affirmative defense. A social media platform would have to prove that it is more likely than not that it is not the publisher or speaker of a particular third-party “speech,” not merely “information.”

This reminds me of another bill aimed at modifying Section 230, the EARN IT Act, so named because Section 230 immunity would have to be “earned.” Senator Blumenthal stated in his March 11, 2020 opening statement at the Senate Committee on the Judiciary hearing on the bill that “immunity is not a right, it should be earned or deserved and certainly should not be continuing if these companies fail to observe and follow the basic moral obligations that they have under the law, but it is a matter of simple morality.” At least, the SAFE TECH Act would only require that platforms prove they are not a publisher or speaker, not their morality… However, this change means that litigation about speech published on a platform is likely to become more costly.

The bill would make it more difficult for small companies to enter the market

Senator Warner posted a FAQ page about the bill affirming that concerns that the bill may “expos[e] small tech companies and startups to liability and increased litigation costs,” and thus “drive them out of business and simply entrench the dominant players,” are “gravely exaggerated.” Why? Because plaintiffs would be less likely to sue, knowing that small companies are less likely to be able to pay damages. This argument is weak and does not consider that small companies and startups will perceive this new risk as a liability and may thus decide not to enter the market or not to create a new service. Mark Weinstein, founder and CEO of the social media platform MeWe, wrote an opinion piece in the Wall Street Journal last month arguing that small platforms need Section 230 immunity to compete with the giant platforms.

Section 230 immunity would exclude commercial speech

At first glance, the change to Section 230(c)(1)(A) appears to extend the immunity of providers and users of interactive computer services, as the bill would extend the immunity for third-party content to “speech.” “Speech” is expression, not merely information, and encompasses commercial speech. While the First Amendment treats commercial speech differently from other types of speech, Section 230, as of now, does not differentiate between commercial and non-commercial speech.

However, should the bill become law, Section 230(c)(1) immunity would no longer be afforded to a provider or user of an interactive computer service that has “accepted payment to make the speech available or, in whole or in part, created or funded the creation of the speech.” Twitter already offers a Promoted Tweets service, which allows advertisers to promote a particular tweet for a fee in order “to reach a wider group of users or spark engagement from their existing followers.” Twitter is also considering offering subscription-based accounts which would provide premium services.

Social media platforms are not the only “providers or users of an interactive computer service” able to benefit from Section 230; commercial platforms do too, with some exceptions, however. For instance, the Court of Appeals for the Third Circuit recently held that Amazon is a “seller” under Pennsylvania law and thus that the plaintiff’s negligence and strict liability claims against the platform were not barred by Section 230. However, the plaintiff’s claim that Amazon failed to add a warning about the product, a dog leash which had severely injured the plaintiff, was barred by Section 230.

According to Senator Warner’s press release, such a move is warranted so that Section 230 immunity no longer applies to ads or other paid content, thus “ensuring that platforms cannot continue to profit as their services are used to target vulnerable consumers with ads enabling frauds and scams.” This is a rather heavy-handed way to address the issue of scams, as it has the potential to wipe out trusted business models such as that of Etsy, a platform offering users a way to sell their handmade or “vintage” products online, which may become less keen to host sales of products it cannot thoroughly vet.

The bill was probably inspired by cases such as Herrick v. Grindr, in which the Second Circuit Court of Appeals affirmed the decision of the United States District Court for the Southern District of New York granting Grindr’s motion to dismiss.

The Second Circuit considered that “Herrick’s claims treat[ed] Grindr as the publisher or speaker of the offensive content,” noting that “[a]t its core, § 230 bars lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions–such as deciding whether to publish, withdraw, postpone or alter content.” Whether the content is speech or information does not matter under the current statute.

The bill’s multiple carve-outs

Section 230, as updated by the SAFE TECH Act, would not bar plaintiffs from seeking injunctive relief if they claim that an interactive computer service provider has failed to remove, restrict access to, or prevent the dissemination of “material that is likely to cause irreparable harm.” Indeed, injunctive relief is a remedy used to prevent irreparable harm. Courts consider several factors when deciding whether to grant such injunctions, such as the irreparable harm which may occur if the injunction is not granted, the likelihood of success on the merits, whether there are sufficiently serious questions going to the merits to make them fair grounds for litigation, and whether the balance of hardships tips decidedly in the movant’s favor.

Herrick v. Grindr, which is alluded to in the press release, may again have inspired the bill. In this case, the plaintiff’s former boyfriend had allegedly created a false Grindr profile for the plaintiff, which led to the plaintiff being harassed by hundreds of men who believed he had contacted them online seeking sexual relations. United States District Judge Caproni of the Southern District of New York declined to extend a Temporary Restraining Order, a form of injunctive relief, because “[a]t this early stage, it appears likely to the Court that Section 230 bars many (if not all) of Plaintiff’s tort claim.” The plaintiff was seeking an order requiring Grindr to “immediately disable all impersonating profiles created under Plaintiff’s name or with identifying information relating to Plaintiff, Plaintiff’s photograph, address, phone number, email account or place of work.”

No effect on civil rights laws

The bill would have no effect on civil rights laws, antitrust laws, stalking, harassment, or intimidation laws, international human rights law, or wrongful death actions. The FAQ page explains that the SAFE TECH Act aims at protecting victims of online abuse, particularly those who belong to minorities and other marginalized groups. The FAQ page also notes that “online harms spread to the real world,” in Charlottesville, Kenosha and in the U.S. Capitol, and that the bill would give victims “an opportunity to hold platforms accountable when their deliberate inaction or product design decisions produce real-world harm.”

The press release cites Professor Olivier Sylvain regarding companies being given a “free pass for enabling discriminatory conduct” by Section 230. Professor Sylvain argued in an article published in 2020 that “online intermediaries were avowedly laissez faire about user-generated content” from around 1995 to 2010, but that now “only a handful of firms control the ways in which the vast majority of information flows to users around the world.” The SAFE TECH Act appears to have only these giant companies in mind. For Professor Sylvain, the language of Section 230(c)(1) “was meant to head off judicial application of defamation law to online platforms.” However, the “blanket immunity” Section 230 provided has given these platforms “license to disregard expressive conduct that perpetuates or deepens harms for which policymakers have drafted legal protections.” This view influenced the drafters of the bill, and the press release explains that the law, as modified, would not “impair enforcement of civil rights laws [thus] maintaining the vital and hard-fought protections from discrimination even when activities or services are mediated by internet platforms.”

The bill also carves out an exception for suits under the Alien Tort Claims Act. The goal, as explained in the press release, is to allow “victims of platform-enabled human rights violations abroad (like the survivors of the Rohingya genocide) to seek redress in U.S. courts against U.S.-based platforms.” As such, foreign plaintiffs would still be able to sue social media platforms, claiming, for instance, violations of international law. Facebook recognized in 2018 that its platform had been used by the Myanmar military to coordinate “inauthentic behavior,” referring to the Rohingya genocide. Facebook stated that its decision to remove these pages “was based on the behavior of these actors rather than on the type of content they were posting,” thereby acknowledging that online speech may lead to real-world crime. As explained by Paul Mozur of The New York Times, “Myanmar military personnel… turned the social network into a tool for ethnic cleansing.”


Image (public domain) taken by Flickr user citytransportinfo 

Image (public domain) taken by Flickr user John K Thorne
