Less than a week after issuing an order vacating its March 2021 opinion in a major Communications Decency Act ("CDA") case and granting a petition for rehearing, the Second Circuit issued a new opinion reaffirming Section 230 CDA "protection" for the video-sharing site Vimeo, Inc. ("Vimeo"). (Domen v. Vimeo, Inc., No. 20-616 (2d Cir. July 21, 2021) (amended opinion)).
It is not entirely clear why the Second Circuit granted rehearing and revised its original opinion only to reach essentially the same result. It is possible that, given the scrutiny surrounding the CDA, the court saw fit to narrow the language of its original ruling to insulate it from possible Supreme Court review (recall that Justice Thomas previously issued a statement, following a denial of certiorari in an earlier CDA case, suggesting that "in an appropriate case" the Court should consider whether the text of the CDA aligns with the current state of immunity enjoyed by internet platforms). The Second Circuit's amended decision waters down some of the strongest language of its earlier opinion enunciating broad CDA immunity (for example, swapping the word "immunity" for "protection" when discussing the CDA). The court even reflected in dicta near the end of the opinion on the types of claims that might fall outside the CDA's protection, as if to signal that CDA Section 230 immunity is broad, but not as broad as its detractors suggest.
Yet despite narrowing its original opinion, the court reached the same result by the same reasoning. Like the original (now-vacated) March 2021 opinion, the Second Circuit's amended decision rests on Section 230(c)(2), the Good Samaritan provision, which allows online providers to self-regulate by moderating third-party content in good faith without fear of liability. Unlike the original opinion, the amended opinion also rejected the plaintiff's claims on the merits, finding insufficient the allegations of discrimination premised on similar videos uploaded by other users remaining available on the site (thus further reducing the possibility of Supreme Court review).
The case involved a user who challenged Vimeo's decision to close the account of his organization, Church United, for posting objectionable videos that violated Vimeo's terms. Vimeo's platform terms prohibit, among other things, content that "(c) contains hateful, defamatory, or discriminatory content or incites hatred against any individual or group." The terms also reference Vimeo's guidelines, which state that moderators will generally remove, among other things, videos that promote sexual orientation change efforts ("SOCE"). The videos in question were flagged by Vimeo as promoting SOCE. The plaintiff was then advised to remove the videos within 24 hours, or Vimeo could remove the videos or terminate his account. When the plaintiff did not remove the videos, he received an email informing him that his Vimeo account had been closed. The plaintiff contested the termination and filed various state discrimination claims against Vimeo.
In dismissing the complaint, the trial court had found that Vimeo was immune from the plaintiff's claims under two prongs of the CDA's immunity: the most commonly invoked, § 230(c)(1), which provides immunity to providers for claims treating them as the publisher of third-party content, and also § 230(c)(2), the "Good Samaritan" screening provision, which immunizes providers for good faith actions to police objectionable content. (Domen v. Vimeo, Inc., 433 F. Supp. 3d 592 (S.D.N.Y. 2020)). On appeal, the Second Circuit panel, in its original March 2021 decision, decided the case based solely on the "Good Samaritan" blocking provision and ruled that Vimeo was free to restrict access to material that it, in good faith, found objectionable, even if that moderation was imperfect. The court also summarily dismissed the plaintiff's claims that Vimeo closed his account in "bad faith."
The Second Circuit's second opinion in the case treads the same ground.
The court held that this was a straightforward case in which an online provider moderated and removed content that expressly violated its content guidelines, all protected by the CDA:
"Vimeo's deletion of Appellants' account was not anti-competitive conduct or self-serving behavior done in the name of content regulation. Instead, it was a direct consequence of Vimeo's content policies, which Vimeo communicated to Church United before deleting its account. In fact, the policy was communicated to Church United before it even joined the platform."
The court also rejected the plaintiff's argument that the alleged presence of other, similar questionable videos remaining available on the site meant that Vimeo's actions were not taken in good faith. The court noted the difficulty of managing the breadth of content hosted on a large platform. The appeals court described the CDA as protecting providers' good faith blocking under Section 230(c)(2) even if not all objectionable content is flagged and removed, reaffirming Congress's goal that the CDA remove disincentives to the development and use of blocking technologies.
"(T)he mere fact that Appellants' account was deleted while other videos and accounts discussing sexual orientation remained available does not imply bad faith. One purpose of Section 230 is to provide interactive computer services with protection from litigation for the removal of 'some, but not all, offensive material from their websites,' as Vimeo has done here. Given the sheer amount of user-generated content available on interactive platforms, imperfect exercise of content-policing discretion does not, without more, suggest that enforcement of content policies was not done in good faith." (quotations omitted).
Ultimately, the holding under the Good Samaritan provision is likely to discourage protracted litigation in this area and further empower other online platforms to restrict access to harassing or objectionable content that violates a site's terms.
Practical implications and lessons learned
There is a flood of CDA reform bills piling up in Congress, including a new bill introduced last week that would carve out an exception to CDA protection for misinformation during a public health emergency. Even with a new administration, a bipartisan desire to amend the CDA remains (although the justifications for reform differ across party lines). Many CDA reform bills contain provisions that aim to increase transparency and bind providers to their stated terms and content guidelines, on pain of losing CDA immunity for filtering decisions. The Vimeo case is a prime example of a provider that posted site terms and content policies governing user content, followed those procedures, communicated the possible consequences of noncompliance, and then took editorial action that was immunized under the CDA. As the court summed up:
"Section 230(c)(2) protects from liability providers and users of interactive computer services who voluntarily make good-faith efforts to restrict access to material they consider objectionable. . . . Here, Vimeo did just that: it removed Appellants' account for expressing pro-SOCE views that it in good faith considers objectionable. . . . [Appellants] ignored Vimeo's notice of their infringement and, as a result, Vimeo canceled their account. In suing Vimeo, Appellants stumble headfirst into Section 230, which 'allows computer service providers to set standards of decency without risking liability for doing so.'" (citations omitted).
Of course, not all moderation decisions will concern content expressly prohibited by a site's content policies (and some moderation measures may be urgent and require immediate action). Still, sites should take a moment to ensure they have broad terms designed to give users notice of what is and is not allowed on the site, and to give the service ample room to filter out a wide variety of offensive material. In this case, Vimeo's content guidelines expressly listed the content in question as prohibited on the site. There is, of course, no way to list every type of objectionable content, but detailing the types of content that are harmful or harassing and against the site's terms could be helpful in future litigation, giving courts an easier path to apply CDA immunity and end a case as early as possible.
In keeping with the appellate court's narrowing of its original broad holding, one of the new additions to the amended opinion is dicta near the end in which the court suggests some limits of the CDA defense:
"Our decision should not be read to confer immunity on providers acting in circumstances far removed from the facts of this case. Courts have rejected Section 230 defenses against claims of false advertising, deceptive business practices, and tortious interference. Judges, commentators, and the executive branch alike have expressed concern about Section 230's potential to protect companies engaging in anticompetitive behavior. Some claims sounding in contract or tort may fall outside the scope of Section 230(c)(2) protection. Our decision applies to the limited circumstances of this case and similar claims."
The court's list of claims that potentially fall outside the CDA is fairly general (and not exhaustive), and one can find decisions involving some version of these claims in which online providers ultimately prevailed on the basis of the CDA. However, online providers should study this list for clues about how litigants might try to circumvent CDA immunity in future cases to avoid an early dismissal.
Ultimately, however, Domen is an important decision, as it further solidifies a developing body of Second Circuit case law interpreting CDA immunity broadly. The holding was based on the Good Samaritan immunity of § 230(c)(2) and, therefore, is likely to be well cited in future cases, given the relative paucity of precedent in the area. Meanwhile, on the legislative front, we will be closely following developments surrounding the push for CDA reform.