Since the passage of Section 230 of the Communications Decency Act (“CDA”), most federal circuits have interpreted the CDA to establish broad federal immunity from causes of action that would treat service providers as publishers of third-party content. The CDA was passed in the early days of e-commerce and was written broadly enough to cover not only the online message boards and static websites that were common then, but also more modern online services, web 2.0 offerings, and today’s platforms that may use algorithms to organize, repackage, or recommend user-generated content.
Over 25 years ago, in the landmark Zeran v. America Online decision, the first major circuit-level ruling interpreting Section 230, the Fourth Circuit held that Section 230 bars lawsuits that, in essence, seek to hold a service provider liable for its exercise of “a publisher’s traditional editorial functions – such as deciding whether to publish, withdraw, postpone or alter content.” Courts have since generally followed this reasoning in determining whether an online provider is being treated as a “publisher” of third-party content and is therefore entitled to immunity under the CDA. The scope of these “traditional editorial functions” is at the center of a case currently on the Supreme Court’s docket. On October 3, 2022, the Supreme Court granted certiorari in an appeal asking whether a social media platform’s targeted algorithmic recommendations fall within the “traditional editorial functions” protected by the CDA, or whether such recommendations are not the actions of a “publisher” and therefore fall outside the CDA’s immunity. (Gonzalez v. Google LLC, No. 21-1333 (U.S. cert. granted Oct. 3, 2022)).
In Gonzalez, the Ninth Circuit affirmed the district court’s dismissal of claims under the Anti-Terrorism Act (ATA), 18 U.S.C. § 2333, alleging that Google provided “material support” to ISIS by allowing terrorists to use YouTube (temporarily, before known accounts were terminated or content was blocked) as a tool to facilitate recruitment and the commission of terrorism. The court held that Google was entitled to CDA immunity on most claims, concluding that “a website’s use of content-neutral algorithms, without more, does not expose it to liability for content posted by third parties,” and that since the early days of the Internet websites have always decided how to display third-party content and to whom to show it, yet “no case law denies § 230 immunity because of the ‘matchmaking’ results of such editorial decisions.” This decision followed the reasoning of a trio of notable circuit-level opinions rejecting plaintiffs’ attempts to circumvent CDA immunity on the theory that online service providers lose that immunity when they algorithmically recommend third-party content, or repackage it in another form, for other users of the site.
The question presented in the Gonzalez petition is: “Does Section 230(c)(1) immunize interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limit the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information?”
Before the advent of the web, a particular edition of a print newspaper was static: everyone who received that edition saw the same newspaper, with the same front page and the same headlines. Today, social media platforms, using algorithmic tools, can recommend, highlight, or share dynamic content based on a user’s profile and prior inputs. Still, revenge porn, defamation, disinformation, terrorist materials, and other objectionable content remain a reality on online platforms with millions (or billions) of users. The Gonzalez case centers on instances where a platform’s neutral automated tools unknowingly boost or recommend such content to users. The petitioners in Gonzalez argue that courts have strayed from the path Congress set when it passed the CDA, and that automated recommendations of harmful content fall outside the CDA’s protections because they are not a publisher’s “traditional editorial function.” However, multiple circuit courts have issued opinions (sometimes over dissents) holding that such algorithmic recommendations are editorial decisions protected under the CDA, not unlike a newspaper publisher deciding what to feature on the front page.
A Supreme Court decision narrowing the scope of what counts as “publishing” activity under the CDA would no doubt have a huge impact on social media platforms and other providers that host third-party content. Many types of online services (e.g., dating apps, search engines, online communities, e-commerce businesses, virtual worlds) use algorithms designed to match third-party information or profiles with a consumer’s implied interests, or to organize and distribute third-party information to form connections or foster engagement. If the Supreme Court accepts the Gonzalez petitioners’ “matchmaking” argument, such a ruling would create a major carve-out from the CDA, under which a modern provider could lose immunity whenever automated tools are used to organize and recommend content provided by third parties. Indeed, Internet services have long relied on CDA immunity while using automated editorial tools to repackage or highlight third-party content displayed to users based on, among other things, users’ geolocation, chosen language, and profile information. And in practice, one reason Section 230 of the CDA is so effective is that providers can often dispose of lawsuits over third-party content at the motion-to-dismiss stage: one can imagine that, if the Court rules that algorithmic tools fall outside the CDA, future claims against providers will be pleaded with that carve-out in mind, and such litigation could become protracted, with drawn-out discovery into what automated tools may have been used and how content may have been repackaged.
This is a case we will be following closely. As the petitioners rightly state, whether CDA immunity covers algorithm-generated recommendations “is of enormous practical importance.” With no consensus in Congress on how to reform the CDA (a delicate task, as it is difficult to regulate objectionable content online without impacting the vibrant Internet), there is a possibility that the Supreme Court could do the reforming itself and change the way online platforms manage content going forward. If so, the law of unintended consequences will surely play a role in the ongoing operations of most online businesses in the future.