ISIS terrorists, Big Tech companies, and Section 230 protections collided at the Supreme Court on Tuesday as justices heard arguments in Gonzalez v. Google, a case that grapples with what liability, if any, Google could bear for content that its YouTube video-sharing platform's algorithms recommend to users.
As Amy Howe explained over at SCOTUSblog, the case bears the name of American Nohemi Gonzalez, who was killed in a 2015 ISIS attack in Paris at just 23 years old. Her family brought the case against YouTube's parent company alleging that Google aided ISIS by allowing videos — aimed at inciting violence and recruiting new members — to exist on its platform and be served to users by algorithms Google built.
Howe's SCOTUSblog review of Gonzalez v. Google continues, explaining how the case made it to the Supreme Court and the arguments being made:
A divided panel of the U.S. Court of Appeals for the 9th Circuit ruled that Section 230 protects such recommendations, at least if the provider’s algorithm treated content on its website similarly. The majority acknowledged that Section 230 “shelters more activity than Congress envisioned it would.” However, the majority concluded, Congress – rather than the courts – should clarify how broadly Section 230 applies. The Gonzalez family then went to the Supreme Court, which agreed last year to weigh in.
In the Supreme Court, the Gonzalez family insists that recommendations are not always shielded from liability under Section 230. Whether they are protected, the family says, hinges on whether the defendant can meet all of the criteria outlined in Section 230, which bars providers of “an interactive computer service” from being “treated as the publisher … of any information provided by” a third party. For example, the family argues, Section 230 does not protect a defendant from liability for recommendations that contain material that the defendant itself created or provided, such as URLs for the user to download or “notifications of new postings the defendant hopes the user will find interesting,” because in that scenario, the information would not be provided by someone else.
A website like YouTube is also not shielded from liability, the family continues, when it provides unsolicited recommendations that it thinks will appeal to users. In that scenario, the family asserts, the defendant is not providing access to a computer server (because the user is not making a request) and therefore is not acting as a “provider … of an interactive computer service.”
Justice Elena Kagan quipped during arguments on Tuesday that she and her Supreme Court colleagues "really don't know about these things" and "are not, like, the nine greatest experts on the Internet."
Kagan continued by acknowledging "the difficulty of drawing lines" in the area of Big Tech protections and suggesting that Congress, not the Court, should make any changes to the Communications Decency Act or its Section 230 if tweaks are deemed necessary. The law was passed in 1996 and may now be too outdated to deal with what the Internet has become in the 21st century.
"We're a Court. We really don't know about these things. These are not, like, the nine greatest experts on the Internet." -- #SCOTUS Supreme Court Justice Elena Kagan ... #GonzalezvGoogle pic.twitter.com/loyONdfss8— Howard Mortman (@HowardMortman) February 21, 2023
The Supreme Court will hear a similar case on Wednesday, Twitter v. Taamneh, in which it will be asked to "decide whether Twitter (along with Facebook and Google, which were also defendants in the lower courts) can be held liable, regardless of Section 230, for aiding and abetting international terrorism based on ISIS’s use of the companies’ platforms," also according to Howe.
That lawsuit was brought by relatives of Jordanian citizen Nawras Alassaf, who was killed in a 2017 ISIS attack in Istanbul. The case was first filed in a California federal court under the Antiterrorism Act, "which allows U.S. nationals to sue anyone who 'aids and abets, by knowingly providing substantial assistance,' international terrorism."
According to Howe, the case argues that "Twitter and the other tech companies knew that their platforms played an important role in ISIS’s terrorism efforts but, despite extensive press coverage and government pressure, did not act aggressively to keep ISIS content off those platforms."
Will the Supreme Court uphold the Communications Decency Act and its Section 230 protections for tech companies as they stand? Will the justices narrow the law's reach because it shelters too much activity? Or will they punt any decision on Section 230's future across the street to Americans' elected representatives in Congress? Stay tuned for more from Wednesday's arguments.