Section 230 of the 1996 Communications Decency Act is only 26 words long. But those 26 words and how U.S. courts have historically interpreted them are key reasons the internet has developed into what it is today.
And now, the U.S. Supreme Court is considering how and to what extent Section 230 can shield internet platforms from liability over content created by users. The two appeals, which went to oral arguments in front of the court at the end of February, could have significant implications.
Section 230 and Courts
Part of the larger Telecommunications Act of 1996, Section 230 of the Communications Decency Act states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Since its enactment, courts across the U.S. have taken a broad interpretation of how Section 230 protects internet platforms from liability over content created by users.
“I think anybody who does technology law or internet law, Section 230, is just sort of the foundation, for better or for worse, of American internet law,” said Blake Reid, a clinical professor at the University of Colorado Law School who teaches and practices at the intersection of law, policy and technology.
Reid added that, historically, courts have interpreted Section 230 as broad liability protection for internet platforms against claims over content uploaded and created by users. He said that, with a few exceptions, this consistent interpretation of Section 230 as a wide liability shield has been instrumental in the development of modern internet platforms, ranging in size from Google, Twitter and Facebook to the comment sections on a local newspaper’s website, which in turn have shaped modern political discourse, personal relationships, workplaces and more.
Questions about Section 230’s broad protections and their potential harms to people have long been discussed by policymakers and academics.
“It’s led to what I call a quarter century of interpretive debt,” said Reid, who explained that since courts have adopted a consistent interpretation of Section 230, they haven’t had to grapple with threshold questions of what existing laws can apply to internet platforms.
But now, for the first time, the U.S. Supreme Court is considering Section 230.
The Cases at Hand
The U.S. Supreme Court granted review and heard oral arguments last month in two related appeals out of the 9th Circuit.
The first case, Reynaldo Gonzalez et al. v. Google, was brought by the family of Nohemi Gonzalez, a 23-year-old American student killed during the November 2015 attacks by ISIS terrorists in Paris.
Her family brought a complaint against Google, as the owner of YouTube, under the Anti-Terrorism Act, which creates causes of action against parties that provide material assistance to, and aid and abet, terrorist conduct. The Anti-Terrorism Act was amended in 2016 by the Justice Against Sponsors of Terrorism Act to add secondary civil liability for acts of international terrorism. The lawsuit alleges Google was directly and secondarily liable for Nohemi Gonzalez’s death by allowing ISIS to post videos to YouTube and by promoting, through its recommendations algorithm, videos that were critical to the growth of ISIS.
The Gonzalez family’s lawsuit was dismissed by a federal district court based on Section 230. A panel of the 9th Circuit Court of Appeals upheld the dismissal last year, and the Supreme Court granted review in October 2022 to consider whether Section 230 “immunize[s] interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limit[s] the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information?”
The second case, Twitter v. Mehier Taamneh et al., was brought by the American citizen relatives of Nawras Alassaf, one of 39 people killed in 2017 when Abdulkadir Masharipov opened fire at a nightclub in Turkey.
The lawsuit, filed against Twitter, Google and Facebook, alleges the attack was carried out at the direction of ISIS. It claimed that because ISIS followers (though not Masharipov specifically) used the companies’ platforms, and because moderators failed to remove all of those users’ accounts and posts, the platforms aided and abetted terrorists under the Anti-Terrorism Act.
A federal district court dismissed the lawsuit, reasoning that the plaintiffs’ claims didn’t state plausible grounds for relief, without reaching whether Section 230 barred the claims.
The 9th Circuit reversed the dismissal in the same opinion that decided the Gonzalez appeal, finding the Anti-Terrorism Act claims plausible and holding that, because the Section 230 defense was never addressed below, the lawsuit could be reinstated. The 9th Circuit declined to rehear the case, and on remand both parties agreed to automatic dismissal if the U.S. Supreme Court declined to hear Gonzalez.
However, since the court agreed to hear Gonzalez, Twitter asked the court to hear two questions concerning liability under the Anti-Terrorism Act: Is a defendant that offers generic, widely available services and works regularly to detect and prevent terrorists from using those services liable under the act because it could have taken more “meaningful” or “aggressive” actions to prevent terrorist use? And, if a defendant’s generic, widely available services were not used in connection with the specific act of terrorism that injured the plaintiff, is the defendant still liable for aiding and abetting that act?
The cases went to oral arguments in front of the court Feb. 21 and Feb. 22. The Supreme Court will likely publish its opinion on the cases later this year.
Drawing Virtual Lines
Reid said that, as the first SCOTUS cases to weigh Section 230, the appeals could have significant impacts on internet platforms and the billions of people who use them every day.
“The legal issues are complicated and esoteric, but the implications are hugely serious,” emphasized Reid.
“This is the first time we’re revisiting those kinds of threshold questions that all the courts have been so consistent about that we always kind of just treated as settled law,” he said. “But now, the Supreme Court is coming in and taking a wider-ranging inquiry than courts have ever sort of taken before on this.”
Reid doesn’t have predictions for how the court will rule. He said there’s no shortage of possibilities ranging from dismissing the appeal as improvidently granted to upending Section 230’s protections and everything in between.
But a few things did stick out to Reid from oral argument about the tough and abstract questions the Supreme Court is being asked to decide.
At oral arguments, he said the justices appeared to be looking for a line that separates a user’s speech from a platform’s speech. In Gonzalez, that came down to questions about YouTube’s video thumbnail creation algorithm and its recommended videos algorithm.
An algorithm is a set of rules that determines how content is organized and presented. Offline, the Dewey Decimal System or an alphabetically organized phone book are examples of algorithms. But online, how content is organized can range widely in complexity, and Reid said the justices seemed to grapple with where to draw the line when it comes to the impact of algorithms on content.
While a simple algorithm that presents content based on upload time, for example, might not cross the line, algorithms that present content based on multiple factors such as previously watched videos, user demographics and more are less clear cut.
“The justices I think were very attuned to the possibility that on the other end of the spectrum, you could get a very elaborate sort of recommendation algorithm … that is, now this is the platform adding something new and, whatever that something new is, that’s the platform’s, that’s not the content,” explained Reid.
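To make that spectrum concrete, here is a minimal, hypothetical sketch in Python. It is not drawn from YouTube’s or any platform’s actual code; the signals and weights are invented purely to illustrate the difference between ordering content by upload time and ranking it with the platform’s own blend of user signals.

    # Hypothetical sketch: two ways a platform might order the same uploads.
    # Nothing here reflects any real platform's systems; it only illustrates
    # the spectrum of algorithmic complexity discussed at oral argument.
    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        upload_time: float        # Unix timestamp
        topic: str
        similar_watches: int      # how often this viewer watched similar content

    def chronological_feed(videos):
        """'Simple' end of the spectrum: order purely by upload time."""
        return sorted(videos, key=lambda v: v.upload_time, reverse=True)

    def recommended_feed(videos, topic_affinity):
        """'Elaborate' end: blend several signals about the viewer into one score."""
        def score(v):
            return (0.6 * topic_affinity.get(v.topic, 0.0)   # viewer's interests
                    + 0.3 * v.similar_watches                 # viewing history
                    + 0.1 * v.upload_time / 1e9)              # small recency boost
        return sorted(videos, key=score, reverse=True)

In the first function, the ordering is determined entirely by when users uploaded their videos; in the second, the platform’s own weighting of signals about the viewer decides what rises to the top, which is roughly where the justices’ line-drawing questions focused.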
Reid also thinks the justices may have realized the appeals are so complex they might not be the best vehicles to address Section 230.
“I think the questions indicated … the justices are understanding that it’s a pretty complicated case, that it’s not a straightforward case, that it’s not a really exceptional case. It’s aimed right at the heartland of how a lot of internet platforms operate and there are real consequences to how the ruling will come out,” said Reid.
To Reid, the appeals and their potential impacts also bring up larger questions around the development of the internet, how lawmakers haven’t kept pace and how courts fit into the picture.
“I think this is actually a case where the technology that we’re talking about here is entirely enabled by the law,” said Reid. Since federal lawmakers haven’t passed Section 230 reform, he said, courts have relied on precedent and internet platforms have developed under the understanding that they have broad liability protections.
If the court overhauls Section 230’s protections, Congress will likely need to step in to clarify how Section 230 does or doesn’t apply to internet platforms, Reid said. But with the increased politicization of technology, that might not be a simple or timely task.
At the end of the day, though, Section 230, and how the court’s ruling will affect it, is far from simple.
“I think if there’s any ground truth on this at all, it’s that this is really complicated, there are no straightforward answers,” said Reid.