The US Supreme Court handed social media companies a new line of defense as they face an increasing number of lawsuits alleging their algorithms have harmed users — whether by addicting children or selling suicide kits.
In a unanimous ruling, the justices rejected a case alleging that Alphabet Inc.'s Google, Twitter Inc. and Meta Platforms Inc. should be held liable for "aiding and abetting terrorism" by hosting Islamic State videos and posts and recommending them to users.
The court said that under a federal antiterrorism law, social media companies can’t be held responsible simply for deploying algorithms that in some cases recommend harmful content. That’s an argument that the tech giants will likely point to as they face cases across the country on other issues.
“The court’s decision is pretty clear that generally providing a platform, even if that platform has recommendation algorithms, is not enough to find liability just because some people used that platform to do criminal things,” said Evelyn Douek, a professor at Stanford Law School who studies online speech. “So platforms will be able to point to that general principle in other cases in the future.”
The social media companies notched a significant win on Thursday when the Supreme Court declined to weigh in at all on Section 230, a statute that shields online platforms from most lawsuits over posts by their users. That means the high court likely won't try to pare back the tech industry's prized legal shield anytime soon.
‘Agnostic’ Algorithms
But social media’s headaches go beyond just the fate of Section 230. Meta and other social media companies will soon face consolidated litigation in California over claims that Facebook, Instagram, and other platforms cause addiction and self-destructive behavior in adolescents. That case was on hold as the Supreme Court considered its social media cases. Meanwhile, Amazon.com Inc. is facing a lawsuit accusing the platform of selling suicide kits to teenagers.
Tucked into the court’s opinion is an assertion about the nature of social media algorithms that the companies may be able to point to in battling lawsuits over these other topics.
“As presented here, the algorithms appear agnostic as to the nature of the content, matching any content (including ISIS’ content) with any user who is more likely to view that content,” wrote Justice Clarence Thomas on behalf of the court. “The fact that these algorithms matched some ISIS content with some users thus does not convert defendants’ passive assistance into active abetting.”
There are significant portions of Thursday’s decision that the big tech companies will likely use to their advantage, experts said. Thomas in his opinion also wrote that social media platforms do not have a “duty” to remove customers using their platforms for illicit activity.
"The court categorically states these internet services do not owe a duty of care to their users with regards to the Anti-Terrorism Act," said Jess Miers, legal advocacy counsel with the tech-funded group Chamber of Progress. Miers previously worked for Google.
“Plaintiffs in the past have tried to claim internet services have a ‘duty’ to their users, but they don’t actually have that legal relationship with users,” Miers said.
Punting to Congress
The debate over social media’s culpability in enabling the spread of harassment, violence, hate speech, misinformation and more is only set to intensify. Some lawmakers on Thursday said the Supreme Court missed an opportunity by failing to weigh in on Section 230 and said it’s time for Congress to step in.
“Companies turn a blind eye to child sexual abuse material, drug sales, and border smuggling on their platforms while rushing to censor free speech,” said Tennessee Republican Senator Marsha Blackburn. “Congress must reform the law.”
But efforts to reform Section 230 have failed amid partisan fighting for years. For now, the Supreme Court's opinions on Thursday will likely boost the social media companies' arguments. The high court this summer will decide whether to weigh in on laws in Texas and Florida that bar internet platforms from removing political speech. Those cases would tee up a novel set of questions about social media's relationship with the First Amendment, and legal experts are divided over whether Thursday's rulings offer any hints about how the court would approach them next year.
“The court seemed to understand the content moderation difficulty of online services,” said Miers. “They seem to appreciate the technology.”