Supreme Court frustrated by arguments, wary of limiting legal protections for tech companies
WASHINGTON — In a case that has the potential to change the fabric of the internet, the Supreme Court on Tuesday explored the limitations of a federal law that protects social media platforms from legal liability for content users post on their sites.
The justices seemed to think the positions on both sides were too extreme and expressed doubts about their ability to find a middle ground. “These are not, like, the nine greatest experts on the internet,” Justice Elena Kagan said of the Supreme Court.
Others had practical concerns. Justice Brett M. Kavanaugh said the court should not “crash the digital economy.”
The case was brought by the family of Nohemi Gonzalez, a 23-year-old college student who was killed at a Paris restaurant in a November 2015 terrorist attack that also targeted the Bataclan concert hall. The family’s lawyer, Eric Schnapper, argued that YouTube, a Google subsidiary, was liable because it used algorithms to push ISIS videos to interested viewers, based on information the company had collected about them.
“We’re focused on recommendations,” Mr. Schnapper said.
But Justice Clarence Thomas said recommendations were vital to making internet platforms useful. “If you’re interested in cooking,” he said, “you don’t want thumbnails on light jazz.” He later added, “I see those as suggestions and not really recommendations.”
Mr. Schnapper said YouTube bore responsibility for its algorithm, which he said systematically recommended videos that incited violence and supported terrorism. The algorithm, he said, is YouTube’s own speech, distinct from what the platform’s users post.
Justice Kagan pressed Mr. Schnapper on the limits of his argument. Did he also object to the algorithms Facebook and Twitter use to generate users’ feeds? Or to the results returned by search engines?
Mr. Schnapper said all of those could lose their protection under certain circumstances, an answer that seemed to surprise Justice Kagan.
Justice Amy Coney Barrett asked whether Twitter users could be sued for retweeting ISIS videos. Mr. Schnapper said the law at issue in the case might allow such a suit. “That’s content you’ve created,” he said.
The case concerns Section 230 of the Communications Decency Act, enacted in 1996, when the internet was still in its infancy. The law was a response to a court decision holding an online message board liable for content posted by a user because the service had engaged in some content moderation.
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” the provision says.
The provision helped fuel the rise of social networks like Facebook and Twitter by ensuring that the sites were not legally responsible for their users’ posts.
Malcolm L. Stewart, a lawyer for the Biden administration, argued in support of the family in Gonzalez v. Google, No. 21-1333. He said suits based on recommendations would seldom succeed, but that immunity under Section 230 should generally not be available for them.
Lisa S. Blatt, a lawyer for Google, said the provision gave the company complete protection from the lawsuit brought by Ms. Gonzalez’s family. YouTube’s algorithm is a form of editorial curation, like search engine results or a Twitter feed, she said. Without the ability to deliver content of interest to users, the internet would be a useless mess, she said.
“All publishing needs organization,” she said.
A ruling against Google would force sites either to take down any arguably problematic content or to allow all content, no matter how vile, she said. “You would have ‘The Truman Show’ versus a horror show,” she said.
Justice Kagan asked Ms. Blatt whether Section 230 would protect a “pro-ISIS” algorithm or an algorithm that promoted defamatory speech. Ms. Blatt said yes.
Section 230 has drawn criticism from across the political spectrum. Many liberals say it has shielded tech platforms from accountability for disinformation, hate speech and violent content. Some conservatives say the provision has made the platforms so powerful that they can effectively exclude right-wing voices from the national conversation.
On Wednesday, the justices will hear arguments in a related case, also sparked by a terrorist attack. The case, Twitter v. Taamneh, No. 21-1496, was brought by the family of Nawras Alassaf, who was killed in a terrorist attack in Istanbul in 2017.
At issue in that case is whether Twitter, Facebook and Google can be sued under the Antiterrorism Act of 1990 for abetting terrorism by allowing the Islamic State to use their platforms. If the justices reject that suit, Tuesday’s case against Google could become moot.
Whatever happens in the cases argued this week, which turn on the interpretation of statutes, the court is likely to agree to consider a looming First Amendment question raised by laws enacted in Florida and Texas: whether states may prevent large social media companies from removing posts based on the views they express.