Partisanship has made the impasse worse. Republicans, some of whom have accused Facebook, Twitter and other platforms of censoring them, have pressured the companies to leave more content up. Democrats, by contrast, say the platforms should remove more content, such as health misinformation.
A Supreme Court case challenging Section 230 of the Communications Decency Act could have far-reaching ripple effects. While newspapers and magazines can be sued for the content they publish, Section 230 shields online platforms from lawsuits over most content posted by their users. It also protects platforms from lawsuits over their decisions to remove posts.
For years, judges have invoked the law in dismissing lawsuits against Facebook, Twitter and YouTube, ensuring the companies don’t incur new legal liabilities for every status update, post and viral video. Critics say the law is a get-out-of-jail-free card for tech giants.
“If they’re not responsible on the back end for any harm they’ve enabled, then they basically have a duty to be as reckless as possible,” said Mary Anne Franks, a law professor at the University of Miami.
The Supreme Court has previously declined to hear several cases challenging the statute. In 2020, it turned away a lawsuit by the families of people killed in terrorist attacks who argued that Facebook was responsible for promoting extremist content. In 2019, it declined to hear the case of a man who said his ex-boyfriend had sent someone to harass him using the dating app Grindr. The man had sued the app, saying its product was flawed.
But on Feb. 21, the court is scheduled to hear Gonzalez v. Google, a case brought by the family of an American killed in Paris by followers of the Islamic State. In the lawsuit, the family said Section 230 should not shield YouTube from accusations that the video site supported terrorism when its algorithm recommended Islamic State videos to users. The suit argues that such recommendations are a form of content generated by the platform itself, and thus fall outside Section 230’s protection.
A day later, the court is scheduled to hear a second case, Twitter v. Taamneh. It raises a related question: when are platforms legally liable under federal law for supporting terrorism?