
Supreme Court takes on case that could be most serious threat to Section 230 immunity yet

The suit was filed by the family of an American woman slain in France.


Claire Goforth

Tech

Posted on Oct 3, 2022

The United States Supreme Court agreed to hear a case challenging tech companies’ immunity under Section 230 of the Communications Decency Act. The suit, filed by the family of an American woman slain in France during a series of coordinated terrorist attacks, alleges that YouTube facilitated the spread of Islamic State content that helped inspire the slayings.

Section 230 grants broad immunity to tech companies for content posted on their sites. The law has come under fire in recent years. Conservatives argue the law enables tech companies to disproportionately ban and penalize them; liberals claim it shields companies from liability for enabling abuse and harassment and spreading disinformation.

Those who defend the law argue that removing this immunity could put social media companies out of business or lead them to severely curtail speech, because they would be exposed to liability for content posted on their platforms.

The family of Nohemi Gonzalez, the woman whose murder is at the center of the suit, claims that Google-owned YouTube’s recommendation algorithm enabled ISIS to reach a larger audience by surfacing its posts to users. NBC reports that the family argues the algorithm defeats Section 230 immunity because YouTube effectively takes an active role by selecting what content to show users. They claim the company “knowingly permitted ISIS to post on YouTube hundreds of radicalizing videos inciting violence,” videos they say enabled the terrorist organization to recruit members and helped inspire the murders of Gonzalez and others in France in 2015.

Social media companies use algorithms to select content for users based on their usage history. Algorithms are designed to keep people on the platform as long as possible, thus increasing engagement and revenue. Algorithms have long been criticized for having unintended effects, such as showing people increasingly extremist and conspiracist content.

A 2019 report found that users who watched sexually themed content on YouTube were recommended videos of younger and younger people, eventually including children. YouTube reacted by banning children 13 and younger from live streaming without an adult present, altering its algorithm, and removing some videos that had apparently become popular with pedophiles.

Last year, a bipartisan group of members of Congress introduced the Filter Bubble Transparency Act, which would require platforms to offer an algorithm-free setting in which users’ timelines appear chronologically rather than by algorithmic curation. Most major platforms, including YouTube, already offer this option. The bill didn’t make it out of committee.

Legislators and the courts have continued to challenge Section 230 and big tech companies.

A federal court originally dismissed the Gonzalez family’s suit, which sought to find YouTube liable for violating the Anti-Terrorism Act. An appeals court subsequently revived it.

Google urged the court to decline to hear the case, arguing that the suit was unlikely to succeed even if the court found that Section 230’s immunity didn’t apply.

On Monday, the U.S. Supreme Court granted the Gonzalez family’s petition and agreed to hear the case. It will also hear a case against Twitter that similarly argues the company isn’t shielded from liability when terrorist groups use its platform for recruitment and radicalization.

First Published: Oct 3, 2022, 2:26 pm CDT