The Supreme Court is stepping into a messy political fight next month over the meaning of a 26-word law that Big Tech firms describe as a linchpin of the modern internet but that critics say has led to the promotion of terrorism.
A closely watched dispute between YouTube owner Google and the family of an American killed in an Islamic State group attack in Paris in 2015 will put the court in the middle of a conflict about when internet companies may be successfully sued for content on their sites. It could also, as Facebook owner Meta put it, turn the internet into a “disorganized collection of haphazardly assembled information.”
Potentially at stake, the companies say, is the ability to get relevant results when searching Google for a local pizza shop or a video on how to perform CPR.
The case, Gonzalez v. Google, centers on a law known as Section 230 that drew intense criticism from former President Donald Trump over accusations that social media companies throttled conservative views. Many Democrats agree, for different reasons, that the law needs an update.
Why Google is at the Supreme Court
- The Supreme Court will hear arguments on Feb. 21 in a case about whether internet platforms may be liable for targeted recommendations, such as when YouTube suggests a follow-up video. The family of an American killed in a terrorist attack says YouTube, through its algorithms, recommended videos that aided extremists.
- Google and other internet platforms say recommending content is a “central building block” of the internet and that recommendations aren’t endorsements. If platforms are liable for suggestions, they say, it could change what gets recommended on many different platforms.
What’s Gonzalez v. Google about?
Nohemi Gonzalez was a 23-year-old American studying in Paris when members of the Islamic State group fired into a crowd at a bistro, killing her. She was one of 130 people killed in a coordinated attack across the city.
Gonzalez’s relatives sued Google alleging the company aided the Islamic State group by promoting its videos on YouTube.
Section 230, enacted when Americans were dialing into the internet, is widely interpreted as shielding Google from liability for hosting the videos. What’s at issue is whether recommending the videos to users, through the company’s algorithms, is also shielded.
It’s one of several major lawsuits percolating around internet regulation. The justices will hear a related case next month about whether social media companies may be sued under the Anti-Terrorism Act for “aiding and abetting” the Islamic State group. Separately, the court delayed a decision on whether to grant a review of laws in Texas and Florida that make it harder for social media companies to moderate content.
In those cases, the Supreme Court asked the Biden administration to offer the government’s view before deciding whether to hear them later this year or next.
How Google could win
Section 230 says that an internet company can’t be treated as a publisher of content posted on its platform by a user. Google, and many lower courts, read that as also protecting the “dissemination” of content online. Recommendations like those offered by YouTube, Big Tech firms say, are part of that dissemination.
“Recommendation algorithms are what make it possible to find the needles in humanity’s largest haystack,” Google told the Supreme Court.
“The line between just providing access to content and actively promoting that content is a lot slipperier than it initially appears,” said Christopher Yoo, a University of Pennsylvania law professor. If the court rules against Google broadly, he said, “it may sweep in every finding tool we have on the internet today.”
How Gonzalez could win
Others say that lower courts have read too much protection for internet platforms into the words of the law. In a brief that doesn’t take a side in the case, Sen. Ted Cruz, R-Texas, and other GOP lawmakers question how much Section 230 shields the companies from liability at all.
Conservatives see the case as part of a bigger fight against what they view as biased content moderation.
“The whole goal was to prevent things like sex trafficking and proliferation of child pornography,” said Sarah Parshall Perry, senior legal fellow at the conservative Heritage Foundation. “What it has become…is a shield for the moderation of content in keeping with the particular political perspectives of the major tech companies.”
Others oppose Google’s interpretation for different reasons. Common Sense Media, a child advocacy group, pointed to another Big Tech practice it believes shouldn’t be shielded from liability: the collection of personal data.
“What gets recommended is based on the data collection,” said Jolina Cuaresma, the group’s senior counsel of privacy and tech policy. “Adolescents are really at an unfair disadvantage here, because their brains are structurally different than ours.”
A decision is expected this year.
Contributing: Jessica Guynn