CNN
—
The Supreme Court is set to hear oral arguments in the first of two cases this week that could reshape how online platforms handle speech and content moderation.
Oral arguments on Tuesday are for a case called Gonzalez v. Google, which zeroes in on whether the technology giant can be sued over its YouTube subsidiary's algorithmic promotion of terrorist videos on its platform.
According to the plaintiffs in the case – the family of Nohemi Gonzalez, who was killed in the 2015 ISIS attack in Paris – YouTube’s targeted recommendations violated US anti-terrorism law by helping to radicalize viewers and promote the ISIS worldview.
The suit seeks to carve out targeted recommendations from the protections of Section 230, a federal law that has largely shielded websites from lawsuits over user-generated content. If successful, it could expose technology platforms to a raft of new lawsuits and could reshape how social media companies run their services.
“I don’t want my daughter’s life to be washed away like that. I want something to be done,” said Beatriz Gonzalez, Nohemi’s mother, in an interview with CNN. “We are in search of justice. Someone has to be responsible for what happened. Not just for me, but for many other families who have lost loved ones.”
Nitsana Leitner, the attorney for the Gonzalez family, told CNN that Google should be held liable because the company benefited from the terrorist group’s activities by allowing ISIS videos to circulate on the platform.
“If you use the material to your advantage, you have to pay for your wrongdoing,” Leitner said.
Google and other tech companies have argued that exempting targeted recommendations from Section 230’s protections would increase the legal risks associated with classifying, sorting and curating online content, a fundamental feature of the modern internet. Google has argued that in such a scenario, websites would seek to play it safe by either removing far more content than necessary or by giving up on content moderation altogether and allowing more harmful content onto their platforms.
Friend-of-the-court filings by Craigslist, Microsoft, Yelp and others suggested that the stakes are not limited to algorithms and could affect almost anything on the web that could be construed as a recommendation. Even average internet users who volunteer as moderators on various sites could face legal risks, according to a filing by Reddit and several Reddit volunteer moderators.
Oregon Democratic Senator Ron Wyden and former California Republican Representative Chris Cox, the original co-authors of Section 230, argued to the Court that Congress’s intent in passing the law was to give websites broad discretion to moderate content as they saw fit.
The Biden administration also weighed in on the case. In a brief filed in December, it argued that Section 230 protects Google and YouTube from lawsuits “for failure to remove third-party content, including content it recommended.” But, the government’s brief argued, those protections don’t extend to Google’s algorithms because they reflect the company’s own speech, not the speech of others.
On Wednesday, the Court will hear arguments in the second case, Twitter v. Taamneh. It will determine whether social media companies can be sued for aiding and abetting a specific act of international terrorism when the platforms have hosted user content that expresses general support for the group behind the violence without referring to the specific act of terrorism in question.
Rulings for both cases are expected by the end of June.
– CNN’s Jessica Schneider contributed to this report.