Google to SCOTUS: Liability for Promoting Terrorist Videos Will Ruin the Internet


For years, YouTube has been accused of enabling terrorist recruitment. The alleged pattern begins when a user clicks on a terrorist video hosted on the platform and then descends into a rabbit hole of extremist content automatically queued up "next" by YouTube's recommendation engine. In 2016, the family of Nohemi Gonzalez, who was killed in a 2015 Paris terror attack after extremists allegedly relied on YouTube for recruitment, sued YouTube owner Google, forcing courts to examine YouTube's alleged role in aiding and abetting terrorists. Google has been defending YouTube ever since. Then, last year, the Supreme Court agreed to hear the case.

Now the Gonzalez family hopes the high court will agree that Section 230 protections, designed to shield websites from liability for hosting third-party content, should not be extended to also shield platforms when they recommend harmful content.

Google thinks this is exactly how liability protection should work. In a court filing yesterday, Google argued that Section 230 protects YouTube's recommendation engine as a legitimate tool "meant to facilitate the communication and content of others."

"Section 230 covers the sorting of content via algorithms by defining 'interactive computer service' to include 'tools' that 'select,' 'choose,' 'filter,' 'search, subset, organize,' or 'reorganize' content," Google argued. "Congress intended to protect these functions, not simply the hosting of third-party content."

Google claimed that denying Section 230 protections to YouTube's recommendation engine would strip those protections from every website that uses algorithms to sort and display relevant content, from search results to recommendations on online shopping sites. This, Google warned, would trigger "devastating spillover effects" that would turn the Internet "into a disorganized mess and a litigation minefield" - precisely what Section 230 was designed to prevent.

According to Google, a decision against it would turn the Internet into a dystopia where all websites, and even individual users, could be sued for sharing links to content deemed offensive. In a statement, Google general counsel Halimah DeLaine Prado said such liability would lead larger websites to excessively censor content out of an abundance of caution, while websites with fewer resources would likely go the other way and censor nothing.

"A ruling undermining Section 230 would force websites to either remove potentially controversial material or turn a blind eye to objectionable content to avoid learning about it," DeLaine Prado said. "You would be left to choose between overly curated mainstream sites or fringe sites inundated with objectionable content."

The Supreme Court will begin hearing arguments in this case on February 21.

Google asked the court to uphold the judgment of the US Court of Appeals for the 9th Circuit, which found that Section 230 does protect YouTube's recommendation engine. The Gonzalez family seeks a ruling that Section 230 immunity does not extend to YouTube's act of recommending terrorist videos posted by third parties.

Ars was unable to immediately reach either legal team for comment.

Deciding the fate of Section 230

In the court filing, Google argued that YouTube already works to counter recruitment efforts through community guidelines that prohibit content promoting terrorist organizations.

Since 2017, Google has taken steps to remove and block the reach of all non-compliant content...

