YouTube may have misinformation blind spots, researchers say

The video platform said it limited the spread of misinformation ahead of Election Day, but new research has shown fake news stories continued to spread.

No, China did not work with the Democrats to steal the midterm elections, as some people on YouTube have claimed. Neither did Saudi Arabia.

And there's no evidence that an "overwhelming amount of fraud" rocked Pennsylvania in 2020, or that the electronic voting machines will manipulate the results next week, as a conservative activist claimed in a video.

Ahead of the midterm elections, misinformation watchdogs say they are concerned that what has been described as an aggressive effort by YouTube to fight misinformation on the Google-owned platform has developed blind spots. They are particularly concerned about YouTube's TikTok-style service, which offers very short videos, and about Spanish-language videos on the platform.

But the situation is difficult to understand clearly, more than a dozen researchers said in interviews with The New York Times, because they have limited access to data and because reviewing videos is time-consuming work.

Videos can be more difficult to monitor for misinformation than the text found on Facebook or Twitter, said Jiore Craig, who leads digital integrity work at the Institute for Strategic Dialogue, or I.S.D., a nonprofit organization that fights extremism and disinformation. "This puts YouTube in a situation where they get off easier," she said.

While Facebook and Twitter are closely watched for misinformation, YouTube has often flown under the radar, despite the wide influence of the video platform. It reaches more than two billion people and hosts the second most popular search engine on the web.

YouTube has banned videos claiming widespread fraud in the 2020 presidential election, but it hasn't established a comparable policy for the midterms, a decision that has drawn criticism from some monitoring groups.

"You don't build a sprinkler system after the building is on fire," said Angelo Carusone, president of Media Matters for America, a nonprofit that monitors conservative misinformation.

A YouTube spokeswoman said the company disagreed with some of the criticism of its work to fight misinformation. "We have invested heavily in our policies and systems to ensure that we successfully combat election-related misinformation with a multi-layered approach," spokeswoman Ivy Choi said in a statement.

YouTube said it removed a number of videos flagged by The New York Times for violating its policies on spam and election integrity, and determined that other flagged content did not violate its policies. The company also said that between April and June, it removed 122,000 videos containing misinformation.

"Our community guidelines prohibit misleading voters about how to vote, encouraging interference in the democratic process, and falsely claiming that the 2020 U.S. election was rigged or stolen," Ms. Choi said. "These policies apply worldwide, regardless of language."

YouTube stepped up its stance against political misinformation after the 2020 presidential election, when some YouTube creators live-streamed the January 6, 2021, attack on the Capitol. Within 24 hours, the company began punishing people who were spreading the lie that the 2020 election was stolen, and it revoked President Donald J. Trump's upload privileges.

YouTube commit...
