Lawsuits: OnlyFans bribed Instagram to put creators on 'terrorist blacklist' [Updated]

SOPA Images / Contributor

(Update, 5:27 p.m. ET: A spokeswoman for GIFCT explained how the "blacklist" (more precisely, in its terms, its terrorist content database) works to flag terrorist content across different online platforms. She said that only videos and images are currently hashed, and that nothing is automatically removed from other platforms. Instead, once content has been hashed, each platform considers factors such as the type of terrorist entity involved and the severity of the content, then weighs those factors against its own rules to decide whether the content qualifies for removal or warning labels.

The GIFCT spokesperson also noted that Instagram accounts are not hashed, only Instagram images and videos, and that there is no "blacklisting" of users, although GIFCT does analyze who produces the content that the organization hashes. The database records hashes flagging terrorist entities or terrorist content based on the United Nations Security Council's sanctions list of terrorist entities. All of that content stays in the database unless a GIFCT member platform like Meta uses a feedback tool, introduced in 2019, to flag the content as not qualifying as terrorist content. The feedback tool can also be used to suggest revised content labels. Currently, this is the only way to challenge hashed content. GIFCT members also hold active content moderation discussions through GIFCT's "one-stop communication mechanism." The spokesperson said that none of the complaints raised in the lawsuits were mentioned by members in those discussions.

About two years ago, GIFCT became an independent non-profit organization. Since then, it has published annual transparency reports that provide insight into the feedback it receives. The next transparency report is due in December.)

Original story: During the pandemic, OnlyFans took over the world of online adult entertainment, becoming a billion-dollar top dog expected to generate five times more net revenue in 2022 than in 2020. As OnlyFans' business grew, content creators on rival platforms complained that social media sites like Facebook and Instagram blocked their content but apparently didn't block OnlyFans content with the same fervor, creating an unfair advantage. OnlyFans' mounting success while other platforms faltered only seemed to deepen the mystery.

As adult performers outside the OnlyFans content stream searched for answers to their declining incomes, they discovered that Meta had flagged their accounts for banning not only for posting content deemed inappropriate but, apparently, also for suspected terrorist activity. The more they tried to understand why they had been branded as terrorists, the more they suspected that OnlyFans was paying Meta to put that mark on their heads, resulting in account bans that extended beyond Facebook and Instagram to popular social media apps across the Internet.

Now, Meta faces several class action lawsuits alleging that senior Meta executives took bribes from OnlyFans to ban competing adult performers by placing them on a "terrorist blacklist." Meta says the alleged scheme is "highly implausible" and that OnlyFans more likely beat its rivals in the market through smart strategic moves, such as partnering with celebrities. However, lawyers representing three adult performers who are suing Meta say the owner of Facebook and Instagram will likely have to hand over documents to prove it.

Meta and its legal team did not immediately respond to Ars' request for comment, but in its motion to dismiss, Meta argues that even if "a vast and sophisticated scheme involving the manipulation of filtering systems and automated blocking" had been initiated by Meta employees, Meta would not be liable. As a publisher, Meta says it is protected by the First Amendment and the Communications Decency Act to moderate content created by adult entertainers as it sees fit. The tech company also says it would be a...

