TikTok and Bumble join anti-revenge-porn initiative

TikTok and Bumble are the latest tech companies to join an initiative to reduce the spread of revenge porn: intimate images and videos shared without the subject's consent. They partnered with StopNCII.org (Stop Non-Consensual Intimate Image Abuse), which hosts a tool developed in partnership with Meta. TikTok, Bumble, Facebook and Instagram will detect and block images that match entries in the StopNCII.org hash bank.

The website allows users to create hashes (unique digital fingerprints) of the images and videos in question. This process takes place on their device. To protect user privacy, the actual files are not uploaded to StopNCII.org, only a unique string of letters and numbers.
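The privacy property described above can be illustrated with a short sketch. StopNCII.org actually uses perceptual hashing (which can match slightly altered copies of an image); the example below substitutes Python's standard cryptographic hash purely to show the general idea that only a short fingerprint, never the file itself, leaves the device. The function name and sample bytes are invented for illustration.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fixed-length digest of an image, entirely on-device.

    Note: the real tool uses a perceptual hash rather than SHA-256,
    so that visually similar copies also match. SHA-256 stands in
    here only to illustrate that the image itself is never uploaded.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Only this short hex string would be submitted, never the image.
photo = b"...raw image bytes..."
digest = fingerprint(photo)
print(digest)  # a 64-character string of letters and numbers
```

Because the digest is computed locally, the service only ever sees the "unique string of letters and numbers" the article describes.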

Hashes submitted to StopNCII.org are shared with the initiative's partners. If an image or video uploaded to TikTok, Bumble, Facebook or Instagram matches a submitted hash and "meets Partner Policy requirements", the file will be sent to the platform's moderation team. If the moderators find that the image violates their platform's rules, they will remove it. Other partner platforms will also prevent the image from being shared.
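The matching flow described above can be sketched in a few lines. This is a hypothetical simplification, not StopNCII.org's actual implementation: the hash values, function name, and status strings are all invented, and the point is only that a match routes the file to human review rather than triggering automatic removal.

```python
# Hypothetical sketch of a partner platform's matching step.
# Hash values and names are invented for illustration.
HASH_BANK = {"a3f1c0de", "9bc2beef"}  # fingerprints shared via StopNCII.org

def handle_upload(upload_hash: str) -> str:
    """Route an upload: matches go to human moderators, not auto-removal."""
    if upload_hash in HASH_BANK:
        return "queued_for_review"  # moderators check platform rules
    return "published"

print(handle_upload("a3f1c0de"))  # queued_for_review
print(handle_upload("deadbeef"))  # published
```

Keeping moderators in the loop matters because a hash match alone does not establish that the image violates a given platform's rules.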

The tool has been live for a year, and over 12,000 people have created cases to prevent intimate videos and images from being shared without their consent. Users have created over 40,000 hashes to date. As Bloomberg notes, Meta partnered with SWGfL, the UK nonprofit behind the Revenge Porn Helpline, to develop StopNCII.org. SWGfL hopes many more platforms will sign up.

The initiative builds on a pilot Meta (then known as Facebook) launched in Australia in 2017 that asked users to upload revenge porn images to a Messenger chat with themselves. Meta promised to delete the images after hashing them, but the approach raised obvious privacy concerns.

TikTok and Bumble join the initiative amid growing regulatory scrutiny of the former and a broader crackdown on revenge porn. The UK, for example, plans to require platforms that host user-generated content to remove non-consensual intimate images more quickly, as part of the government's Online Safety Bill.

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you purchase something through one of these links, we may earn an affiliate commission. All prices correct at time of publication.
