Discord bans teen dating servers and AI-generated CSAM sharing

Discord has updated its policies to better protect children and teens on its platform after reports surfaced that predators were using the app to create and distribute child sexual abuse material (CSAM), as well as to groom young teens. The platform now explicitly prohibits AI-generated photorealistic CSAM. As The Washington Post recently reported, the rise of generative AI has led to an explosion of lifelike images with sexual depictions of children. The Post had seen conversations about the use of Midjourney, a text-to-image generative AI on Discord, to create inappropriate images of children.

In addition to banning AI-generated CSAM, Discord now explicitly prohibits any other type of text or media content that sexualizes children. The platform has also banned teen dating servers and pledged to take action against users who engage in this behavior. An earlier investigation by NBC News identified Discord servers advertised as teen dating hubs, with participants soliciting nude images of minors.

Adult users have already been prosecuted for grooming children on Discord, and there are even criminal networks extorting underage users into sending sexual images of themselves. Banning teen dating servers altogether might help alleviate the problem. Discord also added a line to its policy stating that older teens grooming younger teens will be "reviewed and dealt with under [its] inappropriate sexual conduct with children and grooming policy."


In addition to updating its rules, Discord recently launched Family Center, a tool parents can use to keep an eye on their children's chat activity. Although parents cannot see the actual content of their children's messages, the opt-in tool lets them see who their children are friends with and who they talk to on the platform. Discord hopes these new measures and tools, along with existing ones such as proactively scanning images uploaded to its platform using PhotoDNA, will help keep its underage users safe.
