Discord is making major changes to its child safety policies, banning teen dating servers and AI-generated child sexual abuse imagery. The platform, popular amongst gamers, came under scrutiny last month after an NBC investigation found that child exploitation, extortion, and grooming were rampant on the site.
Now, the platform is specifically banning AI-created content that sexualizes children in any way, including text-based depictions. In a blog post, Discord said its Child Sexual Abuse Material (CSAM) policy has expanded to include “any text or media content that sexualizes children, including drawn, photorealistic, and AI-generated photorealistic child sexual abuse material.”
The company also clarified its stance on grooming and sexual conduct with minors, saying Discord has a “zero-tolerance policy” for predatory behavior, including “online enticement” and “sextortion.”
The platform has also now explicitly banned any servers dedicated to dating amongst teens, and said that users under 18 cannot send or access sexually explicit material. On the dating policy, the company said that “dating online can result in self-endangerment.” Discord’s guidelines had already stated that the company would remove any spaces that “encourage or facilitate dating between teens,” but this has now been clarified further. In addition, older teens who attempt or engage in the grooming of a younger teen will be placed under review and actioned under Discord’s Inappropriate Sexual Conduct with Children and Grooming Policy.
Discord is used by 150 million monthly active users, with 19 million servers active weekly. Children under 13 are not permitted to use Discord.
The platform said that about 15 percent of its employees, across departments, are dedicated to working on safety and improving this experience. Discord first took steps to tighten its policies and protect minors on the site two years ago, but still lacked the parental controls that the likes of Instagram and TikTok offer.
Discord’s newly-introduced Family Center allows parents to view their teen’s activity.
The platform has also just introduced a Family Center, giving parents supervision tools and greater visibility into their child’s account, if the teen lets them. Once a teen user opts in, a parent or guardian can see who they interact with, which servers they’ve joined, and the basic information of people they’ve messaged or called in DMs.
“Family Center provides parents with what they need to help guide their teen’s use of Discord without being too invasive,” said Larry Magid, CEO of ConnectSafely.org, a partner of Discord. “It’s like the physical world where you know who your kids are hanging out with and where they’re going but not listening in on their conversations or micromanaging their relationships.”
If you have experienced grooming, call the free, confidential National Sexual Assault Hotline at 1-800-656-HOPE (4673), or access 24/7 help online by visiting online.rainn.org. To report information about child grooming, contact the National Center for Missing and Exploited Children at 1-800-843-5678. In the UK, you can report concerns about a child to the NSPCC by calling 0808 800 5000 or emailing [email protected].