Concerns Rise as Child Predators Exploit Discord, a Popular Teen App, for Sextortion and Abductions

(Photo: ELLA DON on Unsplash)

Child predators are exploiting Discord, an app popular among teens, for sextortion and abductions, raising concerns about the safety of its young users. Launched in 2015 as a platform for online gamers, Discord has grown into a hub for communities built around everything from cryptocurrency to YouTube gossip to K-pop. The platform now has approximately 150 million users worldwide, but its popularity has a darker side.

Hidden within the platform's chat rooms and communities, adults have used Discord to groom children, trade child sexual abuse material (CSAM), and extort minors by manipulating them into sharing explicit images. According to NBC News, a review of criminal complaints, news articles, and law enforcement communications published since Discord's founding identified 35 cases over the past six years in which adults were prosecuted on kidnapping, grooming, or sexual assault charges that allegedly involved communications on the platform. Twenty-two of those cases occurred during or after the COVID-19 pandemic. Several of the prosecutions have ended in guilty pleas or verdicts, while others are still pending.

Alarming Statistics: Reports of Child Sexual Exploitation on Discord Increase by 474%

These reported cases, however, only scratch the surface: many incidents go unreported, and those that are reported face significant obstacles in investigation and prosecution. According to Stephen Sauer, director of the tipline at the Canadian Centre for Child Protection (C3P), the cases identified so far are just the "tip of the iceberg."

The incidents range from disturbing abductions to threats of violence and encouragement of self-harm. Law enforcement agencies have seen a rise in reports of luring and grooming on Discord, with predators often moving children from platforms like Roblox or Minecraft to Discord for direct, private communication. Discord's decentralized structure, multimedia communication tools, and young user base have made it an attractive venue for people seeking to exploit children.

Reports of CSAM on Discord have surged, with a 474% increase from 2021 to 2022, according to the National Center for Missing and Exploited Children (NCMEC). While Discord has taken steps to address child abuse and CSAM on its platform, including disabling thousands of accounts for child safety violations, challenges remain. Concerns have been raised regarding Discord's responsiveness to complaints, as the average response time increased from three days in 2021 to nearly five days in 2022.

Protecting the Vulnerable: Experts Urge Tech Platforms to Prioritize Child Safety in Design and Enforcement

Discord's moderation largely relies on volunteers within each community, although the platform has implemented measures to proactively detect known CSAM and analyze user behavior. It still struggles, however, to automatically detect newly created CSAM or to identify grooming behavior.

Discord's lack of an age verification system also poses risks, as children under 13 can create accounts. The platform's safety measures rely mainly on community members flagging issues, which may not be enough to address the scale of child exploitation on the service. Although Discord is actively working to improve its child safety measures, watchdog organizations and officials argue that more can be done.

The cases involving Discord serve as a reminder of the ongoing challenges posed by online child exploitation and the need for robust safety measures to protect vulnerable users. As experts emphasize the importance of designing platforms with safety in mind from the outset, the spotlight remains on Discord and other tech platforms to take decisive action to address these issues and prioritize the protection of children online.
