New Research Reveals Shocking Statistics About Online Child Sexual Abusers
In a perfect world, social media would simply be entertainment. But unfortunately, this is not a perfect world, and it never will be.
The Alliance to Counter Crime Online published an article that began this way: “Children love social media — and so do sexual predators.”
Unfortunately, it’s a widely known fact that child sexual abuse material (CSAM) is not only prevalent online, but that it’s growing exponentially. In late January, CEOs of major social media companies were confronted by the Senate Judiciary Committee about the increasingly out-of-control presence of online child sexual exploitation — to the extent that Senator Lindsey Graham (R-S.C.) said the Big Tech authorities “have blood on [their] hands.”
There are also many studies that find a strong connection between social media and declining mental health — specifically among younger generations. But the question is: Do we know how bad this sexual exploitation of minors is? Do we know anything about the offenders pouring gasoline on the fire?
In an attempt to give the debate some statistical context, Protect Children, a Finnish nonprofit dedicated to fighting child sexual violence, published a research report that, among other things, offers insight into which technological platforms offenders most often use to exploit children. In this study, CSAM included “images, videos, live-streaming, and any other material that depicts real or simulated sexual violence against a child. Every image and video causes harm to children,” the authors explained.
Notably, the research was conducted on active — yet anonymous — offenders, rather than convicted offenders currently serving sentences. The researchers did this by creating an online survey that would appear when someone searched keywords related to CSAM. The survey was launched in 2020 and received 30,000 responses from unnamed respondents. While that number may seem significant, it pales in comparison to the 500,000 potential online sexual offenders who clicked on the survey but did not engage with the questions.
Of the 30,000 respondents, the report noted that 21 languages were represented, “with English, Spanish, and Russian comprising around 80% of the total.” It added that, “In 2023, there were a staggering 36.2 million global reports documenting suspected instances of child sexual abuse material online. Moreover, a comprehensive global study conducted by Economist Impact on behalf of WeProtect Global Alliance revealed that 54% of respondents, aged 18-20, had encountered online sexual harms during childhood.”
The study highlighted that “70% [of the respondents] were first exposed to CSAM when they were under the age of 18,” “50% were first exposed to CSAM accidentally,” and “40% have sought contact with a child after viewing CSAM.” However, in a separate category that measured those who “predominantly search for CSAM,” it found “45% search for CSAM depicting girls aged 4-13,” “18% search for CSAM depicting boys aged 4-13,” and “50% [claim to] want to stop using CSAM, however only 28% have sought help to stop.”
The report had three key findings:
- “CSAM is easily accessible on the surface web, particularly on pornography sites and social media.”
- “Offenders view and share CSAM on popular social media and encrypted messaging apps.”
- “Perpetrators seek contact with children on social media, encrypted messaging apps, and online games.”
Digging into the locations on the web where most offenders viewed CSAM, the report found that 32% came from porn websites, 29% from social media, 12% from regular websites, 12% from messaging apps, 6% from online commercials, and 5% from online gaming. But when it came to searching, viewing, and sharing CSAM, the survey found that social media alone accounts for 32% of those actions.
It even broke down which platforms were the most abused by offenders: Instagram 29%, Twitter/X 26%, Discord 23%, TikTok 21%, Facebook 20%, YouTube 18%, Reddit 17%, and Snapchat 10%. As for which messaging apps were most abused, the breakdown was: Telegram 46%, WhatsApp 37%, Session 13%, Wickr Me 13%, Signal 13%, Viber 7%, and Wire 6%.
Many of these messaging apps are used, the report explained, because they largely do “not disclose any data to third parties, including governments,” and have “many privacy features that appeal to offenders such as end-to-end encryption, secret chats, self-destruct messages, editing and deleting any message from all devices after sending or receiving it, and private groups and channels.”
This 35-page research report is worth reading in full — especially for the powerful testimonies from victims of online sexual abuse. Ultimately, the findings are sad but helpful, especially for anyone who’s directly responsible for the wellbeing of children. It brings important awareness to how vast this problem is, as well as which platforms parents, educators, and other supervising adults should be mindful of.
This report should also help lawmakers make internet safety precautions more of a political priority. In fact, Protect Children shared this report with the European Parliament in a hearing earlier this month, where EU lawmakers said they want to introduce new legislation focused on this issue.
Weighing in on the significance of these findings, the head of research at Protect Children, Tegan Insoll, said, “Our investigation sheds light on the disturbing realities of how children are increasingly at risk of sexual abuse and exploitation online. The data we have collected is a stark reminder of the urgent need for change, and of the responsibility of tech platforms to prioritize the safety of all users.”
Simon Bailey, expert advisor to Protect Children, agreed, adding, “The tech industry has known for decades that their platforms are being used to facilitate the sexual abuse of children. This research has highlighted the scale of the abuse and shown how their design decisions have created a global epidemic of child sexual abuse.”
Hopefully, as research continues to reveal just how easily predators abuse technology, more will be done about it. In a perfect world, social media would simply be entertainment — not a dangerous breeding ground for child abusers. But unfortunately, this is not a perfect world, and it never will be. However, that doesn’t mean we should stop fighting against what is wrong and standing firm in what is right.
Sarah Holliday is a reporter at The Washington Stand.