You might be wondering why this is such a big deal. The more child sexual abuse material (CSAM) that is reported online, the more of it can be identified and removed from the internet, and every removal helps end a cycle of revictimization for the child depicted. Previous studies show that the more CSAM circulates on the internet, the more endemic the problems of child sexual abuse and child sex trafficking become.
Sometime last year, we talked about using technology to detect, report, and remove CSAM from platforms that rely on user-generated content (UGC). We also highlighted specific products leading the charge, such as Thorn's Safer tool, which has seen growing adoption across platforms. Last year, the CyberTipline received 29,309,106 reports of CSAM, a remarkable 35% increase over the 21,669,264 reports filed the previous year.
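At their core, detection tools like Safer generally work by hashing uploaded files and matching those hashes against databases of known, verified CSAM maintained by clearinghouses such as NCMEC. The sketch below illustrates that general idea using exact-match cryptographic hashing; it is a minimal illustration, not Safer's actual implementation, and the hash list shown is a hypothetical placeholder. Production systems also use perceptual hashes to catch re-encoded or slightly altered copies.

```python
# Minimal sketch of hash-based matching: compute a cryptographic hash of an
# uploaded file and compare it against a list of hashes of known, verified
# content supplied by a clearinghouse. This covers only exact-match
# detection; real tools also use perceptual hashing for altered copies.
import hashlib
from pathlib import Path

# Hypothetical hash list; in practice these come from a clearinghouse feed.
KNOWN_HASHES: set[str] = {
    "5d41402abc4b2a76b9719d911017c592",  # placeholder entry
}

def md5_of_file(path: Path) -> str:
    """Return the MD5 hex digest of a file, read in chunks."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """True if the file's hash appears in the known-content hash list."""
    return md5_of_file(path) in KNOWN_HASHES
```

In a real deployment, a check like this would run inside the upload pipeline, and a match would trigger removal and a report to NCMEC rather than just a flag.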
Why Are More CSAM Reports Good News?
Looking at the National Center for Missing & Exploited Children's (NCMEC) recently released annual report, we see that more than 99% of the reports received by the CyberTipline in 2021 related to CSAM. But the report also carries some good news: those reports came from 230 companies across the globe that are now using tools to detect child sexual abuse material, a remarkable 21% increase over the previous year. More companies are joining the fight against the viral online spread of child abuse material, otherwise known as child pornography, which means the internet is a little safer for our kids.
But the work is far from done. The spread of CSAM across the internet is a global phenomenon, so in the coming months we still expect report numbers to climb steeply as more companies deploy solutions that detect and remove CSAM. Only when the vast majority of platforms have fully integrated tools like Safer into their systems can we begin to expect the curve to bend downward.
For instance, if platforms like Twitter, Instagram, and Mega record a yearly increase in CSAM reports to NCMEC, it means they are actively detecting and removing child abuse content from their platforms and reporting it. And the more they do that, the better they become at finding and removing such content.
The sooner proactive removal of child abuse content is universally adopted, the sooner our families are safer and our kids are protected from unimaginable experiences. Toward the end of last year, Apple announced a set of child safety tools intended to protect young users and limit the spread of material depicting child sexual abuse.
Perhaps the most striking development is the complete cessation of reports to NCMEC from the Facebook Messenger platform, which had historically contributed a tremendous share of reports relating to child sexual abuse material in the EU; the stoppage in 2021 was mainly due to legal uncertainty surrounding the EU's ePrivacy Directive. In short, eliminating child sexual abuse material from the internet depends on collaborative efforts from everyone, companies and individuals alike. The result we get in the end is a global online space that is safer for children.
You, Too, Can Be a Part of the Change
The CSAM of a single child victim can circulate for years after the initial abuse. Unless reported, such images and videos can be shared online repeatedly. If you come across abusive content like that, you can effectively end its spread to other users by making a CyberTipline report.
Reports made to the CyberTipline can include images, videos, and other files related to what you want to report. Even if you only suspect that the content is child sexual abuse material, you should still report it; NCMEC analyzes the content along with the filed report using advanced technology.
For organizations like Facebook that host tons of user-generated content, deploying special tools like BAN Human Trafficking, Safer, (Un)Trafficked, and NAPTIP helps in detecting, reporting, and removing child sexual abuse material from the internet. Together, we can create a safer online space for our kids, now and for future generations.