Elijah Ugoh

Safer: Using Technology to Remove and Report CSAM on the Internet



The internet has made child sexual abuse material (CSAM) widespread and easily accessible. Sharing this material online has become disturbingly common, a situation that strips children of their innocence and makes the internet unsafe for them.


A few large companies have developed the technology needed to identify and remove CSAM from their platforms, but many smaller businesses that host user-generated content struggle to address the problem. To close that gap, Thorn, a nonprofit focused on using technology to combat child sexual abuse, stepped up with a product called Safer.


Safer is a tool these companies can use to identify, remove, and report CSAM at scale. Because the material spreads through online platforms, this is in part a technology problem, and it calls for a technology solution. In this post, we'll look at exactly how Safer's detection technology helps keep the internet free of CSAM.


What is Safer?


Globally, the spread of child sexual abuse material has reached an alarming rate. For companies that host user-generated content, controlling the uploading and sharing of this material can be especially challenging. This is where a tool like Safer comes in: Thorn built it to help these platforms stop the practice.


In simple terms, Safer is a tool designed to identify, remove, and report CSAM. Thorn has been deploying Safer's detection technology since 2019, and the results so far have been impressive.


How Does Safer Work?


Safer's detection service uses a process known as hashing to identify CSAM files. As pictures and videos are uploaded to a platform, Safer hashes them and flags matches against known child sexual abuse content so it can be removed and reported. In addition, it uses a machine learning classifier to make detection more effective.


Hashing is a technique used for many purposes in computer science. In Safer, according to Thorn, hashing converts images and videos into a set of values that is unique to a particular file.


“A hash is similar to a fingerprint — just as a fingerprint can be used to identify a person even when they aren’t physically present, a hash can be used to identify a file without having to actually look at it.”
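To make the fingerprint idea concrete, here is a minimal sketch in Python (not Safer's actual code, and the file name is just a placeholder) showing how a cryptographic hash such as SHA-256 turns a file into a short, fixed value that identifies it without anyone having to view it:

import hashlib

def file_fingerprint(path: str) -> str:
    # Read the file in chunks and return a SHA-256 hex digest of its bytes.
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()

# Byte-identical copies of a file always produce the same fingerprint,
# so a file can be recognized without ever being opened or viewed.
print(file_fingerprint("upload.jpg"))

Any exact copy of the file yields the same digest, which is what makes shared hash lists useful for matching known files.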


How Do Hashes Help Safer Identify CSAM?


The National Center for Missing and Exploited Children (NCMEC) and Thorn's SaferList maintain databases of hashes, which are essentially fingerprints of files already verified as CSAM. When a file is uploaded to a platform, its fingerprint can be compared against those in the NCMEC and SaferList databases. Hashes of previously unknown CSAM reported to NCMEC by Safer customers are added back into the Safer ecosystem so that other customers can match against them as well.
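As a rough illustration of that matching step (the hash value, file name, and digest format below are invented for the example, not drawn from NCMEC or SaferList), a platform checks each upload's fingerprint against a set of known hashes:

import hashlib

# Hypothetical list of fingerprints of previously verified files (values are invented).
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# At upload time, compare the new file's fingerprint against the shared list.
if fingerprint("upload.jpg") in known_hashes:
    print("Match found: queue the file for removal and reporting.")
else:
    print("No match against known hashes.")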


This system allows Safer to flag an uploaded image as known CSAM in real time, right at the point of upload. Safer uses two types of hashing: cryptographic and perceptual. Cryptographic hashing matches exact copies of a file, while perceptual hashing matches images based on how similar their fingerprints are. That lets Safer recognize altered versions of an image as the same content, making it easier for platforms to detect more CSAM, faster.
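One common way to get a feel for perceptual hashing is a simple "average hash," shown below purely as an illustration of the idea; it is not the specific algorithm Safer uses, and the file names and threshold are placeholders (Pillow is assumed to be installed). Slightly altered copies of the same image produce fingerprints that differ in only a few bits, so a small bit difference signals a likely match:

from PIL import Image  # pip install pillow

def average_hash(path: str, size: int = 8) -> int:
    # Shrink to an 8x8 grayscale grid and record which pixels are brighter than the mean.
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def bit_difference(a: int, b: int) -> int:
    # Hamming distance: the number of bits on which two fingerprints disagree.
    return bin(a ^ b).count("1")

original = average_hash("known_image.jpg")
altered = average_hash("cropped_or_recolored_copy.jpg")

# Visually similar images differ in only a few of the 64 bits.
if bit_difference(original, altered) <= 5:
    print("Likely the same image, even though the file bytes are different.")

A cryptographic hash, by contrast, changes completely after even a one-pixel edit, which is why the two approaches complement each other.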


How Does a Classifier Help?


The limitation of hashing is that it can only match content that has already been identified as CSAM. Classifiers, on the other hand, use machine learning to sort data into categories automatically. For instance, when an email lands in your spam folder, a classifier trained on example data has judged it more likely to be spam than not. Over time, as it sees more examples, the classifier gets better at identifying and sorting emails.
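To ground the spam analogy, here is a minimal classifier sketch using scikit-learn; the toy emails and labels are invented, and Safer's own classifier works on images and videos rather than text, so this only illustrates the general idea of learning categories from labeled examples:

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training data: a handful of labeled emails (1 = spam, 0 = not spam).
emails = [
    "win a free prize now",
    "cheap meds limited offer",
    "meeting agenda for tomorrow",
    "your invoice is attached",
]
labels = [1, 1, 0, 0]

# Turn each email into word counts, then train a simple Naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

# The trained model assigns a category to messages it has never seen before.
print(model.predict(["claim your free prize"]))   # expected: [1] (spam)
print(model.predict(["lunch meeting tomorrow"]))  # expected: [0] (not spam)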


Similarly, Safer uses a trained classifier to help companies find new, unreported CSAM that isn't yet in the NCMEC database. According to Thorn's testing of the most recent version of Safer's classification model, 99% of files classified as child abuse content were in fact CSAM.


Building a Safer Internet for Our Children


Despite the technical nature of Thorn's Safer detection technology, it delivers a much-needed solution. As more companies adopt it and other tools like it, the internet will become safer for children and adults alike. There are many more ways technology can help make the internet safer and fight child abuse and child trafficking; you can check out our post on Using Technology to Combat Trafficking: Tech Solutions for Lasting Impact.


Mission Haven is a nonprofit, charity-funded organization focused on providing a comprehensive and transformational Haven of Healing for victims and survivors of child sexual exploitation and sex trafficking. We give survivors of domestic minor sex trafficking (DMST) the support they need to start over.


With your generous donations and support, we can continue to provide a truly safe haven of hope and healing, equipped with essential resources to lift up victims and survivors of DMST. To give, volunteer, or become a partner, feel free to contact us today.



