A first step in cleaning the internet of CSAM

CSAM is widely spread, downloaded and (re)distributed via the internet every day. Every time someone views CSAM, that person contributes to the online sexual exploitation of children.

CSAM depicts an actual child who has been sexually exploited. At the time of the offense, victims experience physical pain and psychological effects such as emotional isolation, anxiety and stress. International research shows that the trauma persists long after the physical exploitation has taken place: the knowledge that the material may still be available online can result in a variety of mental illnesses.

The Netherlands has an important role to play, because a large share of CSAM is hosted in the country. This says nothing about the origin of the material; rather, it means the material was put on the internet using Dutch infrastructure. The Dutch Minister of Justice and Security, Ferdinand Grapperhaus, is currently leading many different actions to tackle online child sexual abuse, including the HashCheckServer developed by Web-IQ. Web-IQ has supported this fight against online child sexual abuse from the beginning and feels the urgency and responsibility to keep innovating and developing new techniques that help LEAs and other involved parties handle the enormous task at hand.

1.4 million hashes of known CSAM

18 billion images checked in 2020

7 million CSAM hits in 2020

HashCheckServer

To prevent the uploading of known child sexual exploitation material, hosting parties need to be part of the solution, as they provide one of the gateways through which CSAM is published.

Based on this knowledge, EOKM took the initiative to help hosting parties prevent the uploading of known CSAM and to identify existing material on their servers so that it can be deleted. Following this initiative, EOKM asked Web-IQ to develop software that allows hosting parties to check whether an image appears in the police's database of known CSAM.

The Dutch National Police has made 1.4 million hashes (unique digital fingerprints) of previously detected CSAM available to EOKM in the form of a database. Configured with these lists of known-CSAM hashes, the HashCheckServer enables EOKM to offer the intended service to hosting parties.

The HashCheckServer supports exact matching of images by means of MD5 and SHA-1, cryptographic hashes that make it very efficient to determine whether two files are byte-for-byte identical. The HashCheckServer can thus determine whether an image uploaded to a hosting party is identical to an image on the list of known CSAM.
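To illustrate the exact-match step, here is a minimal Python sketch. The hash values and file name are placeholders standing in for the configured hash lists; the actual HashCheckServer interface is not shown here.

```python
import hashlib

def file_hashes(path: str) -> tuple[str, str]:
    """Compute the MD5 and SHA-1 digests of a file in one pass."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    with open(path, "rb") as f:
        # Read in 64 KiB chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            md5.update(chunk)
            sha1.update(chunk)
    return md5.hexdigest(), sha1.hexdigest()

# Placeholder sets standing in for the configured lists of known hashes.
known_md5 = {"d41d8cd98f00b204e9800998ecf8427e"}
known_sha1 = {"da39a3ee5e6b4b0d3255bfef95601890afd80709"}

md5_digest, sha1_digest = file_hashes("upload.jpg")
if md5_digest in known_md5 or sha1_digest in known_sha1:
    print("match: file is identical to a known image")
```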

However, reality taught us that images are copied, forwarded, edited, adapted, or saved and republished in another format. Such images still look very similar but are no longer identical, so the cryptographic hashes no longer match. For that reason, PhotoDNA has also been added to the hashing techniques supported by the HashCheckServer. This allows very similar images (which people consider to be the same) to match even when they are not exactly identical, so the HashCheckServer can match more images and ensure that much more CSAM is detected and removed.
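PhotoDNA itself is proprietary and its algorithm is not public, so the sketch below uses an open perceptual hash (pHash, via the imagehash library) purely as a stand-in to illustrate the same principle: visually similar images yield hashes that differ in only a few bits, and a small Hamming distance counts as a match. The file names and threshold are illustrative assumptions.

```python
from PIL import Image
import imagehash  # open-source perceptual hashing (pip install ImageHash)

# Illustrative cutoff: treat hashes within 8 differing bits as "the same image".
THRESHOLD = 8

# A re-saved or re-encoded copy still produces a nearly identical pHash.
hash_a = imagehash.phash(Image.open("original.jpg"))
hash_b = imagehash.phash(Image.open("resaved_copy.png"))

distance = hash_a - hash_b  # Hamming distance between the two hashes
if distance <= THRESHOLD:
    print(f"perceptual match (distance {distance})")
```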

Every hosting party can connect to the HashCheckServer free of charge to check their own servers for the presence of images related to CSAM and to remove that content when a match is found.
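What such an integration could look like on the hosting side is sketched below. The endpoint, payload, and response fields are hypothetical placeholders, not the real HashCheckServer API; they only illustrate the check-before-publish flow.

```python
import requests

# Hypothetical client sketch: the URL, payload, and response format
# below are placeholders, NOT the actual HashCheckServer API.
HASHCHECK_URL = "https://hashcheck.example/api/check"

def is_known_csam(md5_digest: str, sha1_digest: str) -> bool:
    """Ask the (hypothetical) service whether a hash appears on a known-CSAM list."""
    resp = requests.post(
        HASHCHECK_URL,
        json={"md5": md5_digest, "sha1": sha1_digest},
        timeout=5,
    )
    resp.raise_for_status()
    return bool(resp.json().get("match"))

# A hosting party would call this during upload handling and
# block or remove the file when it returns True.
```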

More information

If you want to know more about our projects in the field of fighting CSAM or other related work, please feel free to contact us.
