How the Internet Watch Foundation's Technology Identifies Child Sexual Abuse Imagery

This is a guest blog authored by Sarah Smith, IWF Technical Officer

“Stay out front or fall behind,” the saying goes, and at the Internet Watch Foundation (IWF), being ahead is what we aim to do. As offenders abuse legitimate internet platforms to help them share images of the horrific crime of child sexual abuse, we work to develop, test and implement refined technology to proactively identify and remove these disturbing images and videos. It’s a massive task, and we never forget that the victims are at the heart of our work.

The IWF Hotline remains perpetually vigilant, not only following up on thousands of reports from members of the public who may have stumbled across online child sexual abuse imagery, but also performing our uniquely proactive role of searching the web for this illegal content. We innovate and work with some of the giants of the internet industry to develop cutting-edge preventative services and identify abusive images online through technology.

In 1996 when the IWF started, the UK hosted a shocking 18% of the internet’s known child sexual abuse imagery. Due to the work of our analysts, and the technology we have in place, that figure has plummeted to less than 1%. That’s a record we’re very proud of. While our partners work with us to make the internet a safer place, the work of staying one technological step ahead of the offenders continues.

In 2017 we created the IWF Image Hash List, a list of digital fingerprints, or ‘hashes’, of known child sexual abuse images. Each hash is generated from imagery already assessed by our analysts, and the list has grown into a constantly updated library of more than 295,000 illegal images. The IWF Image Hash List is a frontline resource for IWF members: services can check content against it proactively, so that known, fingerprinted illegal images are blocked from legitimate platforms. This not only protects their users online, it also stops these images being uploaded or shared in the first place. Hashed images can be taken off the web, and the Hash List offers victims some comfort that there is a higher chance these images will be removed and not duplicated.
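The general idea behind hash-list blocking can be sketched in a few lines. This is a simplified illustration, not the IWF's actual system: the real Hash List is distributed to members under agreement and uses perceptual hashing (such as PhotoDNA) so that visually similar images still match, whereas this sketch uses a plain cryptographic hash of the file's exact bytes.

```python
import hashlib

# Hypothetical store of known-bad fingerprints. In practice a member
# service would load these from the hash list it receives from the IWF.
KNOWN_ABUSE_HASHES: set[str] = set()


def file_hash(data: bytes) -> str:
    """Compute a digital fingerprint of a file's exact bytes."""
    return hashlib.sha256(data).hexdigest()


def should_block_upload(data: bytes) -> bool:
    """Block an upload if its fingerprint appears on the hash list."""
    return file_hash(data) in KNOWN_ABUSE_HASHES
```

Because the check is a set-membership lookup on a short string, it can run at upload time with negligible cost, which is what makes proactive blocking at platform scale feasible.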

This year we went even further when we announced a project, funded by Microsoft and using PhotoDNA technology, in which we developed a tool that enables analysts to create digital fingerprints of videos. This breakthrough in a technically complex area has already helped us to identify and remove more illegal child sexual abuse video material. It is early days, but this new technology is yet another step towards our goal of eliminating all online child sexual abuse imagery.
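One common way to fingerprint video, which the proprietary PhotoDNA-based tool may or may not resemble, is to hash individual frames and flag a video when enough of its frames match a known list. The sketch below is purely illustrative and again uses byte-exact hashes in place of perceptual ones; the function names and the 50% threshold are assumptions for the example.

```python
import hashlib


def frame_hashes(frames: list[bytes]) -> list[str]:
    """Fingerprint each frame. A production system would hash
    perceptual features rather than raw bytes, so re-encoded or
    slightly altered frames still match."""
    return [hashlib.sha256(frame).hexdigest() for frame in frames]


def video_matches(frames: list[bytes],
                  known_frame_hashes: set[str],
                  threshold: float = 0.5) -> bool:
    """Flag a video when the fraction of its frames found on the
    known list meets the threshold."""
    hashes = frame_hashes(frames)
    if not hashes:
        return False
    hits = sum(1 for h in hashes if h in known_frame_hashes)
    return hits / len(hashes) >= threshold
```

Matching on a fraction of frames, rather than requiring the whole file to be identical, is what lets this approach catch clips that have been trimmed or re-edited from known material.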

Today the search continues as the IWF works with leading partners to develop and trial prototypes designed to target child sexual abuse images and videos in new ways, so that more can be identified and taken down. Part of this pioneering work gives our analysts the potential to detect and assess previously unlogged images, expanding the Hotline database. We also share our knowledge with partners, including law enforcement, and work with them to help identify potential new victims.

Our team works closely with members such as Google, Microsoft and Facebook to constantly push technical boundaries. Staying ahead of offenders is the key to our work. For that reason, our developmental successes are often deployed without fanfare, in order to disrupt more criminal online activity, to help law enforcement as effectively as possible and to help rescue more victims from insidious re-abuse.

We are fighting for all child victims of online sexual abuse, so we strive to stay steps ahead by using cutting-edge technology as a weapon.