A new system helps combat images of child sexual abuse

Every day, a team of analysts in the UK faces a seemingly endless mountain of horrors. The 21-member team, working at the Internet Watch Foundation’s office in Cambridgeshire, spends hours trawling through images and videos containing child sexual abuse. Each time they find a photo or piece of footage, it needs to be assessed and labeled. Last year alone, the team identified 153,383 web pages with links to child sexual abuse imagery. This creates a vast database that can then be shared internationally in an attempt to stem the flow of abuse. The problem? Different countries have different ways of categorizing images and videos.

Until now, analysts at the UK-based child protection charity have checked whether the material they find falls into one of three categories: A, B, or C. These groupings are based on the UK’s child sexual abuse laws and sentencing guidelines, and broadly set out the types of abuse involved. Images in category A, the most severe classification, for example, include the worst crimes against children. These classifications are then used to work out how long someone convicted of a crime should be sentenced for. But other countries use different classifications.

Now the IWF believes a change in how it shares its data could remove some of these differences. The group has updated its hashing software, called Intelligrade, to automatically match images and videos to the rules and laws of Australia, Canada, New Zealand, the United States, and the United Kingdom, also known as the Five Eyes countries. The change should mean less duplication of analytical work and make it easier for technology companies to prioritize the most serious images and videos of abuse first.

“We believe we are able to share data so that more people can use it in a meaningful way, instead of all of us just working in our own small silos,” says Chris Hughes, director of the IWF’s reporting hotline. “Right now, when we share data, it’s very difficult to get any meaningful comparison against the data because it just doesn’t fit together properly.”

Countries weight images differently based on what happens in them and the age of the children involved. Some countries classify images according to whether children are prepubescent or pubescent, as well as the crime taking place. The UK’s most serious category, A, includes penetrative sexual activity, bestiality, and sadism. It doesn’t necessarily include masturbation, Hughes says, whereas in the US this falls into a higher category. “At the moment, the US requesting IWF category A images would be missing out on that level of content,” Hughes says.

All the photos and videos the IWF looks at are given a hash, essentially a digital fingerprint, that is shared with technology companies and law enforcement agencies around the world. These hashes are used to detect and block known abuse content from being re-uploaded to the web. The hashing system has had a significant impact on the spread of child sexual abuse material online, but the IWF’s latest tool adds significantly more information to each hash.
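In outline, this kind of matching works like checking uploads against a shared blocklist of fingerprints. The sketch below is a minimal Python illustration of that idea, using an exact cryptographic hash (SHA-256) and an invented hash list; production systems such as Microsoft’s PhotoDNA instead use perceptual hashes, so that resized or re-encoded copies of an image still match.

```python
import hashlib

# Hypothetical hash list shared by a hotline such as the IWF.
# (This value is just the SHA-256 of an arbitrary string, for illustration.)
KNOWN_ABUSE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes) -> bool:
    """Block the upload if its fingerprint appears on the shared list."""
    return file_hash(upload) in KNOWN_ABUSE_HASHES

# Example: an upload whose hash is not on the list passes through.
print(should_block(b"harmless example bytes"))  # False
```

The trade-off is that an exact hash changes completely if even a single byte differs, which is why perceptual hashing matters in real deployments.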

The IWF’s secret weapon is metadata. This is data about data: it can be the what, who, how, and when of what is contained in the images. Metadata is a powerful tool for investigators because it allows them to spot patterns in people’s actions and analyze them for trends. Among the biggest proponents of metadata are spies, who say it can be more revealing than the content of people’s messages.

The IWF has increased the amount of metadata it creates for each image and video it adds to its hash list, Hughes says. Every new image or video it looks at is assessed in more detail than ever before. As well as working out whether sexual abuse content falls into one of the UK’s three groups, its analysts now add up to 20 different pieces of information to their reports. These fields match what is needed to determine an image’s classification in the other Five Eyes countries: the charity’s staff compared each country’s laws and worked out what metadata is needed. “We decided to provide a high level of granularity about age, a high level of granularity in terms of depicting what’s taking place in the image, and also confirming gender,” Hughes says.
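To make the idea concrete, here is a small hypothetical sketch of how per-image metadata fields could be mapped onto more than one country’s categories at once. Every field name, category label, and rule below is invented for illustration; the IWF has not published Intelligrade’s actual schema, and the US “tier” labels are placeholders.

```python
from dataclasses import dataclass, field

@dataclass
class ImageMetadata:
    hash_value: str
    age_band: str                                # e.g. "7-10"; bands are invented
    acts: set[str] = field(default_factory=set)  # e.g. {"penetrative"}
    sex_confirmed: bool = False

def classify_uk(meta: ImageMetadata) -> str:
    """Illustrative mapping onto the UK's A/B/C categories."""
    if meta.acts & {"penetrative", "bestiality", "sadism"}:
        return "A"
    if "non-penetrative" in meta.acts:
        return "B"
    return "C"

def classify_us(meta: ImageMetadata) -> str:
    """Hypothetical US mapping in which masturbation ranks in the top
    tier, reflecting the difference the article describes."""
    if meta.acts & {"penetrative", "bestiality", "sadism", "masturbation"}:
        return "tier-1"
    return "tier-2"

# One richly annotated record can be reported under both regimes at once.
record = ImageMetadata("abc123", age_band="7-10", acts={"masturbation"})
print(classify_uk(record), classify_us(record))  # prints: C tier-1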
