Fact-checking the debate

There's a heated debate about new laws proposed by the EU. These laws will require social media and chat apps to detect when they are used to share images of child sexual abuse, remove the images, and report the offenders.

We and many others believe these rules are vital to protecting kids. However, others worry about privacy and censorship.

This is a complicated issue, which has led to the spread of a lot of misinformation. We want everyone to be able to join the debate knowing what is and is not true, so we have published a report which fact-checks the common talking points.


_______________


Just want the headlines?

Read on for a summary...


How effective is detection technology in stopping the spread of child sexual abuse material?

Detection dramatically reduces the spread of child sexual abuse material online, as part of a toolbox of solutions to tackle this complex crisis. In 2021 there was a significant reduction in detection by Internet companies because it was no longer legally required. As a result, the number of reports fell, despite data showing that the volume of abusive material increased.

Do these detection technologies bring risks of general mass surveillance?

No - detection technology doesn’t “read” messages. One approach uses technology to flag content suspected to be child sexual abuse material, which then undergoes a multi-step verification process, including human review, before it is confirmed as such. Another approach compares the digital ‘fingerprint’ of an image against a database of known child sexual abuse material. The proposed EU legislation sets out significant safeguards to ensure transparency in the use of these tools, including the input of national courts and data protection authorities.
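The fingerprint comparison described above can be sketched in a few lines. This is a minimal illustration only: real deployments (e.g. PhotoDNA) use perceptual hashes that tolerate resizing and re-encoding, whereas this sketch uses a plain cryptographic hash, and the database contents shown are placeholder bytes invented for the example.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images.
# In practice this would be maintained by an accredited authority.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital 'fingerprint' of an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """True only if the fingerprint matches an entry in the database.

    Note: the image content itself is never interpreted or 'read';
    only the fingerprint is compared.
    """
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

print(matches_known_material(b"known-flagged-image-bytes"))  # True
print(matches_known_material(b"an-ordinary-holiday-photo"))  # False
```

The key point the sketch shows is that matching is a lookup against known material, not an analysis of the message or image itself.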


Could these technologies wrongly flag consensually shared images or innocent pictures of children in a bathtub?

No - these tools are specifically trained not to flag innocent “kid in the bathtub”-type images. They are trained on material known to be child sexual abuse material, compared against adult pornography and benign images, so they can tell the difference between these categories and ensure that benign images are not misclassified.


Will governments be able to use the technology to prosecute other behaviours?

Under the proposal, the 'EU centre to prevent and combat child sexual abuse' will provide access to accredited, state-of-the-art technology which, by design, can only detect child sexual abuse material.

What’s more, its use will only be permitted on a case-by-case basis, under the review of public authorities and national courts.