Today we publish our report on the temporary EU law that allows voluntary detection of child sexual abuse by Internet companies.
This report demonstrates that detection by Internet companies helps to identify thousands of children every year, helps to secure thousands of convictions every year, helps to take down millions of abuse images and videos every year.
This report confirms that we need an EU law that allows detection of child sexual abuse in online messages.
Internet companies have been detecting abuse for more than ten years. Last year alone, they detected over 5 million images, videos and grooming attempts in the European Union. 70% of these reports originated in messages and chats.
Based on this detection, children suffering abuse can be identified, and therefore rescued. Like the 6-month-old baby girl in the Netherlands raped by her father, rescued because the abuse was detected in Facebook Messenger.
Our report is based on information received from Member States and companies as part of their reporting obligations under the temporary EU law.
This information shows that thousands of children are identified all over Europe in part thanks to online detection.
Last year alone Ireland reported 51 identified children, Bulgaria reported 62, Cyprus 102, Spain 119, the Netherlands 222 and Poland 3014.
This detection helped to obtain more than 2,000 convictions of perpetrators: Sweden reported 55 convictions last year. The Netherlands reported 217 convictions and Finland 240 convictions in 2021. In recent years, Denmark reported 224 convictions and France 820 convictions.
This report also shows that reporting by Member States and providers is not always up to standard. Not all of them submitted clear or complete data. We have made it clear that we need better reporting, and my proposal for a permanent law has a full chapter dedicated to transparency.
Still, overall, what we can see is that detection saves children and stops perpetrators, as demonstrated by recent police actions all over Europe.
Spanish police arrested three people for sharing child sexual abuse, thanks to detection by Kik Messenger. One of the men also used WhatsApp to share images and groom children.
Police in the Netherlands stopped 150 perpetrators in a raid based in part on voluntary reports. Based on voluntary reports, police in the Western Balkans rescued 2 children and stopped 39 perpetrators.
In France, a report from Snapchat sparked an investigation into 700 suspects from 73 countries who shared abuse on Signal, leading to 31 arrests and counting.
This kind of detection and police action will become impossible next August, when the temporary law that allows detection in online messages expires, if the permanent law I have proposed to prevent and fight child sexual abuse is not yet in place.
Reports from online messages will drop dramatically. We know this for a fact, because it has happened before. In 2021, legal uncertainty about detection caused the total number of reports in the EU to drop by half.
We need a permanent law in place, to put prevention first and oblige companies to detect child sexual abuse when prevention fails, and to make sure all companies take their responsibility. Certainly now that Meta is going fully encrypted.
We need a permanent law that sets up a new EU centre, ensures companies can only use verified indicators of child sexual abuse, and ensures that only confirmed reports of child sexual abuse reach law enforcement. None of this is the case today.
And in the meantime, we must avoid a detection gap, a child protection gap. That’s why I have proposed to extend the temporary legislation for as long as necessary to reach an agreement, so that voluntary detection can continue.
Without this law, thousands of children all over Europe will no longer be identified, making it impossible to rescue them – year on year.
Without this law, hundreds of perpetrators can no longer be stopped – year on year.
That 6-month-old baby? In future, her father can continue to rape her and share the abuse without fear of detection.
Until the permanent EU law is in place, we must not scrap the only alternative: an effective tool which has helped to protect children for more than ten years.
Detection will stop, but the rapes will continue. This would be a disaster for children and completely out of touch with what the European public wants. Only 2% consider privacy more important than detection of child sexual abuse; 96% think detection is more important or equally important.
And what about the privacy of the children? This report once again confirms the huge numbers of people sharing huge volumes of child sexual abuse online.
According to the report, half of these reports are actionable: they contain vital clues which help to save children, such as user and location information.
But every single one of these videos and pictures is in itself a crime, violating the bodies and privacy of children and adding trauma upon trauma.
The report shows that in 2022, Meta took action on over 6.6 million items of child sexual abuse, and X (formerly Twitter) closed down 2.3 million user accounts for violating its child sexual exploitation policy.
Without a law in place, it will be impossible to stop the sharing of child sexual abuse in online messages, allowing the gross violation of children’s privacy on an unprecedented scale to continue.
Overall, this report confirms what I have said all along: that detection online saves children, that we need detection of child sexual abuse online to continue, and that we need an EU law with proper safeguards that allows it.
Details
- Publication date: 19 December 2023
- Author: Directorate-General for Communication