I have just concluded two intensive days of discussions in Silicon Valley. The topic: the fight against child sexual abuse (CSA) online.
I met with the ‘big players’, like Apple, Google, Meta and Microsoft, and with medium-sized companies like Snap and Roblox. Eleven companies in all. I also met with the Tech Coalition and Thorn, important players in the fight against CSA. The meetings taught me two things: different companies have different tools and skills, and they face different challenges. But one thing is clear: we all have the same obligation to protect children. They as service providers, I as a politician and regulator.
After discussions here, I am more convinced than ever that EU legislation is urgently needed to detect and report CSA online and to remove child sexual abuse material (CSAM). Detection can literally save a child’s life. The legislation will have to reflect the different technical challenges, but it must not be ‘time-capsuled’ by them. What I mean is that we need to set parameters that anticipate the world of 2030 and beyond, with technology-neutral and future-proof legislation.
My visit also showed me the unprecedented power, and reach, of these companies. The top four have a combined market capitalisation of over 7 trillion dollars! They can move digital mountains if compelled. My job in fighting CSA is to convert technical capacity into legal obligation, even when faced with reluctance or ‘whataboutism’.
The gravity of CSA is widely understood, but the scale is not. The Council of Europe estimates that one in five children has experienced some form of sexual violence. In 2021 there were more than 29 million reports of CSA globally (up from 21 million in 2020), containing over 85 million images and videos (up from 65 million in 2020), and remember, these are only the incidents that were reported. Even more worryingly, studies in Finland and Sweden have shown the week-to-week experience of children: in Sweden, 80% of girls aged 10 to 13 said that they had received unwanted sexual pictures. A recent Europol-assisted operation in Germany exposed an offender ring with 400,000 members, the so-called ‘Boystown’ dark web forum.
I want to share three areas that came up repeatedly with the companies:
- The dramatic increase in online grooming during the pandemic.
- The tools needed to protect children.
- The need to protect the analysts who help fight CSA and have to look at this material every day.
Grooming:
Grooming needs to be addressed with collective effort. The situation is both complicated and urgent. Over the course of the pandemic, incidents of online grooming have skyrocketed. Refined methods to identify grooming attempts as they happen can prevent children from being put at risk.
Predators’ interaction techniques are insidious, so developing tools that can identify context as well as language is key; this is where natural language processing is needed. The companies also spoke of developing sharing mechanisms so that when ‘platform hopping’ is attempted, that is, the attempt to drag a victim from one platform to another, automatic red flags are raised between companies. The stated goal is ‘signal sharing’ that makes this as efficient as possible.
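To make the idea concrete, here is a minimal sketch of what such cross-platform signal sharing could look like. It is purely illustrative: the registry, the function names and the assumption of a common identifier across platforms are all hypothetical, and in practice linking the same actor across services is itself a hard problem.

```python
import hashlib
from collections import defaultdict

# Illustrative sketch of cross-platform 'signal sharing': each platform
# reports grooming signals under a privacy-preserving pseudonym, and a
# shared registry raises a red flag when the same actor surfaces on more
# than one platform, a possible 'platform hopping' pattern.
# All names here are hypothetical; this is not any company's actual system.

shared_registry: dict[str, set[str]] = defaultdict(set)

def pseudonymise(account_id: str) -> str:
    """Hash the identifier so raw account IDs are never exchanged."""
    return hashlib.sha256(account_id.encode()).hexdigest()

def report_signal(platform: str, account_id: str) -> bool:
    """Record a grooming signal; return True if a cross-platform flag fires."""
    key = pseudonymise(account_id)
    shared_registry[key].add(platform)
    return len(shared_registry[key]) > 1  # actor seen on multiple platforms

# The same actor flagged on two services triggers a red flag.
report_signal("chat-app-A", "suspect-account-123")              # False: first sighting
print(report_signal("game-platform-B", "suspect-account-123"))  # True
```

The design choice worth noting is that only pseudonyms and platform names move between companies, not message content, which is one way such sharing could respect privacy constraints.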
Tools needed to protect children:
The companies had interesting and very different perspectives on issues such as age verification. For example, there can sometimes be an attitude of meeting the legal minimum rather than taking the protective actions needed to guarantee safety, such as options that let users simply confirm they are an adult.
The majority of reports of CSA online concern already known images and videos, detected using tools like PhotoDNA. When law enforcement in the US follows up on reports of known material, it finds that 50% of offenders are committing, or have committed, physical abuse.
But the detection of newly produced images and videos, and the detection of grooming, is still too low, even though these are critical to rescuing children from ongoing abuse or preventing the abuse from happening in the first place.
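To illustrate why known material is so much easier to catch: detecting known images essentially reduces to comparing a perceptual hash of uploaded content against a list of hashes of previously verified material. The sketch below is a generic illustration, assuming a 64-bit perceptual hash and Hamming-distance matching; PhotoDNA’s actual algorithm is proprietary, and the hash values shown are made up.

```python
# Generic illustration of known-image detection via perceptual hashing.
# This is NOT PhotoDNA; the algorithm and the hash values are assumptions
# chosen only to show the general shape of hash-list matching.

KNOWN_HASHES = {          # 64-bit hashes of previously verified material
    0x9F3A6C218E47D501,   # (made-up values)
    0x13B7E09A5CD4268F,
}

MAX_DISTANCE = 8  # tolerance for re-encoding, resizing and minor edits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_known_material(image_hash: int) -> bool:
    """Flag an upload whose perceptual hash is near any known hash."""
    return any(
        hamming_distance(image_hash, known) <= MAX_DISTANCE
        for known in KNOWN_HASHES
    )

# A hash within a few bits of a known one is flagged; anything else is not.
print(matches_known_material(0x9F3A6C218E47D503))  # True: 1 bit away
print(matches_known_material(0x0000000000000000))  # False
```

By construction, such matching can only flag material whose hash is already on the list, which is exactly why newly produced images and grooming conversations require different, behaviour- and language-aware tools.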
Protecting employees:
As I mentioned, the analysts who help in the fight against CSA, particularly those who carry out human review before reports are sent, must be protected.
There is a real need to make sure that those who undertake this onerous task are very well protected. If I were to look at such an image or video even once, it would take me years to recover. For those who are trained to do so, the mental toll is heavy, and the support they receive must therefore be first class.
‘The devil is in the detail’ is a phrase I have heard a lot during my discussions with internet companies. But what is also abundantly clear is that the devil is also in the dark.
By that I mean the offenders, the perpetrators of this horrible crime. There is an increased level of sophistication in grooming, in the sharing of CSAM, and also in its production. But we have to find solutions.
One area that was universally welcomed was my commitment to set up an EU Centre to prevent and combat child sexual abuse. We have to shine a harsh light on this crime. On its methods, on its scale and on its heartbreaking effects. The offenders’ modus operandi is to operate in the dark. Companies must be compelled to make that as difficult as possible. EU law will lead the way. Regulation will provide the legal clarity and the obligations that are needed.
It is in the interest of children, it’s in the interest of society and it’s in the interest of the companies themselves.
When it comes to preventing and combatting child sexual abuse, the debates on privacy and on corporate tools have many lobbyists and many interests, but the least well-represented voice is that of the child.
We need European legislation to better protect children from this horrific crime.
Details
- Publication date: 30 January 2022
- Author: Directorate-General for Communication