News blog · 7 August 2022 · Directorate-General for Communication · 8 min read

Children deserve protection and privacy

The proposed regulation by the European Commission to prevent and combat child sexual abuse is dedicated to protecting the bodies, the future, and the privacy of children.

And it is urgently needed. First, after the Covid pandemic there has been a shocking increase in child sexual abuse, both online and offline.

Second, the clock is ticking, with only two years to go until the legislation for voluntary detection lapses.

On 29 July, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) adopted a Joint Opinion on the Proposal for a Regulation to prevent and combat child sexual abuse.

Below I give a line-by-line response explaining why the Commission proposal is well-reasoned, legally solid and entirely necessary to fight the scourge of child sexual abuse online.

What is surprising and striking, to say the least, is that the EDPB/EDPS opinion makes no reference to the right to privacy of the child.

Sexual abuse of a child is a horrific act. It can destroy people’s lives and their sense of self. When images continue to circulate online for years afterwards, the psychological effects on the person can be catastrophic. This right not to have images circulating, this right to privacy, is entirely absent from the opinion.

Our proposed legislation strikes the right balance between the various fundamental rights concerned, in particular in view of the serious nature of child sexual abuse.

Mitigating measures are a necessary first step in a system that places prevention at its core. The proposal aims to prevent further cases, not just to detect them. The mitigating measures have the fundamental aim of protecting children from further circulation of images and videos showing their abuse and from being groomed into abuse. They also allow a service provider to potentially avoid a detection order, if its infrastructure is deemed robust enough to lower the risk of online child sexual abuse below a significant level.

The legislation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders, provided that the technologies meet the requirements of the Regulation.

On the relationship with the GDPR and ePrivacy Directive, the proposed rules create a balanced and targeted system, limited to what is strictly necessary.

The proposal requires that these providers deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry.

Detection systems are to be used for the sole purpose of detecting and reporting child sexual abuse, and strict safeguards prevent their use for other purposes.

The proposal provides for judicial redress, with both providers and users having a right to challenge any measure affecting them in Court. Users have a right of compensation for any damage that might result from processing under the proposal.

On the relationship with the Interim Regulation, due to expire in two years, our proposal could not be clearer: “voluntary action has thus proven insufficient to address the misuse of online services for the purposes of child sexual abuse”.

The legislation respects Article 52 of the Charter of Fundamental Rights and explicitly clarifies that “the measures contained in the proposal affect, in the first place, the exercise of the fundamental rights of the users of the services at issue. Those rights include, in particular, the fundamental rights to respect for privacy (including confidentiality of communications, as part of the broader right to respect for private and family life), to protection of personal data and to freedom of expression and information. Whilst of great importance, none of these rights is absolute and they must be considered in relation to their function in society. Article 52(1) of the Charter allows limitations to be placed on the exercise of those rights, subject to the conditions set out in that provision”.

The detection order procedure has been clarified in the legislation. In contrast to today’s voluntary detection, it does not entail a general monitoring of text and audio communications of individuals. 

Only where authorities are of the opinion that there is evidence of a significant risk of misuse of a service, and that the reasons for issuing the detection order outweigh the negative consequences for the rights and legitimate interests of all parties affected, would they announce their intent to consider a detection order on child sexual abuse material or grooming, targeted to the specific risks identified by the service provider.

Before any detection order is issued, the provider must be consulted. If the authorities still find there is a risk of child sexual abuse material or grooming, the provider is asked to state how it would implement detection of such material. This is in line with the need to ensure a fair balance between the fundamental rights of providers, of children and of all users.

Where a potential detection would involve high-risk processing, or in every case relating to the detection of grooming, the provider must conduct a data protection impact assessment and consult the data protection authorities.

The detection orders are limited in time, subject to strict procedural safeguards, and target a specific type of offence on a specific service. The intervention of data protection authorities is strengthened, building on the General Data Protection Regulation.

It is clear that legislation can incentivise companies to develop technical solutions. Detection for cybersecurity purposes is already taking place, such as the recognition of links in WhatsApp. 

Developments in this technological field are gathering pace, with a recent paper outlining solutions to detect CSAM within encrypted services (“Thoughts on Child Safety on Commodity Platforms”, by Dr Ian Levy and Crispin Robinson, 21 July 2022).

In July 2022, technical experts from GCHQ and the National Cyber Security Centre published a paper explaining a range of possible ways in which child sexual abuse material could be detected within encrypted services while still protecting user privacy.

This is just one example, and the Commission works closely with industry, civil society organisations, and academia in the context of the EU Internet Forum, to support research that identifies further technical solutions.

In line with the central objective of the proposal to better protect children, the identification of grooming only concerns interpersonal communications where it is known that one of the users is a child.

Proportionality is ensured via the process involving all relevant stakeholders before any order is issued, by the strict conditions and safeguards, and by the involvement of a second independent authority. This does not allow for generalised and indiscriminate scanning of content of virtually all types of electronic communications.

Automated tools look for specific indicators of possible child sexual abuse, meaning they are designed to check whether specific content is likely to be child sexual abuse, but not what that content is about.
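To make this concrete, here is a minimal sketch of how indicator-based detection of known material can work, assuming an exact-hash lookup against a list of previously verified items. Deployed tools rely on perceptual hashes that also match slightly altered copies, and the hash value below is invented purely for illustration:

```python
import hashlib

# Illustrative only: fingerprints of previously verified material, as would be
# provided by a database of known items (the value below is made up).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint matches a known item.

    The check says nothing about what the image depicts; it only answers
    whether the file is identical to previously verified material. Deployed
    systems use perceptual hashes that also catch slightly altered copies;
    an exact SHA-256 lookup is used here only for brevity.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```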

The Centre will refer providers to free detection technology and will conduct the human review of all reports. This will alleviate the burden on providers, especially smaller ones.

The Impact Assessment Report indicates that technologies to detect known material have an estimated false positive rate of no more than 1 in 50 billion (i.e., a 0.000000002% false positive rate).

As for the detection of new CSAM, the accuracy rate lies significantly above 90%. There are technologies for the detection of new CSAM whose precision rate can be set at 99.9% (i.e., a 0.1% false positive rate, with the flagged content then subject to human review).
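For readers who want to check the arithmetic behind those percentages, the short calculation below simply converts the quoted figures; it is purely illustrative and not part of the proposal:

```python
# Illustrative arithmetic behind the quoted rates; not part of the proposal.
known_material_fp_rate = 1 / 50_000_000_000       # 1 in 50 billion
print(f"{known_material_fp_rate:.9%}")            # -> 0.000000002%

new_material_precision = 0.999                    # 99.9% precision
print(f"{1 - new_material_precision:.1%}")        # -> 0.1%: the share of flagged
                                                  #    items human review must weed out
```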

As for the detection of solicitation of children in text-based communications, this is typically based on pattern detection. Automatic detection tools have acquired a high degree of accuracy, and indicators are becoming more reliable over time as the algorithms learn, following human review. Some of the existing technologies for grooming detection (such as Microsoft’s) have an “accuracy rate” of 88% before human review.
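To illustrate what “pattern detection” in text-based communications generally looks like, the sketch below shows the common shape of such a system: a model trained on labelled example conversations assigns each new conversation a risk score, and only cases above a threshold would be flagged for human review. This is a generic, hypothetical illustration, not Microsoft’s tool or any technology named in the proposal:

```python
# Generic, hypothetical illustration of pattern-based text classification;
# not any specific deployed tool.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data: conversations labelled 1 (risk patterns) or 0 (benign).
train_texts = [
    "placeholder conversation containing risk patterns",
    "placeholder ordinary conversation",
]
train_labels = [1, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# Each new conversation receives a probability score; only scores above a
# chosen threshold would be flagged, and every flag goes to human review.
score = model.predict_proba(["another placeholder conversation"])[0][1]
print(f"risk score: {score:.2f}")
```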

Under the proposed rules, high error rates would be detected quickly by the EU Centre, which will ensure that no false positives reach law enforcement. Companies would be notified immediately when their tools produce erroneous notifications and would be obliged to take steps to fix them.

 As detailed in Recital 26 of the Proposed Regulation, the proposed obligations on service providers as regards the detection of CSAM are technologically neutral, meaning they do not prescribe which technology should be used for detection as long as the chosen technology meets the high standard of conditions and safeguards to protect all users. It is an obligation of result, not of means. 

This includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. 

The Coordinating Authority may still decide to request a detection order. It shall attach the implementation plan of the provider and the opinions of the EU Centre and the data protection authority to that request.

In order to fully perform their tasks, Europol and the EU Centre have to provide each other with the fullest possible access to relevant information and information systems. This will be done in accordance with the acts of Union law regulating such access.

The above text sets out how the legislation has already taken account of issues raised by the opinion. I am proud of this proposal. It is proportionate and has checks and balances that are rigorous and fair.

What frustrates me most is that there is a risk of advocates creating opposition based only on abstract concepts. Abstract notions based on misunderstandings.

What I am concerned about are the very real, very shocking effects of child abuse: on the child, on the adults they grow up to be, and on society as a whole.

This legislation is the very best of what the European Union can do.
