Explanatory Memorandum to COM(2022)209 - Rules to prevent and combat child sexual abuse
| dossier | COM(2022)209 - Rules to prevent and combat child sexual abuse |
| --- | --- |
| source | COM(2022)209 |
| date | 11-05-2022 |
1. CONTEXT OF THE PROPOSAL
• Reasons for and objectives of the proposal
The United Nations Convention on the Rights of the Child (UNCRC) and Article 24(2) of the Charter of Fundamental Rights of the European Union (‘the Charter’) 1 enshrine as rights the protection and care of children’s best interests and well-being. In 2021, the United Nations Committee on the Rights of the Child underlined that these rights must be equally protected in the digital environment 2. The protection of children, both offline and online, is a Union priority.
At least one in five children falls victim to sexual violence during childhood 3. A 2021 global study found that more than one in three respondents had been asked to do something sexually explicit online during their childhood, and over half had experienced a form of child sexual abuse online 4. Children with disabilities face an even higher risk of experiencing sexual violence: up to 68% of girls and 30% of boys with intellectual or developmental disabilities will be sexually abused before their 18th birthday 5. Child sexual abuse material is a product of the physical sexual abuse of children. Its detection and reporting are necessary to prevent its production and dissemination, and a vital means of identifying and assisting its victims. The pandemic has exposed children to a significantly higher degree of unwanted approaches online, including solicitation into child sexual abuse. Although the sexual abuse and sexual exploitation of children and child sexual abuse materials are criminalised across the EU by the Child Sexual Abuse Directive 6, adopted in 2011, it is clear that the EU is currently still failing to protect children from falling victim to child sexual abuse, and that the online dimension represents a particular challenge.
Therefore, on 24 July 2020, the European Commission adopted the EU Strategy for a More Effective Fight Against Child Sexual Abuse 7, which sets out a comprehensive response to the growing threat of child sexual abuse both offline and online, by improving prevention, investigation, and assistance to victims. It includes eight initiatives to put in place a strong legal framework for the protection of children and facilitate a coordinated approach across the many actors involved in protecting and supporting children. These initiatives aim to identify legislative gaps and ensure that EU laws enable an effective response, strengthen law enforcement efforts at national and EU level, enable EU countries to better protect children through prevention, galvanise industry efforts to ensure the protection of children when using the services they offer, and improve protection of children globally through multi-stakeholder cooperation. This dedicated strategy is flanked by other complementary efforts. On 24 March 2021, the European Commission adopted its comprehensive EU Strategy on the Rights of the Child, which proposes reinforced measures to protect children against all forms of violence, including online abuse. In addition, it invites companies to continue their efforts to detect, report and remove illegal online content, including online child sexual abuse, from their platforms and services. The proposed European Declaration on Digital Rights and Principles for the Digital Decade 8 also includes a commitment to protect all children against illegal content, exploitation, manipulation and abuse online, and to prevent the digital space from being used to commit or facilitate crimes 9.
In this context, providers of hosting or interpersonal communication services (‘providers’) play a particularly important role. Their responsible and diligent behaviour is essential for a safe, predictable and trusted online environment and for the exercise of fundamental rights guaranteed in the Charter. The circulation of images and videos of child sexual abuse, which has increased dramatically with the development of the digital world, perpetuates the harm experienced by victims, while offenders have also found new avenues through these services to access and exploit children.
Certain providers already voluntarily use technologies to detect, report and remove online child sexual abuse on their services. Yet the measures taken by providers vary widely, with the vast majority of reports coming from a handful of providers and a significant number taking no action. The quality and relevance of the reports that EU law enforcement authorities receive from providers also vary considerably. Still, organisations such as the US National Center for Missing and Exploited Children (‘NCMEC’), to which US providers are obliged to report under US law when they become aware of child sexual abuse on their services, received over 21 million reports in 2020, of which over 1 million related to EU Member States. The most recent reporting figure, for 2021, shows a further increase, approaching the 30 million mark 10.
Despite the important contribution made by certain providers, voluntary action has thus proven insufficient to address the misuse of online services for the purposes of child sexual abuse. As a consequence, several Member States have started preparing and adopting national rules to fight online child sexual abuse. As the Impact Assessment Report accompanying this proposal demonstrates, this results in the development of divergent national requirements, in turn leading to an increase in the fragmentation of the Digital Single Market for services 11. Against this background, uniform Union rules on the detection, reporting and removal of online child sexual abuse are necessary to complement the Digital Services Act, remove existing barriers to the Digital Single Market and prevent their proliferation. 12 Addressing the risk of fragmentation through this proposal must take account of the need to guarantee children’s fundamental rights to care and to protection of their well-being, mental health and best interests, and to support the general public interest in effectively preventing, investigating and prosecuting the serious crime of child sexual abuse.
To address these challenges and in response to calls by the Council and the European Parliament, this proposal therefore seeks to establish a clear and harmonised legal framework on preventing and combating online child sexual abuse. It seeks to provide legal certainty to providers as to their responsibilities to assess and mitigate risks and, where necessary, to detect, report and remove such abuse on their services in a manner consistent with the fundamental rights laid down in the Charter and as general principles of EU law. In combating child sexual abuse as it manifests itself online, there are important rights and interests at stake on all sides. It is therefore particularly important to establish a fair balance between measures to protect child victims of sexual abuse and their fundamental rights and thus to achieve important objectives of general societal interest, and the fundamental rights of other users and of the providers.
This proposal therefore sets out targeted measures that are proportionate to the risk of misuse of a given service for online child sexual abuse and are subject to robust conditions and safeguards. It also seeks to ensure that providers can meet their responsibilities, by establishing a European Centre to prevent and counter child sexual abuse (‘the EU Centre’) to facilitate and support implementation of this Regulation and thus help remove obstacles to the internal market, especially in connection to the obligations of providers under this Regulation to detect online child sexual abuse, report it and remove child sexual abuse material. In particular, the EU Centre will create, maintain and operate databases of indicators of online child sexual abuse that providers will be required to use to comply with the detection obligations. These databases should therefore be ready before the Regulation enters into application. To ensure that, the Commission has already made funding available to Member States to help with the preparations of these databases. The EU Centre should also carry out certain complementary tasks, such as assisting competent national authorities in the performance of their tasks under this Regulation and providing support to victims in connection to the providers’ obligations. It should also use its central position to facilitate cooperation and the exchange of information and expertise, including for the purposes of evidence-based policy-making and prevention. Prevention is a priority in the Commission’s efforts to fight against child sexual abuse.
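To illustrate, schematically, how a provider might check content against the EU Centre’s databases of indicators, consider the following Python sketch. It is a minimal illustration under assumptions of our own: the Regulation does not prescribe any particular technology, and the hash function, database layout and names below are invented for the example.

```python
import hashlib

# Hypothetical indicator database: digests of material previously
# verified as child sexual abuse material. In practice, indicator
# databases typically rely on perceptual rather than cryptographic
# hashes, so that resized or re-encoded copies still match.
KNOWN_INDICATORS: set[str] = set()  # populated from the EU Centre's database


def matches_known_indicator(content: bytes) -> bool:
    """Return True if an uploaded item is a bit-identical copy of
    material in the indicator set. A cryptographic hash is used here
    purely for simplicity of illustration."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in KNOWN_INDICATORS
```

Under the proposal, a match would feed into the reporting flow described below rather than trigger any automatic punitive action.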
• Consistency with existing policy provisions in the policy area
This proposal delivers on commitments made in the EU Strategy for a More Effective Fight Against Child Sexual Abuse, notably to propose legislation to tackle child sexual abuse online effectively, including by requiring providers to detect known child sexual abuse materials, and to work towards the creation of a European Centre to prevent and counter child sexual abuse. The current EU legal framework in this area consists of Union legislation relating to child sexual abuse, such as the Child Sexual Abuse Directive, and Regulation (EU) 2021/1232 on combating online child sexual abuse 13, which applies until 3 August 2024 (‘the interim Regulation’).
By introducing an obligation for providers to detect, report, block and remove child sexual abuse material from their services, the proposal enables improved detection, investigation and prosecution of offences under the Child Sexual Abuse Directive. The proposed legislation complements the new European Strategy for a Better Internet for Children 14, which aims to create safe digital experiences for children and to promote digital empowerment.
The EU Centre should work closely with Europol. It will receive the reports from providers, check them to avoid reporting obvious false positives and forward them to Europol as well as to national law enforcement authorities. A representative from Europol will be part of the management board of the EU Centre. In turn, a representative from the EU Centre could be part of the management board of Europol, to further ensure effective cooperation and coordination.
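The reporting flow sketched in this paragraph can be pictured roughly as follows. This Python fragment is a simplification under assumptions: the data fields, filtering criteria and function names are invented for the illustration, not taken from the proposal.

```python
from dataclasses import dataclass


@dataclass
class ProviderReport:
    provider: str
    member_state: str        # Member State concerned, where identifiable
    content_reference: str   # reference to the reported material
    matched_indicator: bool  # whether a verified indicator was matched


def is_manifestly_false_positive(report: ProviderReport) -> bool:
    # Placeholder heuristic only: a report that matched no verified
    # indicator and references no material is unusable.
    return not report.matched_indicator and not report.content_reference


def triage(report: ProviderReport) -> str:
    """Mimic the EU Centre's role: filter out obviously unfounded
    reports, forward the rest to Europol and to the national law
    enforcement authority of the Member State concerned."""
    if is_manifestly_false_positive(report):
        return "discarded"
    print(f"-> Europol: {report.content_reference}")
    print(f"-> national authority ({report.member_state}): {report.content_reference}")
    return "forwarded"
```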
The proposed legislation also contributes to the achievement of the objectives set in several international law instruments. Relevant in this respect are the Council of Europe’s Lanzarote Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse 15, ratified by all EU Member States, which establishes minimum requirements regarding substantive criminal law, assistance to victims, and intervention programmes, and the Council of Europe’s Budapest Convention on Cybercrime 16, ratified by almost all EU Member States, which requires parties to establish certain criminal offences relating to child sexual abuse material.
• Consistency with other Union policies
The proposal builds on the General Data Protection Regulation 17 (GDPR). In practice, providers tend to invoke various grounds for processing provided for in the GDPR to carry out the processing of personal data inherent in voluntary detection and reporting of child sexual abuse online. The proposal sets out a system of targeted detection orders and specifies the conditions for detection, providing greater legal certainty for those activities. As regards the mandatory detection activities involving processing of personal data, the proposal, in particular the detection orders issued on the basis thereof, thus establishes the ground for such processing referred to in Article 6(1)(c) GDPR, which provides for the processing of personal data that is necessary for compliance with a legal obligation under Union or Member State law to which the controller is subject.
The proposal covers, inter alia, providers that offer interpersonal electronic communications services and hence are subject to national provisions implementing the ePrivacy Directive 18 and its proposed revision currently under negotiation 19. The measures set out in the proposal restrict in some respects the scope of the rights and obligations under the relevant provisions of that Directive, namely, in relation to activities that are strictly necessary to execute detection orders. In this regard, the proposal involves the application, by analogy, of Article 15(1) of that Directive.
The proposal is also coherent with the e-Commerce Directive and the Proposal for a Digital Services Act (DSA) 20, on which provisional political agreement between the co-legislators has recently been reached 21. In particular, the proposal lays down specific requirements for combating particular forms of illegal activities conducted and illegal content exchanged online, coupled with a set of safeguards. In that manner, it will complement the general framework provided for by the DSA, once adopted. The proposal builds on the horizontal framework of the DSA, relying on it as a baseline where possible and setting out more specific rules where needed for the particular case of combating online child sexual abuse. For example, some providers may be subject to a more general obligation to assess systemic risks related to the use of their services under the DSA, and a complementary obligation to perform a specific assessment of risks of child sexual abuse online in the present proposal. Those providers can build on the more general risk assessment in performing the more specific one, and in turn, specific risks identified for children on their services pursuant to the specific risk assessment under the present proposal can inform more general mitigating measures that also serve to address obligations under the DSA.
The e-Commerce Directive and the DSA prohibit Member States from imposing on providers of intermediary services general obligations to monitor or to actively seek facts or circumstances indicating illegal activity. Whilst the precise contours of that prohibition addressed to Member States are only gradually becoming clear, the proposed Regulation aims to comply with the requirement, underlying that prohibition, of fairly balancing the various conflicting fundamental rights at stake, taking into account the specific context of combating online child sexual abuse and the importance of the public interest involved. It does so, in particular, by targeting the obligations at those providers whose services are at risk of misuse and by setting out a set of clear and carefully balanced rules and safeguards, including a clear definition of the objectives pursued, the type of material and activities concerned, a risk-based approach, the scope and nature of the relevant obligations, rules on redress and relevant supervision and transparency mechanisms. It also includes strong measures to facilitate and support implementation and hence reduce the burden on service providers.
In delivering on its main objectives, the proposal also helps victims. As such, the proposed Regulation is coherent with the Victims’ Rights Directive as a horizontal instrument to improve victims’ access to their rights 22.
2. LEGAL BASIS, SUBSIDIARITY AND PROPORTIONALITY
• Legal basis
The legal basis to support action in this area is Article 114 of the Treaty on the Functioning of the European Union (TFEU). The article provides for the establishment of measures to ensure the functioning of the Internal Market. Article 114 is the appropriate legal basis for a Regulation that seeks to harmonise the requirements imposed on providers of relevant online services in the Digital Single Market. As mentioned above, barriers to the Digital Single Market for Services have started to emerge following the introduction by some Member States of diverging national rules to prevent and combat online child sexual abuse.
The proposed Regulation seeks to eliminate those existing divergences and to prevent the emergence of future obstacles resulting from the further development of such national rules. Given the intrinsically cross-border nature of the provision of online services, a lack of EU action would leave space for a regulatory framework fragmented along national lines, burdening providers with diverging sets of national rules, creating unequal conditions for providers across the EU and opening possible loopholes.
• Subsidiarity
According to the principle of subsidiarity, EU action may only be taken if the envisaged aims cannot be achieved by Member States alone, but can be better achieved at Union level.
The aim of ensuring a level playing field for providers across the Digital Single Market while taking measures to prevent and combat online child sexual abuse cannot be achieved by the Member States alone. As mentioned, Member States have started imposing requirements on providers to tackle online child sexual abuse. Even those Member States that have not yet introduced such requirements are increasingly considering national measures to that effect. However, the providers covered typically operate across borders, often on an EU-wide basis, or may wish to do so. Accordingly, national requirements imposed on such market players to address online child sexual abuse increase fragmentation in the Digital Single Market and entail significant compliance costs for providers, while being insufficiently effective by virtue of the cross-border nature of the services concerned.
Only EU level action can achieve the aim of eliminating barriers to the Digital Single Market for the services concerned, enhancing legal certainty for providers and reducing compliance costs, while at the same time ensuring that the requirements imposed on market players to tackle online child sexual abuse are effective by virtue of their uniform applicability across borders within the entire EU. Therefore, EU action is necessary to achieve the objectives of the proposed Regulation and it presents a significant added value compared to national action.
• Proportionality
This proposal aims at eliminating existing barriers to the provision of relevant services within the Digital Single Market and preventing the emergence of additional barriers, while allowing for an effective fight against online child sexual abuse in full respect of the fundamental rights under EU law of all parties affected. To achieve this objective, the proposal introduces narrowly targeted and uniform obligations of risk assessment and mitigation, complemented where necessary by orders for detection, reporting and removal of child sexual abuse content. These obligations are applicable to relevant providers offering services on the Digital Single Market regardless of where they have their principal establishment.
The proposed rules only apply to providers of certain types of online services which have proven to be vulnerable to misuse for the purpose of dissemination of child sexual abuse material or solicitation of children (known as ‘grooming’), principally by reason of their technical features or the age composition of their typical user base. The scope of the obligations is limited to what is strictly necessary to attain the objectives set out above. The obligations are accompanied by measures to minimise the burden imposed on such providers, as well as the introduction of a series of safeguards to minimise the interference with fundamental rights, most notably the right to privacy of users of the services.
To reduce the number of false positives and prevent erroneous reporting to law enforcement authorities, and to minimise the administrative and financial burden imposed on providers, among other reasons, the proposal creates the EU Centre as an essential facilitator of the implementation of the obligations imposed on providers. Among other tasks, the EU Centre should facilitate providers’ access to reliable detection technologies; make available indicators, created on the basis of online child sexual abuse verified by courts or independent administrative authorities of Member States, for the purpose of detection; provide certain assistance, upon request, in connection to the performance of risk assessments; and provide support in communicating with relevant national authorities.
Finally, the proposed Regulation contains safeguards to ensure that technologies used for the purposes of detection, reporting and removal of online child sexual abuse to comply with a detection order are the least privacy-intrusive and are in accordance with the state of the art in the industry, and that they perform any necessary review on an anonymous basis and only take steps to identify any user in case potential online child sexual abuse is detected. It guarantees the fundamental right to an effective remedy in all phases of the relevant activities, from detection to removal, and it limits the preservation of removed material and related data to what is strictly necessary for certain specified purposes. Thereby, the proposed Regulation limits the interference with the right to personal data protection of users and their right to confidentiality of communications, to what is strictly necessary for the purpose of ensuring the achievement of its objectives, that is, laying down harmonised rules for effectively preventing and combating online child sexual abuse in the internal market.
• Choice of the instrument
Article 114 TFEU gives the Union’s legislator the possibility to adopt Regulations and Directives. As the proposal aims at introducing uniform obligations on providers, which usually offer their services in more than one Member State or may wish to do so, a directive leaving a margin for divergent national transposition of EU rules would not be suitable to achieve the relevant objectives. Divergent national rules transposing the requirements imposed on providers by this instrument would lead to the continuation or reintroduction of those barriers to the Digital Single Market for services that this initiative aims at eliminating.
Unlike a Directive, a Regulation ensures that the same obligations are imposed in a uniform manner across the EU. A Regulation is also directly applicable, provides clarity and greater legal certainty and avoids divergent transposition in the Member States. For these reasons, the appropriate instrument to be used to achieve the objectives of this initiative is a Regulation. In addition, in view of the date of expiry of the interim Regulation, there would in this case be insufficient time to adopt a Directive and then transpose its rules at national level.
3. RESULTS OF EX-POST EVALUATIONS, STAKEHOLDER CONSULTATIONS AND IMPACT ASSESSMENTS
• Stakeholder consultations
The Commission consulted relevant stakeholders over the course of two years to identify problems and ways forward in the fight against child sexual abuse, both online and offline. This was done through surveys, ranging from open public consultations to targeted surveys of law enforcement authorities. Multiple group expert meetings and bilateral meetings were organised between the Commission and relevant stakeholders to discuss the potential impacts of legislation in this area, and the Commission participated in relevant workshops, conferences and events on the rights of the child.
The Commission published an Inception Impact Assessment in December 2020 with the aim of informing citizens and stakeholders about the planned initiative and seeking initial feedback. This feedback showed significant support for the objective of tackling online child sexual abuse. While the holistic approach of the potential Centre and expected improvements regarding legal clarity were welcomed, some industry stakeholders expressed concerns regarding the impact of mandatory detection and reporting of online child sexual abuse.
The Commission conducted an open public consultation in 2021. This process sought to gather views from a broad range of stakeholders, such as public authorities, private citizens, industry and civil society. Despite efforts to ensure a balanced distribution of responses, a significant proportion of contributions were received from private individuals in Germany addressing solely questions relating to encryption. That apart, issues of better cooperation and coordination, and of sufficient resourcing and expertise to meet continually increasing volumes of illegal content, featured prominently in contributions from public authorities, industry and civil society. There was also widespread support across all groups for swift takedown of reported child sexual abuse material, for action to reduce online ‘grooming’ (solicitation of children) and for improvements to prevention efforts and assistance to victims.
Regarding the possible imposition of legal obligations on providers to detect and report various types of online child sexual abuse in their services, the consultation revealed strong support from law enforcement authorities and organisations working in the area of children’s rights, while privacy rights advocates and submissions from private individuals were largely opposed to obligations.
• Collection and use of expertise
Targeted surveys of law enforcement authorities in the Member States revealed that reports made by US providers currently constitute one of the most important sources of reports of child sexual abuse. However, the quality and relevance of such reports vary, and some reports are found not to constitute online child sexual abuse under the applicable national law.
These surveys also identified the elements necessary to ensure that a report is ‘actionable’, i.e., that it is of sufficient quality and relevance that the relevant law enforcement authority can take action. It is for this reason that harmonised reports at EU level, facilitated by the EU Centre, would be the best strategy to maximise the use of expertise to counter online child sexual abuse.
• Impact assessment
Following an initial negative opinion of the Regulatory Scrutiny Board on the Impact Assessment, the Board issued a positive opinion with reservations in February 2022, making various suggestions for improvement. The Impact Assessment report was further revised to take that feedback into account, notably by clarifying the descriptions of the measures taken to ensure compatibility with fundamental rights and with the prohibition of general monitoring obligations, and by providing more detailed descriptions of the policy options. The finalised Impact Assessment report examines and compares several policy alternatives in relation to online child sexual abuse and to the possible creation of an EU Centre to prevent and combat child sexual abuse.
The Impact Assessment shows that voluntary actions alone against online child sexual abuse have proven insufficient, by virtue of their adoption by only a small number of providers, of the considerable challenges encountered in the context of private-public cooperation in this field, as well as of the difficulties faced by Member States in preventing the phenomenon and guaranteeing an adequate level of assistance to victims. This situation has led to the adoption of divergent sets of measures to fight online child sexual abuse in different Member States. In the absence of Union action, legal fragmentation can be expected to develop further as Member States introduce additional measures to address the problem at national level, creating barriers to cross-border service provision on the Digital Single Market.
Given the need to address the situation and with a view to ensuring the good functioning of the Digital Single Market for services while, at the same time, improving the mechanisms for prevention, detection, reporting and removal of online child sexual abuse and ensuring adequate protection and support for victims, EU level action was found to be necessary.
Five main policy options were considered besides the baseline scenario, with increasing levels of effectiveness in addressing the objectives set out in the impact assessment and the overall policy goal of ensuring the good functioning of the Digital Single Market for services while ensuring that online child sexual abuse is detected, reported and removed throughout the Union, thereby indirectly improving prevention, facilitating investigations and guaranteeing adequate assistance to victims.
All options focused on the objective of ensuring detection, removal and reporting of previously-known and new child sexual abuse material and grooming (material scope) by relevant online service providers (personal scope) established in the EU and in third countries - insofar as they offer their services in the Union (geographical scope).
The main differences between the five options relate to the scope of the obligations on providers and the role and form of the EU Centre. Option A would consist of non-legislative, practical measures to enhance prevention, detection and reporting of online child sexual abuse, and assistance to victims. These include practical measures to increase the implementation and efficiency of voluntary measures by providers to detect and report abuse, and the creation of a European Centre on prevention and assistance to victims in the form of a coordination hub managed by the Commission.
Option B would establish an explicit legal basis for voluntary detection of online child sexual abuse, followed by mandatory reporting and removal. Under Option B, the EU Centre would be tasked with facilitating detection, reporting and removal, becoming a fundamental component of the legislation and serving as a key safeguard for service providers as well as a control mechanism to help ensure the effective implementation of the proposal. After examining several options concerning the form that the EU Centre could take, the Impact Assessment concluded that the independence, own resources, visibility, staff and expertise needed to perform the relevant functions would be best secured by setting up the EU Centre as an EU decentralised agency. This conclusion was confirmed and strengthened in relation to Options C to E, which adopt an incremental approach, building on one another.
Options C and D, while building on Option B, would impose legal obligations on providers to detect certain types of online child sexual abuse on their services. Option C would require providers to detect known child sexual abuse material (CSAM), namely copies of material that has previously been reliably verified as constituting CSAM. Option D would require providers to detect not only ‘known’ CSAM (material confirmed to constitute child sexual abuse material), but also ‘new’ CSAM (material that potentially constitutes child sexual abuse material, but not (yet) confirmed as such by an authority).
The retained Option (Option E) builds on Option D, and requires providers to also detect grooming, in addition to known and new CSAM.
The Impact Assessment concluded that Option E is the preferred option for several reasons. Obligations to detect online child sexual abuse are preferable to dependence on voluntary actions by providers (Options A and B), not only because those actions to date have proven insufficient to effectively fight against online child sexual abuse, but also because only uniform requirements imposed at Union level are suitable to achieve the objective of avoiding the fragmentation of the Digital Single Market for services. Hence, Options A and B were discarded.
The level of the impact on the good functioning of the Digital Single Market for services and on the fight against online child sexual abuse increases progressively in line with the increasing obligations that would be imposed under each option. While an obligation to detect known CSAM (Option C) would help to reduce the recirculation of known material, such an obligation would have only a limited impact in terms of the goal of preventing abuse and providing assistance to victims of ongoing abuses, given that the material falling within the scope of such an obligation might have been in circulation for years. An obligation to detect both known and new CSAM (Option D) would allow for the identification and rescue of victims from ongoing abuse and it would do so based on uniform criteria established at EU level, thereby preventing the adoption of divergent national measures on this point. Mandatory detection also of grooming (Option E) would go further, and provide the greatest scope for preventing imminent abuse and guaranteeing a level playing field on the Digital Single Market for services.
Option E was therefore deemed the option which best achieves the policy objective in an effective and proportionate way, with rigorous limits and safeguards introduced to ensure, in particular, the required fair balance of fundamental rights. In addition to the positive social impacts described above, the preferred option is expected to have an economic impact on the affected providers, as a result of costs arising from compliance with their obligations, and on law enforcement authorities and other competent national authorities, as a result of the increased volume of reports of potential online child sexual abuse. These costs are reduced as much as possible through the provision of certain support by the EU Centre.
In turn, the establishment of the Centre is also expected to entail one-off and ongoing costs. Quantitative estimates of the benefits and costs of each of the policy options were assessed in the Impact Assessment for the purposes of comparing them. The preferred option was found to deliver the greatest overall benefits, by virtue of the resulting improvement in the functioning of the Digital Single Market and the reduction of the societal costs linked to online child sexual abuse.
To allow the EU Centre to achieve all of its objectives, it is of key importance that it is established at the same location as its closest partner, Europol. Cooperation between the EU Centre and Europol will benefit from a shared location, ranging from improved possibilities for data exchange to greater opportunities to create a knowledge hub on combating CSAM by attracting specialised staff and/or external experts. Such staff will also have more career opportunities without the need to change location. Co-location would also allow the EU Centre, while being an independent entity, to rely on the support services of Europol (HR, IT including cybersecurity, building, communication). Sharing such support services is more cost-efficient and ensures a more professional service than duplicating them from scratch for a relatively small entity such as the EU Centre.
The impact assessment analysed in detail the relevant impacts, i.e. social, economic and fundamental rights. It also considered the impact on competitiveness and SMEs. The Regulation incorporates some of the measures indicated in the impact assessment in relation to SMEs. These include notably the need for the competent national authorities to take into account the size and financial and technological capabilities of the provider when enforcing the Regulation, including in relation to the risk assessment, detection obligations and penalties, as well as the possibility for SMEs to request free support from the EU Centre to conduct the risk assessment.
The impact assessment also considered consistency with climate law, the ‘do no significant harm’ principle and the ‘digital-by-default’ principle. It further analysed the application of the ‘one in, one out’ principle, whereby each legislative proposal creating new burdens should relieve people and businesses of an equivalent existing burden at EU level in the same policy area, as well as the impacts in relation to the UN Sustainable Development Goals, of which SDG 5.2 (eliminate all forms of violence against women and girls) and SDG 16.2 (end abuse, exploitation, trafficking and all forms of violence against children) are particularly relevant for this Regulation.
• Fundamental rights
According to Article 52(1) of the Charter, any limitation on the exercise of the rights and freedoms recognised by the Charter must be provided for by law and respect the essence of those rights and freedoms. Subject to the principle of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others.
The proposal aims to harmonise the rules that apply to prevent and combat child sexual abuse, which is a particularly serious crime 23. As such, the proposal pursues an objective of general interest within the meaning of Article 52(1) of the Charter 24. In addition, the proposal seeks to protect the rights of others, namely of children. It concerns in particular their fundamental rights to human dignity and to the integrity of the person, the prohibition of inhuman or degrading treatment, as well as the rights of the child 25. The proposal takes into account the fact that in all actions relating to children, whether taken by public authorities or private institutions, the child's best interests must be a primary consideration. Furthermore, the types of child sexual abuse at issue here – notably, the exchange of photos or videos depicting such abuse – can also affect the children’s rights to respect for private and family life and to protection of personal data 26. In connection to combating criminal offences against minors, the Court of Justice of the EU has noted that at least some of the fundamental rights mentioned can give rise to positive obligations of the relevant public authorities, including the EU legislature, requiring them to adopt legal measures to protect the rights in question 27.
At the same time, the measures contained in the proposal affect, in the first place, the exercise of the fundamental rights of the users of the services at issue. Those rights include, in particular, the fundamental rights to respect for privacy (including confidentiality of communications, as part of the broader right to respect for private and family life), to protection of personal data and to freedom of expression and information 28. Whilst of great importance, none of these rights is absolute and they must be considered in relation to their function in society 29. As indicated above, Article 52(1) of the Charter allows limitations to be placed on the exercise of those rights, subject to the conditions set out in that provision.
In addition, the freedom to conduct a business of the providers covered by the proposal comes into play as well 30. Broadly speaking, this fundamental right precludes economic operators from being made subject to excessive burdens. It includes the freedom to choose with whom to do business and the freedom of contract. However, this right is not absolute either; it allows for a broad range of interventions that may limit the exercise of economic activities in the public interest 31. Accordingly, the proposal seeks to achieve the abovementioned objective of general interest and to protect said fundamental rights of children, whilst ensuring proportionality and striking a fair balance between the fundamental rights of all parties involved. To that aim, the proposal contains a range of limits and safeguards, which are differentiated depending on the nature and level of the limit imposed on the exercise of the fundamental rights concerned.
Specifically, obliging detection of online child sexual abuse on both ‘public-facing’ and ‘private’ services, including interpersonal communication services, results in varying levels of intrusiveness in respect of the fundamental rights of users. In the case of material that is accessible to the public, whilst there is an intrusion, the impact especially on the right to privacy is generally smaller given the role of these services as ‘virtual public spaces’ for expression and economic transactions. The impact on the right to privacy in relation to private communications is greater.
Furthermore, the potential or actual removal of users’ material, in particular erroneous removal (on the mistaken assumption that it concerns child sexual abuse material), can potentially have a significant impact on users’ fundamental rights, especially to freedom of expression and information. At the same time, online child sexual abuse material that is not detected and left unremoved can have a significant negative impact on the aforementioned fundamental rights of the children, perpetuating harm for children and for society at large. Other factors to be taken into account in this regard include the nature of the users’ material in question (text, photos, videos), the accuracy of the technology concerned, as well as the ‘absolute’ nature of the prohibition to exchange child sexual abuse material (which is in principle not subject to any exceptions and is not context-sensitive).
As a result of the measures obliging providers to detect and report known and new child sexual abuse material, the proposal would have a significantly positive impact on the fundamental rights of victims whose images are circulating on the internet, in particular on their right to respect for private and family life, right to protection of personal data and the right to the integrity of the person.
These measures would significantly reduce the violation of victims’ rights inherent in the circulation of material depicting their abuse. These obligations, in particular the requirement to detect new child sexual abuse material and ‘grooming’, would result in the identification of new victims and create a possibility for their rescue from ongoing abuse, leading to a significant positive impact on their rights and on society at large. The provision of a clear legal basis for the mandatory detection and reporting of ‘grooming’ would also positively impact these rights. Increased and more effective prevention efforts will also reduce the prevalence of child sexual abuse, supporting the rights of children by preventing them from being victimised. Measures to support victims in removing their images and videos would safeguard their rights to protection of private and family life (privacy) and of personal data.
As mentioned, the imposition of obligations on providers would affect their freedom to conduct a business, which can in principle be justified in view of the objective pursued, having regard also to the role that their services play in connection to the abuse. The impact on providers’ rights nevertheless needs to be limited to the maximum extent possible, to ensure that it does not go beyond what is strictly necessary. This would be ensured, for instance, by providing certain forms of support to providers for the implementation of the obligations imposed, including access to reliable sets of indicators of online child sexual abuse, which in turn enable the use of reliable automated detection technologies, and access to such technologies free of charge, reducing the burden on providers. In addition, providers benefit from being subject to a single set of clear and uniform rules.
The processing of users’ personal data for the purposes of detecting, reporting and removing online child sexual abuse has a significant impact on users’ rights and can be justified only in view of the importance of preventing and combating online child sexual abuse. As a result, the decision of whether to engage in these activities cannot in principle be left to the providers; it pertains rather to the legislator. Nonetheless, any obligations need to be narrowly targeted in both their personal and material scope and coupled with adequate safeguards, so as not to affect the essence of the rights and to remain proportionate. This proposal therefore sets out rules corresponding to these requirements, with limits and safeguards that are differentiated in function of the potential impact on the fundamental rights at stake, generally increasing depending on the type of service concerned and on whether the measures aim to detect the dissemination of known child sexual abuse material, the dissemination of new child sexual abuse material or the solicitation of children (‘grooming’).
As mentioned, detecting ‘grooming’ would have a positive impact on the fundamental rights of potential victims, especially by contributing to the prevention of abuse; if swift action is taken, it may even prevent a child from suffering harm. At the same time, the detection process is generally speaking the most intrusive one for users (compared to the detection of the dissemination of known and new child sexual abuse material), since it requires automatically scanning through texts in interpersonal communications. It is important to bear in mind in this regard that such scanning is often the only possible way to detect grooming and that the technology used does not ‘understand’ the content of the communications, but rather looks for known, pre-identified patterns that indicate potential grooming. Detection technologies have also already acquired a high degree of accuracy 32, although human oversight and review remain necessary, and indicators of ‘grooming’ are becoming ever more reliable with time, as the algorithms learn.
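To make concrete what ‘looking for known, pre-identified patterns’ can mean, here is a deliberately naive Python sketch. Real grooming detection uses trained models over many conversational signals, not keyword lists; the patterns, threshold and names below are assumptions for illustration only, and any flag would still require human review.

```python
import re

# Hand-picked illustrative patterns only; production classifiers are
# trained models, and no real indicator list is reproduced here.
GROOMING_PATTERNS = [
    re.compile(r"\bhow old are you\b", re.IGNORECASE),
    re.compile(r"\bdon'?t tell (your )?(mum|mom|dad|parents)\b", re.IGNORECASE),
    re.compile(r"\bour (little )?secret\b", re.IGNORECASE),
]

FLAG_THRESHOLD = 2  # illustrative: require several co-occurring signals


def grooming_risk_score(messages: list[str]) -> int:
    """Count pattern matches across a conversation. The scanner never
    'understands' the text; it only counts matches, which is why human
    oversight of any flag remains essential."""
    return sum(1 for msg in messages for p in GROOMING_PATTERNS if p.search(msg))


def should_flag_for_review(messages: list[str]) -> bool:
    return grooming_risk_score(messages) >= FLAG_THRESHOLD
```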
Nonetheless, the interferences at stake remain highly sensitive. As a result, while robust limits and safeguards already apply to the detection of known child sexual abuse material, they are more restrictive for new child sexual abuse material and, especially, for the detection of ‘grooming’. These include adjusted criteria for the imposition of detection orders, a more limited period of application of those orders and reinforced reporting requirements during that period. In addition, the proposal sets out strong oversight mechanisms, which include requirements regarding the independence and powers of the national authorities charged with issuing the orders and overseeing their execution, as well as an assisting and advising role for the EU Centre. The EU Centre also contributes by making available not only accurate and reliable indicators, but also suitable technologies, to providers, and by assessing reports of potential online child sexual abuse made by providers. In this manner, it helps minimise the risk of erroneous detection and reporting. In addition, various measures are taken to ensure effective redress for both providers and users.
Whilst different in nature and generally speaking less intrusive, the newly created power to issue removal orders in respect of known child sexual abuse material certainly also affects fundamental rights, most notably those of the users concerned relating to freedom of expression and information. In this respect, too, a set of limits and safeguards is provided for, ranging from setting clear and standardised rules to ensuring redress and from guaranteeing the issuing authorities’ independence to transparency and effective oversight.
All references in the proposed Regulation to fundamental rights are to be understood as referring solely to the fundamental rights recognised under EU law, that is, those enshrined in the Charter and those recognised as general principles of EU law 33.
4. BUDGETARY IMPLICATIONS
The budgetary impact of the proposal will be covered by the allocations foreseen in the Multi-annual Financial Framework (MFF) 2021-27 under the financial envelopes of the Internal Security Fund as detailed in the legislative financial statement accompanying this proposal for a regulation, to the extent that it falls within the current budgetary perspective. These implications also require reprogramming of Heading 7 of the Financial Perspective.
The legislative financial statement accompanying this proposal for a Regulation covers the budgetary impacts for the Regulation itself.
5. OTHER ELEMENTS
• Implementation plans and monitoring, evaluation and reporting arrangements
The programme for monitoring the outputs, results and impacts of the proposed Regulation is set out in its Article 83 and outlined in more detail in the Impact Assessment. The programme sets out various indicators used to monitor the achievement of operational objectives and the implementation of the Regulation.
The Commission will carry out an evaluation and submit a report to the European Parliament and the Council at the latest five years after the entry into force of the Regulation, and every six years thereafter. Based on the findings of the report, in particular on whether the Regulation leaves any gaps which are relevant in practice, and taking into account technological developments, the Commission will assess the need to adapt the scope of the Regulation. If necessary, the Commission will submit proposals to adapt the Regulation.
• Detailed explanation of the specific provisions of the proposal
The proposed Regulation consists of two main building blocks: first, it imposes on providers obligations concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges, and, second, it establishes the EU Centre on Child Sexual Abuse as a decentralised agency to enable the implementation of the new Regulation.
Chapter I sets out general provisions, including the subject matter and scope of the Regulation (Article 1) and the definitions of key terms used in the Regulation (Article 2). The reference to ‘child sexual abuse material’ builds on the relevant terms as defined in the Child Sexual Abuse Directive, namely, child pornography and pornographic performance, and aims to encompass all of the material covered therein insofar as such material can be disseminated through the services in question (in practice, typically in the form of video and pictures). The definition is in line with the one contained in the interim Regulation. The same holds true in respect of the definitions of ‘solicitation of children’ and ‘online child sexual abuse’. For the definition of several other terms, the proposal relies on definitions contained in other acts of EU law or proposals, in particular the European Electronic Communications Code (EECC) 34 and the DSA proposal.
Chapter II establishes uniform obligations, applicable to all providers of hosting or interpersonal communication services offering such services in the EU’s Digital Single Market, to perform an assessment of risks of misuse of their services for the dissemination of known or new child sexual abuse material or for the solicitation of children (together defined as ‘online child sexual abuse’). It also includes targeted obligations for certain providers to detect such abuse, to report it via the EU Centre, to remove or disable access to, or to block online child sexual abuse material when so ordered.
Section 1 creates the aforementioned risk assessment obligations for hosting or interpersonal communication service providers (Article 3). It also requires providers to adopt tailored and proportionate measures to mitigate the risks identified (Article 4) and to report on the outcome of the risk assessment and on the mitigation measures adopted to the Coordinating Authorities designated by the Member States (Article 5). Finally, it imposes targeted obligations on software application stores to assess whether any application that they intermediate is at risk of being used for the purpose of solicitation and, if this is the case and the risk is significant, take reasonable measures to identify child users and prevent them from accessing it (Article 6).
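Schematically, the Article 6 obligation on software application stores reduces to a simple conditional, sketched below in Python under assumptions: how a store assesses the risk or identifies a child user is left open here, and the names are invented for the illustration.

```python
def may_access_app(solicitation_risk_is_significant: bool,
                   user_identified_as_child: bool) -> bool:
    """Article 6, schematically: where an intermediated app carries a
    significant risk of being used for solicitation, child users who
    have been identified as such must be prevented from accessing it."""
    return not (solicitation_risk_is_significant and user_identified_as_child)
```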
Section 2 empowers Coordinating Authorities which have become aware – through a risk assessment or other means – of evidence that a specific hosting or interpersonal communications service is at a significant risk of being misused for the purpose of online child sexual abuse to ask the competent judicial or independent administrative authority to issue an order obliging the provider concerned to detect the type of online child sexual abuse at issue on the relevant service (Articles 7 and 8). It contains a set of complementary measures, such as those ensuring that providers have a right to challenge orders received (Article 9). The section also establishes requirements and safeguards to ensure that detection is carried out effectively and, at the same time, in a balanced and proportionate manner (Article 10). Finally, it attributes to the Commission the power to adopt guidelines on the application of Articles 7 to 10 (Article 11).
Section 3 obliges providers of hosting or interpersonal communication services that have become aware, irrespective of the manner in which they have become aware, of any instance of potential online child sexual abuse on their services provided in the Union to report it immediately to the EU Centre (Article 12) and specifies the requirements that the relevant report has to fulfil (Article 13).
Section 4 empowers Coordinating Authorities to request the competent judicial or independent administrative authority to issue an order obliging a hosting service provider to remove child sexual abuse material on its services or to disable access to it in all Member States, specifying the requirements that the order has to fulfil (Article 14). Where providers detect online child sexual abuse, they are under no obligation under EU law to remove such material. Nonetheless, given the manifestly illegal nature of most online child sexual abuse and the risk of losing the liability exemption contained in the e-Commerce Directive and the DSA proposal, providers will regularly choose to remove it (or to disable access thereto). Where a provider does not remove online child sexual abuse material of its own motion, the Coordinating Authorities can compel removal by issuing an order to that effect. The article also requires providers of hosting services that have received such an order to inform the user who provided the material, subject to exceptions to prevent interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences. Other measures, such as redress, are also regulated (Article 15). The rules contained in this section have been inspired by those contained in the Terrorist Content Online Regulation (Regulation 2021/784).
Section 5 empowers Coordinating Authorities to request the competent judicial or independent administrative authority to issue an order obliging a provider of internet access services to disable access to uniform resource locators indicating specific items of child sexual abuse material that cannot reasonably be removed at source (Articles 16 and 17). Article 18 ensures inter alia that providers that have received such a blocking order have a right to challenge it and that users’ redress is ensured as well, including through requests for re-assessment by the Coordinating Authorities. These Articles, in combination with the provisions on reliable identification of child sexual abuse material (Article 36) and data quality (Article 46), set out conditions and safeguards for such orders, ensuring that they are effective as well as balanced and proportionate.
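As a minimal sketch of what executing such a blocking order could involve on the side of an internet access provider, assuming a plain list of verified uniform resource locators (the normalisation and names below are illustrative assumptions, not a mechanism prescribed by the proposal):

```python
from urllib.parse import urlsplit

# Uniform resource locators from blocking orders, after the reliable
# identification and data-quality checks of Articles 36 and 46.
BLOCKED_URLS = {
    "example.invalid/path/to/item",  # hypothetical entry
}


def normalise(url: str) -> str:
    """Reduce a URL to a comparable host/path form (illustrative)."""
    parts = urlsplit(url if "://" in url else "https://" + url)
    return parts.netloc.lower() + parts.path


def is_blocked(requested_url: str) -> bool:
    return normalise(requested_url) in BLOCKED_URLS
```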
Section 6 lays out an exemption from liability for child sexual abuse offences for providers of relevant information society services carrying out activities to comply with this Regulation (Article 19). This principally aims to avert the risk of providers being held liable under national criminal law for conduct required under this Regulation.
Section 6 also creates specific rights for victims, whose child sexual abuse images and videos may be circulating online long after the physical abuse has ended. Article 20 gives victims of child sexual abuse a right to receive from the EU Centre, via the Coordinating Authority of their place of residence, information on reports of known child sexual abuse material depicting them. Article 21 sets out a right for victims to seek assistance from providers of hosting services concerned or, via the Coordinating Authority of their place of residence, the support of the EU Centre, when they seek to obtain the removal or disabling of access to such material.
This Section also exhaustively lists the purposes for which providers of hosting or interpersonal communication services are to preserve content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, setting out a series of safeguards and guarantees, including a maximum period of preservation of 12 months (Article 22).
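For the preservation limit, a minimal sketch under assumptions (field and function names are invented here; a real system would also drop data as soon as the specific purpose for which it was preserved no longer applies):

```python
from datetime import datetime, timedelta, timezone

MAX_PRESERVATION = timedelta(days=365)  # the 12-month ceiling of Article 22


def purge_expired(records: list[dict]) -> list[dict]:
    """Keep only records whose 'preserved_at' timestamp still falls
    within the maximum preservation window."""
    now = datetime.now(timezone.utc)
    return [r for r in records if now - r["preserved_at"] <= MAX_PRESERVATION]
```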
Finally, it lays down the obligation for providers of relevant information society services to establish a single point of contact to facilitate direct communication with the relevant public authorities (Article 23), as well as the obligation for such providers not established in any Member State, but offering their services in the EU, to designate a legal representative in the EU, so as to facilitate enforcement (Article 24).
Chapter III contains provisions concerning the implementation and enforcement of this Regulation. Section 1 lays down provisions concerning national competent authorities, in particular Coordinating Authorities, which are the primary national authorities designated by the Member States to ensure the consistent application of this Regulation (Article 25). Coordinating Authorities, like the other designated competent authorities, are to be independent in all respects, comparable to a court, and are to perform their tasks impartially, transparently and in a timely manner (Article 26).
Section 2 attributes specific investigatory and enforcement powers to Coordinating Authorities in relation to providers of relevant information society services under the jurisdiction of the Member State that designated them (Articles 27 to 30). These provisions have largely been inspired by those of the DSA proposal. The section also provides for the power to monitor compliance with this Regulation by conducting searches for child sexual abuse material (Article 31) and to submit notices to providers of hosting services flagging the presence of known child sexual abuse material on their services (Article 32).
Section 3 includes further provisions on enforcement and penalties, by establishing that Member States of the main establishment of the provider of relevant information society services (or of its legal representative) have jurisdiction to apply and enforce this Regulation (Article 33). It also ensures that Coordinating Authorities can receive complaints against such providers for alleged breaches of their obligations laid down in this Regulation (Article 34). In addition, Member States are to lay down rules on penalties applicable to breaches of those obligations (Article 35).
Section 4 contains provisions on cooperation among Coordinating Authorities at EU level. It sets out rules on the assessment of material or conversations to confirm that they constitute online child sexual abuse, a task reserved for Coordinating Authorities, other national independent administrative authorities or national courts, and on the submission of the outcomes of such assessments to the EU Centre for the generation of indicators or, in the case of uniform resource locators, for inclusion in the relevant list (Article 36). It also contains rules for cross-border cooperation among Coordinating Authorities (Article 37) and provides for the possibility of joint investigations, where relevant with the support of the EU Centre (Article 38). These provisions have also been inspired by the DSA proposal. Finally, this section provides for general rules on cooperation at EU level and for a reliable and secure information-sharing system to support communication among the relevant parties (Article 39).
Chapter IV concerns the EU Centre. Its provisions have been based on the Common Approach of the European Parliament, the Council and the Commission on decentralised agencies.
Section 1 establishes the EU Centre on Child Sexual Abuse (EUCSA) as a decentralised EU agency (Article 40) and regulates the EU Centre’s legal status and its seat (Articles 41 and 42). To allow the Centre to achieve all of its objectives, it is of key importance that it is established at the same location as its closest partner, Europol. The cooperation between the EU Centre and Europol will benefit from co-location, ranging from improved possibilities for data exchange to greater opportunities to create a knowledge hub on child sexual abuse by attracting specialised staff and/or external experts. Such staff will also have more career opportunities without the need to change location. Co-location would also allow the EU Centre, while remaining an independent entity, to rely on the support services of Europol (human resources, IT including cybersecurity, and communication). Sharing such support services is more cost-efficient and ensures a more professional service than creating them from scratch for a relatively small entity such as the EU Centre.
Section 2 specifies the tasks of the EU Centre under this Regulation. Those include support to Coordinating Authorities, facilitation of the risk assessment, detection, reporting, removal and blocking processes, and facilitating the generation and sharing of knowledge and expertise (Article 43). The EU Centre is mandated to create and maintain databases of indicators of online child sexual abuse (Article 44) and of reports (Article 45) and to grant relevant parties such access to the databases of indicators as required, respecting the conditions and safeguards specified (Article 46). The section also empowers the Commission to adopt delegated acts supplementing this Regulation in relation to those databases (Article 47).
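The Regulation refers to “indicators” in the abstract. One common way such databases are implemented in practice (an assumption here, not a requirement of the text) is as a set of hashes of material already verified as child sexual abuse; real deployments typically rely on perceptual rather than cryptographic hashes so that near-duplicates also match, but an exact-match sketch conveys the principle:

```python
# Minimal sketch of an indicator database as a set of cryptographic hashes.
# This is an assumed implementation for illustration only; the Regulation
# does not specify how indicators are represented.
import hashlib

known_indicators: set[str] = set()  # maintained centrally (cf. Article 44)

def add_indicator(verified_material: bytes) -> None:
    """Register a hash derived from material verified as abusive."""
    known_indicators.add(hashlib.sha256(verified_material).hexdigest())

def matches_known_material(candidate: bytes) -> bool:
    """Exact-duplicate check against the indicator set (cf. Article 46)."""
    return hashlib.sha256(candidate).hexdigest() in known_indicators

add_indicator(b"placeholder for verified material")
assert matches_known_material(b"placeholder for verified material")
assert not matches_known_material(b"unrelated content")
```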
In addition, this section clarifies that the EU Centre is to act as a dedicated reporting channel for the entire EU, receiving reports on potential online child sexual abuse from all providers of hosting or interpersonal communication services issued under this Regulation, assessing them to determine whether they are manifestly unfounded, and forwarding those that are not to Europol and to the competent law enforcement authorities of the Member States (Article 48). Finally, this section establishes that, to facilitate the monitoring of compliance with this Regulation, the EU Centre may under certain circumstances conduct online searches for child sexual abuse material or notify such material to the providers of hosting services concerned, requesting removal or disabling of access for their voluntary consideration (Article 49). The EU Centre is also mandated to make available relevant technologies for the execution of detection orders and to act as an information and expertise hub, collecting information and conducting and supporting research and information-sharing in the area of online child sexual abuse (Article 50).
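In processing terms, the reporting flow of Article 48 amounts to a triage step between reports that are set aside as manifestly unfounded and reports that are forwarded. The following sketch is hypothetical throughout; the Regulation leaves the substance of the assessment to the EU Centre:

```python
# Hypothetical triage sketch for the Article 48 flow. The criterion used
# here is a placeholder; the actual assessment is a substantive one.
from dataclasses import dataclass

@dataclass
class Report:
    report_id: str
    substantiated: bool  # placeholder for whatever makes a report founded

def is_manifestly_unfounded(report: Report) -> bool:
    return not report.substantiated

def triage(reports: list[Report]) -> tuple[list[Report], list[Report]]:
    """Split reports into those to forward (to Europol and national law
    enforcement) and those set aside as manifestly unfounded."""
    forward = [r for r in reports if not is_manifestly_unfounded(r)]
    set_aside = [r for r in reports if is_manifestly_unfounded(r)]
    return forward, set_aside

forward, set_aside = triage([Report("r1", True), Report("r2", False)])
assert [r.report_id for r in forward] == ["r1"]
```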
Section 3 allows the EU Centre to process personal data for the purposes of this Regulation in compliance with the rules on the processing of such data set by this Regulation and by other acts of EU law on this subject-matter (Article 51).
Section 4 establishes channels of cooperation linking the EU Centre to the Coordinating Authorities, through the designation of national contact officers (Article 52); to Europol (Article 53); and to possible partner organisations, such as the INHOPE network of hotlines for reporting child sexual abuse material (Article 54).
Section 5 sets out the administrative and management structure of the EU Centre (Article 55), establishing the composition, structure, tasks, meeting frequency and voting rules of its Management Board (Articles 56 to 60); the composition, appointment procedure, tasks and voting rules of its Executive Board (Articles 61 to 63); as well as the appointment procedure and tasks of its Executive Director (Articles 64 and 65). In light of the technical nature and fast-paced evolution of the technologies used by providers of relevant information society services and to support the EU Centre’s involvement in the monitoring and implementation of this Regulation in this regard, this section establishes a Technology Committee within the EU Centre, composed of technical experts and performing an advisory function (Article 66).
Section 6 provides for the establishment and structure of the budget (Article 67), the financial rules applicable to the EU Centre (Article 68), the rules for the presentation, implementation and control of the EU Centre’s budget (Article 69), as well as presentation of accounts and discharge (Article 70).
Sections 7 and 8 contain closing provisions on the composition and status of the EU Centre’s staff, language arrangements, transparency and communications concerning its activities, measures to combat fraud, contractual and non-contractual liability, the possibility of administrative inquiries, the headquarters agreement and operating conditions, as well as the start of the EU Centre’s activities (Articles 71 to 82).
Chapter V sets out data collection and transparency reporting obligations. It requires the EU Centre, Coordinating Authorities and providers of hosting, interpersonal communication and internet access services to collect aggregated data relating to their activities under this Regulation and make the relevant information available to the EU Centre (Article 83), as well as to report annually on their activities to the general public and the Commission (Article 84).
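Because Article 83 calls for aggregated rather than case-level data, the collection obligation maps naturally onto simple counters per activity type. The categories below are invented for illustration and do not reproduce the data points listed in the Regulation:

```python
# Illustrative aggregation sketch; the activity labels are hypothetical.
from collections import Counter

activity_log = [
    "removal_order_received",
    "removal_order_received",
    "report_submitted",
    "blocking_order_received",
]

def aggregate_for_transparency(entries: list[str]) -> dict[str, int]:
    """Reduce an activity log to the aggregate counts that might feed an
    annual transparency report (cf. Article 84)."""
    return dict(Counter(entries))

print(aggregate_for_transparency(activity_log))
# -> {'removal_order_received': 2, 'report_submitted': 1,
#     'blocking_order_received': 1}
```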
Chapter VI contains the final provisions of this Regulation. Those relate to the periodic evaluation of this Regulation and of the activities of the EU Centre (Article 85); to the adoption of delegated and implementing acts in accordance with Articles 290 and 291 TFEU, respectively (Articles 86 and 87); to the repeal of the interim Regulation (Regulation 2021/1232) (Article 88) and finally to the entry into force and application of this Regulation (Article 89).