Progress: Awaiting Parliament's position in 1st reading
Role | Committee | Rapporteur | Shadows
---|---|---|---
Lead | LIBE | |
Former Responsible Committee | LIBE | ZARZALEJOS Javier (EPP) |
Committee Opinion | BUDG | |
Committee Opinion | IMCO | |
Committee Opinion | CULT | |
Committee Opinion | FEMM | |
Former Committee Opinion | IMCO | AGIUS SALIBA Alex (S&D) | Marcel KOLAJA (Verts/ALE), Jean-Lin LACAPELLE (ID), Catharina RINZEMA (RE)
Former Committee Opinion | BUDG | HERBST Niclas (EPP) | Nils TORVALDS (RE), Silvia MODIG (GUE/NGL)
Former Committee Opinion | CULT | KIZILYÜREK Niyazi (GUE/NGL) | Asim ADEMOV (PPE), Marcel KOLAJA (Verts/ALE), Lucia ĎURIŠ NICHOLSONOVÁ (RE), Andrey SLABAKOV (ECR)
Former Committee Opinion | FEMM | FRITZON Heléne (S&D) | Sandra PEREIRA (GUE/NGL), Pierrette HERZBERGER-FOFANA (Verts/ALE), Karen MELCHIOR (RE), Eleni STAVROU (PPE)
Lead committee dossier:
Legal Basis:
RoP 57_o, TFEU 114
Subjects
Events
The Committee on Civil Liberties, Justice and Home Affairs adopted a report by Javier ZARZALEJOS (EPP, ES) on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse.
The committee responsible recommended that the European Parliament's position adopted at first reading under the ordinary legislative procedure should amend the proposal as follows:
Subject matter and scope
The proposed Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse, in order to contribute to the proper functioning of the internal market and to create a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter are effectively protected. It establishes, inter alia, obligations on providers of online games.
It should not apply to audio communications.
Detection obligations
Concerning detection orders and the detection obligations flowing from them, Members considered that they should cover not only the dissemination of material previously detected and confirmed as constituting child sexual abuse material (known material), but also material not previously detected that is likely to constitute child sexual abuse material but has not yet been confirmed as such (new material), as well as activities constituting the solicitation of children (grooming).
In the adopted text, Members excluded end-to-end encryption from the scope of the detection orders to guarantee that all users’ communications are secure and confidential. Providers would be able to choose which technologies to use as long as they comply with the strong safeguards foreseen in the law, and subject to an independent, public audit of these technologies.
In order to stress that detection orders are a mechanism of last resort, Members proposed reinforcing prevention as part of the mitigation measures to be taken by relevant information society services. Mitigation measures may include targeted measures to protect the rights of the child, including safety and security by design for children by default, functionalities enabling age assurance and age scoring, age-appropriate parental control tools, flagging and/or notification mechanisms, self-reporting functionalities, or participation in codes of conduct for protecting children.
Detection orders should contain information about the right to appeal to a court of law according to the national legislation.
Reporting obligations
Providers of hosting services and providers of number-independent interpersonal communication services should establish and operate an easy-to-access, age-appropriate, child-friendly and user-friendly mechanism that allows any user or entity to flag or notify them of the presence on their service of specific items of information that the user or entity considers to be potential online child sexual abuse, including self-generated material.
EU centre for child protection
Under the amended text, the European Union Agency to prevent and combat child sexual abuse, the EU Centre for child protection, is established. It should gather and share anonymised information, gender- and age-disaggregated statistics, expertise, educational materials and best practices, and facilitate cooperation between relevant public and private parties in connection with the prevention and combating of child sexual abuse, in particular online. It should promote and ensure appropriate support and assistance to victims.
Victims’ Rights and Survivors Consultative Forum
Members proposed to create a Victims’ Rights and Survivors Consultative Forum to make sure that victims’ voices are heard.
Establishment of an online European Child Protection Platform
Members proposed that the EU Centre should create, maintain and operate an online platform presenting information about Member States’ hotlines and helplines (the 'Child Protection Platform'). That platform may also be used for the promotion of awareness-raising and prevention campaigns. The platform should be accessible 24 hours a day, seven days a week, in all Union languages, and should be child-friendly, age-appropriate and accessible.
Seat
The choice of the location of the seat of the EU Centre should be made in accordance with the ordinary legislative procedure, based on specific criteria. The Commission had initially proposed the Netherlands.
Review
Within three years from the entry into force of the Regulation, the Commission should submit a report to the European Parliament and to the Council on the necessity and feasibility of including the solicitation of children in the scope of the detection orders, taking into account in particular the reliability and accuracy of the state of the art of detection technologies. Where appropriate, the report should be accompanied by legislative proposals.
PURPOSE: to set out a clear and harmonised legal framework on preventing and combating child sexual abuse.
PROPOSED ACT: Regulation of the European Parliament and of the Council.
ROLE OF THE EUROPEAN PARLIAMENT: the European Parliament decides in accordance with the ordinary legislative procedure and on an equal footing with the Council.
BACKGROUND: information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union, and to protect society at large.
In the absence of harmonised rules at EU level, social media platforms, gaming services, other hosting and online service providers face divergent rules. Certain providers voluntarily use technology to detect, report and remove child sexual abuse material on their services. Measures taken, however, vary widely and voluntary action has proven insufficient to address the issue.
The protection of children, both offline and online, is a Union priority.
CONTENT: in order to address the abovementioned challenges, the Commission proposed to establish a clear and harmonised legal framework on preventing and combating online child sexual abuse. It seeks to provide legal certainty to providers as to their responsibilities to assess and mitigate risks and, where necessary, to detect, report and remove such abuse on their services in a manner consistent with the fundamental rights laid down in the Charter and as general principles of EU law.
This proposal therefore lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the internal market. It establishes, in particular:
An EU Centre
The proposal seeks to establish the EU Centre on Child Sexual Abuse (EUCSA) as a decentralised agency to enable the implementation of the new Regulation. It aims to help remove obstacles to the internal market, especially in connection to the obligations of providers under this Regulation to detect online child sexual abuse, report it and remove child sexual abuse material. The Centre will create, maintain and operate databases of indicators of online child sexual abuse that providers will be required to use to comply with the detection obligations. These databases should therefore be ready before the Regulation enters into application. To ensure that, the Commission has already made funding available to Member States to help with the preparations of these databases.
Mandatory risk assessment and risk mitigation measures
Providers of hosting or interpersonal communication services will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.
Targeted detection obligations, based on a detection order
Member States will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new child sexual abuse material or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.
Strong safeguards on detection
Companies having received a detection order will only be able to detect content using indicators of child sexual abuse verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.
Clear reporting obligations
The proposal obliges providers that have detected online child sexual abuse to report it to the EU Centre.
Effective removal
National authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.
Reducing exposure to grooming
The rules require software application stores to ensure that children cannot download applications that may expose them to a high risk of solicitation of children.
Solid oversight mechanisms and judicial redress
Detection orders will be issued by courts or independent national authorities. To minimise the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in court.
Documents
- Committee report tabled for plenary, 1st reading: A9-0364/2023
- Contribution: COM(2022)0209
- Document attached to the procedure: SEC(2022)0209 (EUR-Lex)
- Document attached to the procedure: SWD(2022)0209 (EUR-Lex)
- Document attached to the procedure: SWD(2022)0210 (EUR-Lex)
- Legislative proposal published: COM(2022)0209 (EUR-Lex)
Amendments | Dossier
---|---
2798 | 2022/0155(COD)

2022/11/30 – CULT – 124 amendments...
Amendment 100 #
Proposal for a regulation Article 6 a (new)

Article 6 a – Anonymous public reporting of online child sexual abuse

1. Member States shall take appropriate measures to promote and safeguard the role of formally recognized non-governmental organizations involved in anonymous public reporting of child sexual abuse material and the proactive search for such material.
2. Member States shall ensure that the public always has the possibility to anonymously report child sexual abuse material and child sexual exploitation activities to hotlines specialised in combatting online child sexual abuse material and shall safeguard the role of such hotlines in anonymous public reporting.
3. Member States shall ensure that the hotlines referred to in paragraph 2 operating in their territory are authorised to view, assess and process anonymous reports of child sexual abuse material.
4. Member States shall grant the hotlines referred to in paragraph 2 the authority to issue content removal notices for confirmed instances of child sexual abuse material.
5. Member States shall authorise the hotlines referred to in paragraph 2 to voluntarily conduct pro-active searching for child sexual abuse material online.
Amendment 101 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation or by the report submitted by the recognised hotline, which results in its voluntary and timely removal, of any information indicating potential online child sexual abuse on its services, it shall promptly submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
Amendment 102 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services, hotlines and organisations acting solely in the public interest against child sexual abuse shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with those requirements.
Amendment 103 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with those requirements, with the exception of subsequent non-cooperation with the judicial authorities.
Amendment 104 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where they reside, information regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12.
Amendment 105 #
Proposal for a regulation Article 21 – title Victims’ right of assistance and support
Amendment 106 #
Proposal for a regulation Article 21 – paragraph 1 1. The providers of very large online platforms that have identified the risk of use of their service for the purpose of online child sexual abuse in line with Article 3 shall provide reasonable assistance, on request, to persons residing in the Union that seek to report potential abuse, by putting in place reporting functions in a prominent way on their platform. Such providers shall ensure adequate follow-up, when a report or alert is made, in the language that the user has chosen for their service. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 107 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider complemented in a timely matter and, if requested and appropriate, also included in the list of indicators used to prevent the further dissemination of these items.
Amendment 108 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide reasonable assistance, on request, in a timely manner, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 109 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 2 In this regard, a special green line with a call centre assistance service will be established, in order for victims and their families to receive support in a timely manner. That Coordinating Authority shall transmit the request to the EU Centre through the system established in accordance with Article 39(2) and shall communicate the results received from the EU Centre to the person making the request.
Amendment 110 #
Proposal for a regulation Article 21 – paragraph 4 a (new) 4 a. Member States shall establish and improve the functioning of child helpline and missing children hotline, including through funding and capacity building, in line with Article 96 of Directive (EU) 2018/1972.
Amendment 111 #
4 b. Member States shall ensure that law enforcement authorities have adequate technical, financial and human resources to carry out their tasks, including for the purpose of identification of victims.
Amendment 112 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 2 The Coordinating Authority shall be responsible for all matters related to application and enforcement of this Regulation in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities. The Coordinating Authority shall also be responsible for the coordination and adaptation of prevention techniques, elaborated by the EU Centre. The Coordinating Authority shall generate recommendations and good practices on improving digital literacy and skills amongst the population through the realization of awareness campaigns on a national level, targeting in particular parents and children on the detection and prevention of child sexual abuse online.
Amendment 113 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 2 The Coordinating Authority shall be responsible for all matters related to the application and enforcement of this Regulation, and to the achievement of the objective of this regulation in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities.
Amendment 114 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 3 The Coordinating Authority shall in any event be responsible for ensuring coordination at national level in respect of those matters, including matters related to prevention and for contributing to the effective, efficient and consistent application and enforcement of this Regulation throughout the Union.
Amendment 115 #
Proposal for a regulation Article 25 – paragraph 3 3.
Amendment 116 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available
Amendment 117 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to coordinate prevention within the Member State and to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available and communicate it to the EU Centre. They shall keep that information updated.
Amendment 118 #
Proposal for a regulation Article 25 – paragraph 7 – point d a (new) (d a) provide knowledge and experience on appropriate prevention techniques on grooming and the detection and dissemination of CSAM online;
Amendment 119 #
Proposal for a regulation Article 25 a (new) Article 25 a Cooperation with partner organisations Where necessary for the performance of its tasks under this Regulation, including the achievement of the objective of this Regulation, and in order to promote the generation and sharing of knowledge in line with Article 43 (6), the Coordinating Authority shall cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations and semi-public organisations and practitioners.
Amendment 120 #
Proposal for a regulation Article 26 – paragraph 2 – point c (c) are free from any undue external influence, whether direct or indirect; it being understood that the membership of the Coordinating Authority in a recognised international network shall not prejudice its independent character;
Amendment 121 #
Proposal for a regulation Article 26 – paragraph 4 4. The Coordinating Authorities shall ensure that
Amendment 122 #
Proposal for a regulation Article 34 – paragraph 2 2. Coordinating Authorities shall also provide child
Amendment 123 #
Proposal for a regulation Article 39 – paragraph 1 1. Coordinating Authorities shall cooperate with each other, with national hotlines and any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre and other relevant Union agencies, including Europol, to facilitate the performance of their respective tasks under this Regulation and ensure its effective, efficient and consistent application and enforcement. Coordinating Authorities shall exchange information and best practices on preventing and combatting grooming and child sexual abuse online.
Amendment 124 #
Proposal for a regulation Article 40 – paragraph 2 2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse and gather and share information, educational materials and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online.
Amendment 125 #
Proposal for a regulation Article 40 – paragraph 2 2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse and gather and share information, good practices and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online.
Amendment 126 #
Proposal for a regulation Article 40 – paragraph 2 a (new) 2 a. The EU Centre shall elaborate appropriate prevention techniques on grooming and child sexual abuse online, based on its knowledge, expertise and achievements, in close cooperation with relevant stakeholders and in line with the Communication of the Commission of 11 May “A Digital Decade for children and youth: the new European strategy for a better internet for kids" (BIK+).
Amendment 127 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – introductory part (6) facilitate the generation and sharing of knowledge with other Union institutions, bodies, offices and agencies, organisations acting in the public interest against child sexual abuse and hotlines, Coordinating Authorities or other relevant authorities of the Member States to contribute to the achievement of the objective of this Regulation, by:
Amendment 128 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a (a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51, including in view of updating guidelines on prevention and mitigation methods for combatting child sexual abuse, especially for the digital dimension as per new technological developments;
Amendment 129 #
(a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51 , including education and awareness raising programmes, and intervention programmes;
Amendment 130 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a a (new) (a a) gathering information about awareness and prevention campaigns carried out in the different Member States, as well as good practices carried out by public and private bodies, stakeholders and education systems and centres;
Amendment 131 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of research and expertise on
Amendment 132 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of research, educational materials and expertise on those matters and on assistance to victims, including by serving as a hub of expertise to support evidence-based policy;
Amendment 133 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (b a) contribute to the implementation of awareness campaigns as per the potential risks posed by the online environment to children, in order to equip them with adequate skills for detecting potential grooming and deceit, to ensure safe use of the internet by children and to better implement the prevention component of online child sexual abuse;
Amendment 134 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (b a) promoting age-differentiated awareness-raising campaigns in schools and information campaigns for parents, teachers and pupils;
Amendment 135 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new) (b b) assisting with expertise and knowledge in the development and implementation of teacher training across the Union, in order to equip teachers with the necessary skills for guiding children on safely using information society services and detecting potentially malicious behaviour online;
Amendment 136 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new)
Amendment 137 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 a (new) (6 a) supporting and promoting the regular exchange of best practices and lessons learned among Member States on raising awareness for the prevention of child sexual abuse, prevention programmes and non-formal and formal education on the risks of sexual abuse in the digital environment;
Amendment 138 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 b (new) (6 b) provide assistance with training on prevention of child sexual abuse online for officials from Member States;
Amendment 139 #
8 a. Where the EU Centre receives a report from a hotline, or where a provider that submitted the report to the EU Centre has indicated that the report is based on the information received from a hotline, the EU Centre shall refrain from forwarding the report to the competent law enforcement authority or authorities to avoid duplicated reporting on the same material that has already been reported to the national law enforcement by the hotlines, and shall monitor the removal of the child sexual abuse material or cooperate with the relevant hotline to track the status.
Amendment 140 #
Proposal for a regulation Article 50 – paragraph 3 3. Where necessary for the performance of its tasks under this Regulation, the EU Centre shall carry out, participate in or encourage research, surveys and studies, either on its own initiative or, where appropriate and compatible with its priorities and its annual work programme, at the request of the European Parliament, the Council or the Commission. The collected knowledge (resulting from research, surveys and studies) shall serve as a tool to elaborate prevention techniques on child sexual abuse online to be adapted and implemented by Coordinating Authorities in each Member State.
Amendment 141 #
Proposal for a regulation Article 50 – paragraph 3 3.
Amendment 142 #
Proposal for a regulation Article 50 – paragraph 4 4. The EU Centre shall provide the information referred to in paragraph 2 and the information resulting from the research, surveys and studies referred to in paragraph 3, including its analysis thereof, and its opinions on matters related to the prevention and combating of online child sexual abuse to other Union institutions, bodies, offices and agencies, Coordinating Authorities, Hotlines, other competent authorities
Amendment 143 #
Proposal for a regulation Article 50 – paragraph 5 5. The EU Centre shall develop prevention techniques on the detection of suspicious content and behavior online and shall communicate it to Coordinating Authorities of each Member State, so they could adapt and initiate measures to improve digital literacy and raise awareness amongst parents and educators of the existing digital tools to ensure a safe digital environment for children. The EU Centre shall also establish a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness
Amendment 144 #
Proposal for a regulation Article 50 – paragraph 5 a (new) 5 a. The EU Centre should develop ambitious campaigns tailored for all age ranges, taking into account that they should reach out to young children, adolescents, parents, teachers and society at large. They should also take into account people with disabilities, who may be more vulnerable as they may not have full access to this information.
Amendment 145 #
Proposal for a regulation Article 54 – paragraph 1 1. Where necessary for the performance of its tasks under this Regulation, the EU Centre
Amendment 146 #
Proposal for a regulation Article 54 – paragraph 2 2. The EU Centre may conclude
Amendment 147 #
Proposal for a regulation Article 83 a (new) Article 83 a Data collection on prevention programmes Member States shall report on the anticipated number of children in primary education who have been informed through the awareness campaigns and through the education programmes about the risks of all forms of sexual exploitation of children, including in the online environment.
Amendment 148 #
Proposal for a regulation Article 85 – paragraph 1 1. By [
Amendment 149 #
Proposal for a regulation Article 85 – paragraph 2 2. By [
Amendment 26 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Digital services have become an irreplaceable tool for today’s children, as information, elements of formal education, social contact and entertainment are increasingly online; whereas digital services can also expose children to risks such as unsuitable content, grooming, and child sexual abuse. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), and to protect society at large. Users of such services offered in the Union should be able to trust that the services concerned can be used safely, especially by children. In order to ensure a safer online experience for children and prevent the above-mentioned offences, digital literacy should be recognised as a mandatory skill by Member States and should be included in the school curriculum across the EU.
Amendment 27 #
Proposal for a regulation Recital 1 a (new) (1 a) The role of prevention should be emphasised by equipping children, parents and caregivers with the necessary instruments in order to develop situational awareness of the online environment, evaluate potential risks and support children in being safe online. In this regard, education facilities should play a greater role, which is why civic education classes should also provide for the attainment of safe internet skills for children.
Amendment 28 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by appropriate prevention techniques, improving digital literacy, and ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to
Amendment 29 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services. To this end, fundamental importance should be attached to ensuring the necessary funding to European programmes and projects which aim to improve digital skills and awareness of risk linked to the digital world, such as “Media literacy for all”.
Amendment 30 #
Proposal for a regulation Recital 2 a (new) (2 a) For the purposes of this regulation, “digital skills” should be understood as skills relating to the web as a whole, consisting of both easily accessible surface web platforms and platforms accessible through the deep and dark web. The EU must therefore provide for effective awareness of the dangers also lurking in the deep and dark web.
Amendment 31 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, have a direct negative effect on the internal market and lead to a fragmentation in the Union’s approach towards this phenomenon. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market,
Amendment 32 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements and appropriate prevention techniques should be laid down at Union level.
Amendment 33 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should
Amendment 34 #
Proposal for a regulation Recital 4 a (new) (4 a) To ensure full application of the objectives of this Regulation, Member States shall implement prevention strategies and awareness campaigns in their school curriculum and inside educational institutions. Taking into account the data collected by the EU Centre, Coordinating Authorities, relevant law enforcement agencies and existing hotlines across the EU, Member States should elaborate prevention techniques improving digital literacy, by educating children on how to safely surf online and how to recognise signals of cyber grooming. Prevention techniques and awareness campaigns should also target parents. Parents and caregivers shall be informed of the existence and the functioning of digital tools to limit and direct their child’s/children’s experience online and limit access to age-inappropriate or harmful content online.
Amendment 35 #
Proposal for a regulation Recital 6 (6) Online child sexual abuse frequently involves the misuse of information society services offered in the Union by providers established in third countries.
Amendment 36 #
Proposal for a regulation Recital 11 (11) A substantial connection to the Union should be considered to exist where the relevant information society service has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards
Amendment 37 #
Proposal for a regulation Recital 11 a (new) (11 a) The UN Study on Violence against Children defines "child sexual abuse" as any type of sexual activity inflicted on children, especially by someone who is responsible for them, or who has power or control over them, and whom they should be able to trust. Sexual violence against children encompasses a wide range of acts, such as forced sexual intercourse in intimate partner relationships, rape by strangers, systematic rape, sexual harassment (including demanding sex in exchange for compensation of any kind), sexual abuse of children, child marriage and violent acts against the sexual integrity of women, including female genital mutilation and compulsory virginity inspections.
Amendment 38 #
Proposal for a regulation Recital 11 b (new) (11 b) UNICEF defines child sexual abuse as when a child is used for the sexual stimulation of the perpetrator or the gratification of an observer. It involves any interaction in which consent does not exist or cannot be given, regardless of whether the child understands the sexual nature of the activity and even when the child shows no signs of refusal.
Amendment 39 #
Proposal for a regulation Recital 12 (12) For reasons of consistency and technological neutrality, the term ‘child sexual abuse material’ should for the purpose of this Regulation be defined as referring to any type of material constituting child pornography or pornographic performance within the meaning of Directive 2011/93/EU, which is capable of being disseminated through the use of hosting or interpersonal communication services. At present, such material typically consists of images or videos, without it however being excluded that it takes other forms, especially in view of future technological developments. Close attention should be paid to the development of new technologies and platforms, such as the metaverse. In such platforms child sexual abuse material might be generated and exchanged or child sexual abuse perpetrated through the use of avatars or any other form of virtual identities.
Amendment 40 #
Proposal for a regulation Recital 13 a (new) (13 a) The term "online grooming" refers to the process by which an adult tries to manipulate a child in order to obtain sexual audiovisual material or to have some kind of in-person sexual relationship with the child. According to international studies to date, between 5% and 15% of minors have been sexually solicited by adults through ICTs. Within the prevention measures, the responsible use of ICTs must be considered a fundamental part of awareness-raising and education, where it is crucial to raise awareness of the implications of online consent to the use and dissemination of personal data, images or other information.
Amendment 41 #
Proposal for a regulation Recital 13 b (new) (13 b) In order to minimise the risks of online child content made available by legal guardians being used for ‘grooming’ as ‘new’ child sexual abuse material, media and digital literacy programmes should be put in place to make citizens aware of their responsibility as content disseminators. In this sense, ‘digital literacy’ refers to the skills, knowledge and understanding that allow users to gain awareness of the potential risks associated with the child content they generate, produce and share, in the context of the child’s fundamental rights, and the obligations set out in this Regulation and in other Union data-related Regulations. Consequently, the Union and its Member States should allocate more investments in education and training to spread digital literacy, and ensure that progress in that regard is closely followed.
Amendment 42 #
Proposal for a regulation Recital 17 a (new) (17 a) Member States continue to struggle with putting in place effective prevention programmes to combat child sexual abuse as required in Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children and child pornography, where frequently multiple types of stakeholders need to take action. As a result, children and the persons in their environment are insufficiently aware of the risks of sexual abuse and of the means of limiting such risks, while the online dimension represents a particular challenge, with a constantly growing tendency. As education plays a key role in the prevention of child sexual abuse, Member States should inform the public, by all means necessary, about the dangers and risks of sexual abuse for young people in the digital world, including by ensuring a close cooperation at European and international level and by strengthening work with organised civil society, in particular with schools and law enforcement representatives. Member States should take appropriate measures to include programmes to this effect in the early education curricula.
Amendment 43 #
Proposal for a regulation Recital 18 a (new) (18 a) Basic digital skills, including cyber hygiene, cyber safety, data protection and media literacy are essential for children and young people, as they enable them to make informed decisions, assess and overcome the risks associated with the internet. Therefore, it is important to strengthen media literacy efforts in Member States and at the Union level, through dedicated media literacy education, publicly available relevant materials adapted for different age groups and information campaigns for children and their guardians.
Amendment 44 #
Proposal for a regulation Recital 22 (22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighed, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the likelihood and seriousness of any potential negative consequences for other parties affected. With a view to avoiding the imposition of
Amendment 45 #
Proposal for a regulation Recital 24 (24) The competent judicial authority or the competent independent administrative authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should be in a position to take a well-informed decision on requests for the issuance of detection orders. That is of particular importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach, especially in connection to detection orders concerning the solicitation of children. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU Centre’) and, where so provided in this Regulation, the competent data protection authority designated under Regulation (EU) 2016/679 to provide their views on the measures in question. They should do so as soon as possible, having regard to the important public policy objective at stake and the need to act without undue delay to protect children, in view of the seriousness of the impact that such offences have on the physical and mental health of minors and in view of the difficulty of curbing the dissemination of material online. In particular, data protection authorities should do their utmost to avoid extending the time period set out in Regulation (EU) 2016/679 for
Amendment 46 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims
Amendment 47 #
Proposal for a regulation Recital 35 a (new)
Amendment 48 #
Proposal for a regulation Recital 36 (36) In order to prevent children from falling victim to abuse, providers of very large online platforms which have identified the risk of use of their service for the purpose of online child sexual abuse in line with Article 3 should provide reasonable assistance, by putting in place alert and alarm mechanisms in a prominent way on their platforms. The alert mechanism could consist of, for example, linking potential victims to local organisations such as helplines, victims' rights organisations or hotlines. Providers of very large online platforms should ensure adequate follow-up, when a report or alert is made, in the language chosen by the user for using their service. Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to
Amendment 49 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to receive free immediate psychological support or support from any other professionals and to be assisted by the EU Centre in this regard, via the Coordinating Authorities.
Amendment 50 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist
Amendment 51 #
Proposal for a regulation Recital 37 (37) To ensure the efficient management of such victim support functions, victims should be well informed about the existence of such centres and be allowed to contact and rely on the Coordinating Authority that is most accessible to them, which should channel all communications between victims and the EU Centre.
Amendment 52 #
Proposal for a regulation Recital 38 (38) For the purpose of facilitating the exercise of the victims’ right to information and of assistance and support for fast removal or disabling of access,
Amendment 53 #
Proposal for a regulation Recital 44 (44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For
Amendment 54 #
Proposal for a regulation Recital 45 a (new) (45 a) Given the EU Centre’s particular expertise with regard to the generation and sharing of knowledge, Member States should ensure that such information is shared and promoted at national level. For this purpose, they should cooperate with partner organisations, including with semi-public organisations and hotlines, as well as with civil society. It is important to ensure that practitioners who get in close contact with child victims are adequately trained to deal with such victims, and that the situation of the victim is adequately mitigated. Therefore, the Coordinating Authority should ensure that officials such as law enforcement officers, judges, prosecutors, lawyers and forensic experts and social workers cooperate with civil society and semi-public organisations.
Amendment 55 #
Proposal for a regulation Recital 49 (49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders, removal orders or blocking orders that it issued, are effectively complied with in practice, each Coordinating Authority should be able to carry out searches, using the relevant indicators provided by the EU Centre, and reacting in a timely manner to the evolving trends of child sexual abuse material dissemination and monetisation, to detect the dissemination of known or new child sexual abuse material through publicly available material in the hosting services of the providers concerned.
Amendment 56 #
Proposal for a regulation Recital 50 (50) With a view to ensuring that providers of hosting services are aware of
Amendment 57 #
Proposal for a regulation Recital 56 (56) With a view to ensuring that the indicators generated by the EU Centre for the purpose of detection are as complete as possible, the submission of relevant material and transcripts should be done proactively by the Coordinating Authorities. However, the EU Centre should also be allowed to bring certain material or conversations to the attention of the Coordinating Authorities for those purposes
Amendment 58 #
Proposal for a regulation Recital 57 a (new) (57 a) According to the UN, one of the main factors influencing the increase in child sexual abuse in developing countries is the decline in sex education. Studies have shown that if a child receives good sex education, it can equip them with the necessary tools to identify situations in which they may be sexually abused. Therefore, the education sector and education and awareness programmes play a key role in preventing child sexual abuse.
Amendment 59 #
Proposal for a regulation Recital 57 b (new) (57 b) Some studies point to depression and loneliness and a history of physical or psychological harassment as some of the characteristics of Internet-initiated victims of sexual crimes. Other studies distinguish two types of victims: risky victims and vulnerable victims. Vulnerable victims are defined as those with a high need for affection due to feelings of loneliness and low self-esteem. This shows that bullying and cyberbullying problems can lead to some children being prone to physical and online sexual abuse.
Amendment 60 #
Proposal for a regulation Recital 58 (58) In particular, in order to facilitate the cooperation needed for the proper functioning of the mechanisms set up by this Regulation, the EU Centre should establish and maintain the necessary information-sharing systems. When establishing and maintaining such systems, the EU Centre should cooperate with the European Union Agency for Law Enforcement Cooperation (‘Europol’), national hotlines and national authorities to build on existing systems and best practices, where relevant.
Amendment 61 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this
Amendment 62 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access
Amendment 63 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services. However, for that same reason, the EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of
Amendment 64 #
Proposal for a regulation Recital 61 (61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities or when appropriate, by the organisations acting in the public interest against child sexual abuse, in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material) or of different child sexual abuse material (new material), or the solicitation of children, as applicable.
Amendment 65 #
Proposal for a regulation Recital 62 (62) For the system established by this Regulation to function properly, the EU Centre should be charged with creating databases for each of those three types of online child sexual abuse, and with maintaining, timely updating and operating those databases. For accountability purposes and to allow for corrections where needed, it should keep records of the submissions and the process used for the generation of the indicators.
Amendment 66 #
Proposal for a regulation Recital 62 (62) For the system established by this Regulation to function properly, the EU Centre should be charged with creating databases for each
Amendment 67 #
Proposal for a regulation Recital 65 (65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks, reports should pass through the EU Centre. The EU Centre should immediately assess
Amendment 68 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of
Amendment 69 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse, including on the successful initiatives and good practices on the proactive search for online child sexual material, trends in its creation and monetisation, as well as the voluntary prevention, detection and mitigation of online child sexual abuse. In this connection, the EU Centre should cooperate on a regular basis with relevant stakeholders from both within and outside the Union, including law enforcement authorities with the relevant expertise, educators, civil society, service providers and industry representatives, and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned.
Amendment 70 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse
Amendment 71 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned. To this end, the EU Centre can also aid in the implementation of awareness campaigns and contribute to the establishment and improvement of specific guidelines and proposals for mitigation measures respectively, so as to ensure accuracy and up-to-date solutions in tackling online child sexual abuse.
Amendment 72 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also
Amendment 73 #
Proposal for a regulation Recital 67 a (new) (67 a) In carrying out its mission, the EU Centre should also ensure transversal cooperation with education facilities, where appropriate, and digital education hubs, to also integrate this dimension of the prevention component, in order for children to become aware of the potential risks posed by the online environment.
Amendment 74 #
Proposal for a regulation Recital 67 b (new) (67 b) Considering the essential role teachers can play in guiding children on safely using information society services and detecting potentially malicious behaviour online, teacher training should be organized and implemented across the Union, in a coherent manner, benefitting from the knowledge and expertise of the EU Centre.
Amendment 75 #
Proposal for a regulation Recital 69 (69) In order to allow for the effective and efficient performance of its tasks, the EU Centre should closely cooperate with Coordinating Authorities, Europol and relevant partner organisations, such as the US National Centre for Missing and Exploited Children or the International Association of Internet Hotlines (‘INHOPE’) network of hotlines for reporting child sexual abuse material, within the limits set by this Regulation and other legal instruments regulating their respective activities. To facilitate and support such cooperation and build on the best practices and expertise acquired, the necessary arrangements should be made, including the designation of contact officers by Coordinating Authorities and the conclusion of memoranda of understanding with Europol and, where appropriate, with one or more of the relevant partner organisations located in the Union and outside the Union.
Amendment 76 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Child helplines are equally in the frontline in the fight against online child sexual abuse. Therefore, the EU Centre should also recognise the work of child helplines in victim response, and the existing referral mechanisms between child helplines and hotlines. The EU Centre should coordinate services for victims.
Amendment 77 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines and organisations which act in the public interest against child sexual abuse and which proactively search for child sexual abuse material or which do research and gather information on the trends in the dissemination and monetisation of child sexual abuse material, are in the frontline
Amendment 78 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across
Amendment 79 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Furthermore, a dedicated toll-free helpline with a call centre assistance service should be established at EU level so that victims and their families can receive support in a timely manner.
Amendment 80 #
Proposal for a regulation Recital 70 a (new) (70 a) In line with Directive 2011/93/EU of the European Parliament and of the Council, this Regulation recognises and safeguards the key role of hotlines in order to enhance the fight against child sexual abuse online in the European Union. Hotlines have a track record of proven capability since 1999 in the identification and removal of child sexual abuse material from the digital environment and have created a worldwide network and procedures for the identification and removal of child sexual abuse material. Member States should therefore promote and safeguard the role of formally recognised non-governmental organisations involved in anonymous public reporting of child sexual abuse material, which are at the forefront of detecting new child sexual abuse material, which is an essential factor in finding new victims while also keeping the databases of indicators up to date.
Amendment 81 #
Proposal for a regulation Recital 72 a (new) (72 a) In view of ensuring an adequate degree of expertise and skills for investigative purposes, specialised training of law enforcement officers should be introduced with the support of the EU Centre, especially in view of rapid technological advancements, where new methods, techniques and instruments require the adaptation of preventive and mitigation efforts regarding online child sexual abuse.
Amendment 82 #
Proposal for a regulation Recital 73 (73) To ensure its proper functioning, the necessary rules should be laid down regarding the EU Centre’s organisation. In the interest of consistency, those rules should be in line with the Common Approach of the European Parliament, the Council and the Commission on decentralised agencies. In order to complete its tasks, the EU Centre and Coordinating Authorities should have the necessary funds, human resources, investigative powers and technical capabilities to seriously and effectively pursue and investigate complaints and potential offenders, including appropriate training to build capacity in the judiciary and police units and to develop new high-tech capabilities to address the challenges of analysing vast amounts of child abuse imagery, including material hidden on the ‘dark web’.
Amendment 83 #
Proposal for a regulation Recital 74 (74) In view of the need for technical expertise in order to perform its tasks, in
Amendment 84 #
Proposal for a regulation Recital 74 (74) In view of the essential need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies, including software, that can be used for fast detection, the EU Centre should have a Technology Committee composed of experts with an advisory function, which should take into account Member States' experience and their achievements. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to detection of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology.
Amendment 85 #
Proposal for a regulation Recital 74 (74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with an advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to prevention and detection of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology.
Amendment 86 #
Proposal for a regulation Recital 75 (75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal
Amendment 87 #
Proposal for a regulation Recital 76 (76) In the interest of good governance and drawing on the statistics and information gathered and transparency reporting mechanisms provided for in this Regulation, the Commission should carry out an evaluation of this Regulation within
Amendment 88 #
Proposal for a regulation Recital 76 (76) In the interest of good governance and drawing on the statistics and information gathered and transparency reporting mechanisms provided for in this Regulation, the Commission should carry out an evaluation of this Regulation within
Amendment 89 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point e a (new) (e a) Guidelines on the creation of appropriate prevention techniques against cyber grooming and the dissemination of CSAM online, targeting children and parents and empowering them to use digital technologies safely and responsibly.
Amendment 90 #
Proposal for a regulation Article 2 – paragraph 1 – point k a (new) (k a) "child sexual abuse" means any actual or threatened physical or virtual intrusion of a sexual nature, for the sexual stimulation of the offender or an observer, committed against minors, whether by force or under unequal or coercive conditions;
Amendment 91 #
Proposal for a regulation Article 2 – paragraph 1 – point o a (new) (o a) "online grooming" is the process by which an adult attempts, via ICT, to manipulate a minor in order to obtain sexual audiovisual material or to engage in some form of face-to-face sexual relationship with that minor;
Amendment 92 #
Proposal for a regulation Article 2 – paragraph 1 – point p (p) ‘online child sexual abuse’ means the online dissemination of child sexual abuse material and the solicitation of children with the intention of violence/sexual abuse;
Amendment 93 #
Proposal for a regulation Article 2 – paragraph 1 – point w a (new) (w a) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous information from the public about alleged child sexual abuse material and online child sexual exploitation, which meets all the following criteria: (a) it is officially recognised by its home Member State, as referred to in Directive 2011/93/EU of the European Parliament and of the Council; (b) it has the mission of combating child sexual abuse material in its articles of association; and (c) it is part of a recognised and well-established international network of hotlines as referred to in this article.
Amendment 94 #
Proposal for a regulation Article 2 – paragraph 1 – point w a (new) (w a) ‘very large online platform’ means online platforms which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as very large online platforms pursuant to paragraph 4 of Article 33 of Regulation (EU) 2022/2065;
Amendment 95 #
Proposal for a regulation Chapter I a (new) Ia PREVENTION AND EDUCATION PROGRAMMES Article 2 a (new) 1. Member States shall take appropriate measures, such as education, awareness-raising campaigns and training, to discourage and reduce the demand that fosters all forms of sexual exploitation of children in the online environment. 2. Member States shall take appropriate action, including through the Internet, such as information and awareness-raising campaigns, research and early-education programmes, where appropriate in cooperation with relevant civil society organisations acting in the public interest against child sexual abuse, law enforcement authorities and other stakeholders, aimed at raising awareness and reducing the risk of children becoming victims of sexual abuse or of exploitation online. 3. Member States shall promote regular training for officials likely to come into contact with child victims of sexual abuse or exploitation online, including the solicitation of children, aimed at enabling them to identify and deal with child victims and potential child victims. 4. Member States shall promote regular training for officials to inform them and update their knowledge on the latest trends in the creation, dissemination and monetisation of child sexual abuse material and national data hosting of child sexual abuse material.
Amendment 96 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 Prior to preparing its risk assessment, the provider shall be advised on the specific requirements in order to ensure that the risk assessment is thorough, accurate and detailed. The provider may request the EU Centre to perform an analysis of representative, anonymised data samples to identify potential online child sexual abuse, to support the risk assessment.
Amendment 97 #
Proposal for a regulation Article 3 – paragraph 6 a (new) 6 a. The EU Centre should use these risk assessment reports to prepare and adapt prevention techniques and bring them to the attention of Coordinating Authorities across the EU.
Amendment 98 #
Proposal for a regulation Article 4 – paragraph 1 – point b a (new) (b a) to provide, through appropriate technical and operational measures, readily accessible and easy-to-use parental tools to help parents or guardians support children and identify harmful behaviour;
Amendment 99 #
Proposal for a regulation Article 4 – paragraph 1 – point c (c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations, hotlines or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].
source: 739.506
2023/03/09
IMCO
513 amendments...
Amendment 158 #
Proposal for a regulation Title 1
Amendment 159 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life,
Amendment 160 #
Proposal for a regulation Recital 1 a (new) (1 a) Regulatory measures to address the dissemination of child sexual abuse content online should be complemented by Member States' strategies, including increasing public awareness, explaining how to seek child-friendly and age-appropriate reporting and assistance, and informing about victims' rights. Additionally, Member States should make sure they have a child-friendly justice system in place in order to avoid further victimisation of the abused children.
Amendment 161 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being
Amendment 162 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services for the digital single market, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, effective, carefully balanced and proportionate, so as to avoid any
Amendment 163 #
Proposal for a regulation Recital 3 (3)
Amendment 164 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, may have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.
Amendment 165 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is demonstrably and durably effective and that respects the fundamental rights of all parties concerned. In view of the fast- changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology-neutral and future-
Amendment 166 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform
Amendment 167 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available number- independent interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services a
Amendment 168 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting, are equally at risk of misuse, they should also be covered by this
Amendment 169 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services
Amendment 170 #
Proposal for a regulation Recital 6 (6) Online child sexual abuse
Amendment 171 #
Proposal for a regulation Recital 7 (7) This Regulation should be without prejudice to the rules resulting from other Union acts, in particular Directive 2011/93/EU of the European Parliament and of the Council, Directive 2000/31/EC of the European Parliament and of the Council and Regulation (EU)
Amendment 172 #
Proposal for a regulation Recital 8 (8) This Regulation should be considered lex specialis in relation to the generally applicable framework set out in Regulation (EU)
Amendment 173 #
Proposal for a regulation Recital 9 (9) Article 15(1) of Directive 2002/58/EC allows Member States to adopt legislative measures to restrict the scope of the rights and obligations provided for in
Amendment 174 #
Proposal for a regulation Recital 10 (10) In the interest of clarity and consistency, the definitions provided for in this Regulation should, where possible and appropriate, be based on and aligned with the relevant definitions contained in other acts of Union law, such as Regulation (EU)
Amendment 175 #
Proposal for a regulation Recital 11 (11) A substantial connection to the Union should be considered to exist where the relevant information society service has an establishment in the Union or, in its absence, on the basis of the existence of a significant number, in relation to population size, of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of a software application in the relevant national software application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1), point (c), of Regulation (EU) 1215/2012 of the European Parliament and of the Council. Mere technical accessibility of a website from the Union
Amendment 176 #
Proposal for a regulation Recital 13 (13)
Amendment 177 #
Proposal for a regulation Recital 13 a (new) (13 a) Member States should ensure that they additionally address the problem of solicitation of children by providing for efficient digital education. Children should be given at home and in school the necessary digital skills and tools they need to fully benefit from online access, whilst ensuring their safety.
Amendment 178 #
Proposal for a regulation Recital 14 (14) With a view to minimising the risk that their services are misused for the dissemination of
Amendment 179 #
Proposal for a regulation Recital 15 (15) Some of those providers of relevant information society services in scope of this Regulation may also be subject to an obligation to conduct a risk assessment under Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] with respect to information that they store and disseminate to the public , which should form the basis for the risk assessment under this instrument. For the purposes of the present Regulation, those providers may draw on such a risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse, as required by this Regulation.
Amendment 180 #
Proposal for a regulation Recital 15 (15) Some of those providers of relevant information society services in scope of this Regulation, including online search engines, may also be subject to an obligation to conduct a risk assessment under Regulation (EU)
Amendment 181 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers s
Amendment 182 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable specific measures to mitigate
Amendment 183 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU)
Amendment 184 #
Proposal for a regulation Recital 16 a (new) (16 a) To further prevent online child sexual abuse effectively, an emphasis should be placed on public awareness raising, including through easily understandable campaigns and through education focused on empowering young people to use the internet safely and on addressing societal factors that enable child sexual abuse, including harmful gender norms about women and girls and broader issues of societal inequality. In addition, awareness raising should focus on hotlines where young people can report what has happened to them, as well as on improving access to institutional reporting to the police, social services and other authorities.
Amendment 185 #
Proposal for a regulation Recital 16 a (new) (16 a) Any age assessment tools used should be able to verify age in an efficient, privacy-preserving and secure manner.
Amendment 186 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in
Amendment 187 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory
Amendment 188 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement also voluntary measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers
Amendment 189 #
Proposal for a regulation Recital 17 a (new) (17 a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any restrictions of encryption could potentially be abused by malicious third parties. In order to ensure effective consumer trust, nothing in this Regulation should be interpreted as requiring providers of information society services to prevent, circumvent, compromise or undermine encryption in place, as prohibiting them from providing their services applying encryption, or as restricting or undermining such encryption in a way that is detrimental to users’ expectations of confidential and secure communication services, for example through the implementation of client-side scanning or other device-related or server-side solutions, or through requirements to proactively forward electronic communications to third parties, which may weaken or introduce vulnerabilities into the encryption. Member States should neither deter nor prevent providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and security of the digital services, and effectively prevents unauthorised third-party access.
Amendment 190 #
Proposal for a regulation Recital 17 a (new) (17 a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any restrictions of encryption could potentially be abused by malicious third parties, including through the deployment of software on end-user devices which can inspect private messages before encryption and transfer this information to third parties. In order to ensure effective consumer trust, nothing in this Regulation should be interpreted as prohibiting providers of information society services from providing their services applying encryption, or as restricting or undermining such encryption in a way that is detrimental to users’ expectations of confidential and secure communication services. Member States should not prevent or discourage providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and security of the digital services, and effectively prevents unauthorised third-party access.
Amendment 191 #
Proposal for a regulation Recital 17 a (new) (17 a) Relying on providers for risk mitigation measures comes with inherent problems, as business models, technologies and crimes evolve continuously. As a result, clear targets, oversight, review and adaptation, led by national supervisory authorities are needed, to avoid measures becoming redundant, disproportionate, ineffective, counterproductive and outdated.
Amendment 192 #
Proposal for a regulation Recital 17 a (new) (17 a) Fundamental rights in the digital sphere have to be guaranteed to the same extent as in the offline world. Safety and privacy need to be ensured, amongst others through end-to-end encryption in private online communication and the protection of private content against any kind of general or targeted surveillance, be it by public or private actors.
Amendment 193 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available interpersonal communications services should, when designing and implementing the mitigation measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the financial and technological capabilities and the size of the provider concerned. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such
Amendment 194 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available interpersonal communications services should, when designing and implementing the mitigation measures, give importance
Amendment 195 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available number-independent interpersonal communications services should, when designing and implementing the
Amendment 196 #
Proposal for a regulation Recital 19 (19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take certain reasonable measures
Amendment 197 #
Proposal for a regulation Recital 19 (19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take
Amendment 198 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when
Amendment 199 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be a last resort measure and subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that in particular solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to
Amendment 200 #
Proposal for a regulation Recital 20 a (new) (20 a) Having regard to the need to take due account of the fundamental rights guaranteed under the Charter of all parties concerned, any action taken by a provider of relevant information society services should be strictly targeted, in the sense that it should serve to detect, remove or disable access to the specific items of information considered to constitute child sexual abuse online, without unduly affecting the freedom of expression and of information of recipients of the service. Orders should therefore, as a general rule, be directed to the entity acting as a data controller or, where that is unfeasible, to the specific provider of relevant information society services that has the technical and operational ability to act against such specific items of child sexual abuse material, so as to prevent and minimise any possible negative effects on the availability and accessibility of information that is not illegal content. The providers of relevant information society services who receive an order on the basis of which they cannot, for technical or operational reasons, remove the specific item of information, should inform the person or entity who submitted the order.
Amendment 201 #
Proposal for a regulation Recital 21 (21) Furthermore, as parts of those limits and safeguards,
Amendment 202 #
Proposal for a regulation Recital 21 (21) Furthermore, as parts of those limits and safeguards, detection orders should only be issued after a diligent and objective assessment leading to the finding of a significant risk of the specific service concerned being misused for a given type of online child sexual abuse covered by this Regulation. Such assessments may include the voluntary use of detection technologies and the evidence they provide with regard to the risks of a service being misused. One of the elements to be
Amendment 203 #
Proposal for a regulation Recital 22 (22)
Amendment 204 #
Proposal for a regulation Recital 22 (22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority
Amendment 205 #
Proposal for a regulation Recital 23 (23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a
Amendment 206 #
Proposal for a regulation Recital 23 (23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted, justified, proportionate, limited in time and specified so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
Amendment 207 #
Proposal for a regulation Recital 23 a (new) (23 a) Monitoring private communications of all users of a number-independent interpersonal communications service in a general and indiscriminate manner is likely to infringe on the essence of their fundamental rights and the prohibition of general monitoring. To the greatest extent possible, and as the predominant rule, detection orders should be targeted against users for whom there is a reasonable suspicion that they have been sharing child sexual abuse material in the past or that they will share child sexual abuse material in the future.
Amendment 208 #
Proposal for a regulation Recital 24 (24) The competent judicial authority
Amendment 209 #
Proposal for a regulation Recital 24 (24) The competent judicial authority
Amendment 210 #
Proposal for a regulation Recital 25
Amendment 211 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of
Amendment 212 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available number-independent interpersonal communications services to execute
Amendment 213 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users,
Amendment 214 #
Proposal for a regulation Recital 27 (27) In order to facilitate the providers’ compliance with the
Amendment 215 #
Proposal for a regulation Recital 28 (28) With a view to constantly assess the performance of the
Amendment 216 #
Proposal for a regulation Recital 29 (29)
Amendment 217 #
Proposal for a regulation Recital 29 a (new) (29 a) In order to ensure effective prevention of and fight against online child sexual abuse, providers should be able to make voluntary use of detection technologies as part of their mitigation measures, if they assess this as necessary in order to limit the risk of misuse.
Amendment 218 #
Proposal for a regulation Recital 29 b (new) (29 b) All relevant providers should provide for easily accessible, child-friendly and age-appropriate notification mechanisms that allow for quick, efficient and privacy-preserving notification. Micro, small and medium-sized enterprises should receive support from the EU Centre to build up a corresponding mechanism.
Amendment 219 #
Proposal for a regulation Recital 31 (31) The rules of this Regulation should not be understood as affecting the
Amendment 220 #
Proposal for a regulation Recital 32 (32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited.
Amendment 221 #
Amendment 222 #
Proposal for a regulation Recital 34 (34)
Amendment 223 #
Proposal for a regulation Recital 40 (40) In order to facilitate smooth and efficient communications by electronic means, including, where relevant, by acknowledging the receipt of such communications, relating to matters covered by this Regulation, providers of relevant information society services should be required to designate a single point of contact and to publish relevant information relating to that point of contact, including the languages to be used in such communications. In contrast to the provider’s legal representative, the point of contact should serve operational purposes and should not be required to have a physical location. Suitable conditions should be set in relation to the languages of communication to be specified, so as to ensure that smooth communication is not unreasonably complicated. For providers subject to the obligation to establish a compliance function and nominate compliance officers in accordance with Regulation (EU)
Amendment 224 #
Proposal for a regulation Recital 42 (42) Where relevant and convenient, subject to the choice of the provider of relevant information society services and the need to meet the applicable legal requirements in this respect, it should be possible for those providers to designate a single point of contact and a single legal representative for the purposes of Regulation (EU)
Amendment 225 #
Proposal for a regulation Recital 44 (44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to
Amendment 226 #
Proposal for a regulation Recital 49 (49) In order to verify that the rules of this Regulation, in particular those on
Amendment 227 #
Proposal for a regulation Recital 50
Amendment 228 #
Proposal for a regulation Recital 55 (55) It is essential for the proper functioning of
Amendment 229 #
Proposal for a regulation Recital 55 a (new) (55 a) All communications containing illegal material should be encrypted to state-of-the-art standards; all access by staff to such content should be limited to what is necessary and thoroughly logged.
Amendment 230 #
Proposal for a regulation Recital 56 (56) With a view to ensuring that the indicators generated by the EU Centre
Amendment 231 #
Proposal for a regulation Recital 69 a (new) (69 a) Hotlines play an invaluable role in providing the public with a way to report suspected child sexual abuse material and in rapidly removing harmful content online, but they have different legal rights to process child sexual abuse material; Member States are therefore encouraged to aim for a harmonisation of the legal capacities of hotlines.
Amendment 232 #
Proposal for a regulation Recital 70 (70) This Regulation recognises and reinforces the key role of hotlines in optimising the fight against child sexual abuse online at the Union level. Hotlines are at the forefront of detecting new child sexual abuse material and have a track record of proven capability in the rapid identification and removal of child sexual abuse material from the digital environment. Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union.
Amendment 233 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines, concluding, when necessary, strategic and/or operational cooperation agreements with them and encourage that they
Amendment 234 #
Proposal for a regulation Recital 78 (78) Regulation (EU) 2021/1232 of the European Parliament and of the Council45provides for a temporary solution in respect of the use of technologies by certain providers of publicly available interpersonal communications services for the purpose of combating online child sexual abuse
Amendment 235 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in
Amendment 236 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to prevent and address the misuse of relevant information society services for online child sexual abuse
Amendment 237 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the
Amendment 238 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on relevant providers of hosting
Amendment 239 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services and providers of publicly available number-independent interpersonal communication services to ide
Amendment 240 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services
Amendment 241 #
(b) obligations on relevant providers of
Amendment 242 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point c (c) obligations on
Amendment 243 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point d a (new) (d a) obligations on providers of online search engines to delist websites indicating child sexual abuse material;
Amendment 244 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point e a (new) (e a) obligations on providers of online games;
Amendment 245 #
Proposal for a regulation Article 1 – paragraph 3 – point b (b) Directive 2000/31/EC and Regulation (EU)
Amendment 246 #
Proposal for a regulation Article 1 – paragraph 3 – point b a (new) (ba) Regulation (EU) .../... [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts], in particular Article 5;
Amendment 247 #
Proposal for a regulation Article 1 – paragraph 4 4. This Regulation limits the exercise of the rights and obligations provided for in 5(1) and (3) and Article 6(1) of Directive 2002/58/EC insofar as necessary for the execution of the
Amendment 248 #
Proposal for a regulation Article 2 – paragraph 1 – point a (a) ‘hosting service’ means an information society service as defined in Article 2, point (f), third indent, of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC], in so far as they allow the dissemination and sharing of images and videos;
Amendment 249 #
Proposal for a regulation Article 2 – paragraph 1 – point a (a) ‘hosting service’ means an information society service as defined in Article
Amendment 250 #
Proposal for a regulation Article 2 – paragraph 1 – point a (a) ‘hosting service’ means an information society service as defined in Article
Amendment 251 #
Proposal for a regulation Article 2 – paragraph 1 – point a a (new) (a a) 'cloud computing service' means a service as defined in Article 6, point (30), of Directive (EU) 2022/2555 of the European Parliament and of the Council.
Amendment 252 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘number-independent interpersonal communications service’ means a publicly available service as defined in Article 2, point
Amendment 253 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘interpersonal communications service’ means a publicly available service as defined in Article 2, point 5, of Directive (EU) 2018/1972, including services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, in so far as they allow the dissemination and sharing of images and videos;
Amendment 254 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘interpersonal communications service’ means a publicly available service as defined in Article 2, point 5, of
Amendment 255 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘interpersonal communications service’ means a publicly available service as defined in Article 2, point 5, of Directive (EU) 2018/1972,
Amendment 256 #
Proposal for a regulation Article 2 – paragraph 1 – point b a (new) (b a) ‘number-independent interpersonal communications service within games’ means any service defined in Article 2, point 7 of Directive (EU) 2018/1972 which is part of a game.
Amendment 257 #
Proposal for a regulation Article 2 – paragraph 1 – point c (c) ‘software application’ means a digital product or service as defined in Article 2, point 1
Amendment 258 #
Proposal for a regulation Article 2 – paragraph 1 – point d (d) ‘software application store’ means a service as defined in Article 2, point 1
Amendment 259 #
Proposal for a regulation Article 2 – paragraph 1 – point e
Amendment 260 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point ii
Amendment 261 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point ii (ii) a
Amendment 262 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv
Amendment 263 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv a (new) (iv a) online search engines;
Amendment 264 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv a (new) (iv a) online games;
Amendment 265 #
Proposal for a regulation Article 2 – paragraph 1 – point f a (new) (f a) ‘online search engine’ means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
Amendment 266 #
Proposal for a regulation Article 2 – paragraph 1 – point f b (new) (f b) ‘metadata’ means data processed for the purposes of transmitting, distributing or exchanging content data, including data used to trace and identify the source and destination of a communication, data on the location of the user, and the date, time, duration and type of communication;
Amendment 267 #
Proposal for a regulation Article 2 – paragraph 1 – point g (g) ‘to offer services in the Union’ means to offer services in the Union as defined in Article 2
Amendment 268 #
Proposal for a regulation Article 2 – paragraph 1 – point h a (new) (h a) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous reports from the public about alleged child sexual abuse material and online child sexual exploitation, which is officially recognised by the Member State of establishment and has the mission of combating child sexual abuse;
Amendment 269 #
Proposal for a regulation Article 2 – paragraph 1 – point h b (new) (h b) ‘help-line’ means an organisation providing services for children in need as recognised by the Member State of establishment;
Amendment 270 #
Proposal for a regulation Article 2 – paragraph 1 – point j (j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 1
Amendment 271 #
Proposal for a regulation Article 2 – paragraph 1 – point r (r) ‘recommender system’ means the system as defined in Article
Amendment 272 #
Proposal for a regulation Article 2 – paragraph 1 – point t (t) ‘content moderation’ means the activities as defined in Article
Amendment 273 #
Proposal for a regulation Article 2 – paragraph 1 – point v (v) ‘terms and conditions’ means terms
Amendment 274 #
Proposal for a regulation Article 2 a (new) Article 2 a End-to-End Encryption and Prohibition on General Monitoring 1. End-to-end encryption is essential to guarantee the security and confidentiality of the communications of users, including those of children. Any restriction of encryption could lead to abuse by malicious actors. Nothing in this Regulation should be interpreted as prohibiting providers of information society services from providing their services applying end-to-end encryption, or as restricting or undermining such encryption. Member States should not prevent providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and security of the digital services, and effectively prevents unauthorised third-party access. 2. Nothing in this Regulation should undermine the prohibition of general monitoring under EU law.
Amendment 275 #
Proposal for a regulation Article 2 a (new) Article 2 a Voluntary own-initiative detection Providers of relevant information society services shall be deemed eligible to carry out own-initiative investigations into, or take other measures aimed at detecting, identifying and preventing the dissemination of, or removing, child sexual abuse on their services, in addition to the mandatory requirements foreseen in this Regulation.
Amendment 276 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services
Amendment 277 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of number-independent interpersonal communications services shall identify, analyse and assess, for each such service that they offer, the risk of use of the service for the purpose of online child sexual abuse. This risk assessment shall be specific to their services and proportionate to the risks, taking into consideration their severity and probability and in full respect to the fundamental rights enshrined in the Charter.
Amendment 278 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of
Amendment 279 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of interpersonal communications services shall
Amendment 280 #
Proposal for a regulation Article 3 – paragraph 1 a (new) 1 a. A hosting service provider or publicly available number-independent interpersonal communication service is exposed to child sexual abuse material where the coordinating authority of the Member State of its main establishment or where its legal representative resides or is established has: (a) taken a decision, on the basis of objective factors, such as the provider having received two or more final removal orders in the previous 12 months, finding that the provider is exposed to child sexual abuse material; and (b) notified the decision referred to in point (a) to the provider.
Amendment 281 #
Proposal for a regulation Article 3 – paragraph 2 – point a
Amendment 282 #
Proposal for a regulation Article 3 – paragraph 2 – point a a (new) (a a) any actual or foreseeable negative effects for the exercise of fundamental rights
Amendment 283 #
Proposal for a regulation Article 3 – paragraph 2 – point b – introductory part (b) the existence and implementation by the provider of a policy and the availability and effectiveness of functionalities to address the
Amendment 284 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 1
Amendment 285 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 2
Amendment 286 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 2 — concrete measures taken to enforce such prohibitions and restrictions;
Amendment 287 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities enabling
Amendment 288 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities enabling
Amendment 289 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities enabling
Amendment 290 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities
Amendment 291 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities enabling age
Amendment 292 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) - Functionalities enabling scanning for known child sexual abuse material on upload;
Amendment 293 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) - functionalities enabling age appropriate parental control;
Amendment 294 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 b (new) - Functionalities preventing uploads from the dark web;
Amendment 295 #
Proposal for a regulation Article 3 – paragraph 2 – point b a (new) (b a) the capacity, in accordance with the state of the art, to deal with reports and notifications about child sexual abuse in a timely manner;
Amendment 296 #
Proposal for a regulation Article 3 – paragraph 2 – point c
Amendment 297 #
Proposal for a regulation Article 3 – paragraph 2 – point d (d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, the design of their recommender systems and any other relevant algorithmic system and the impact thereof on that risk;
Amendment 298 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point i
Amendment 299 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii
Amendment 300 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 1 — enabling users to publicly search for other users and, in particular, for adult users to search for child users;
Amendment 301 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2
Amendment 302 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2 — enabling users to
Amendment 303 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 — enabling users to establish direct contact and share images or videos with other users, in particular through private communications.
Amendment 304 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 a (new) - Enabling users to create usernames that contain a representation about, or imply, the user’s age;
Amendment 305 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 a (new) - the extent to which children have access to age-restricted content
Amendment 306 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 b (new) - Enabling child users to create usernames that contain location information on child users;
Amendment 307 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 c (new) - Enabling users to know or infer the location of child users.
Amendment 308 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii a (new)
Amendment 309 #
Proposal for a regulation Article 3 – paragraph 2 a (new) 2a. The fact that a provider of interpersonal communications services ensures that interpersonal communications remain confidential or are encrypted cannot be considered a risk factor within the meaning of this Regulation.
Amendment 310 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 The provider may request the EU Centre to perform an analysis of representative, anonymised data samples to identify potential online child sexual abuse, to support the risk assessment. This request cannot serve the purpose of evading any of the provider’s obligations set out in this Regulation. The EU Centre shall perform the analysis in a timely manner.
Amendment 311 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 The provider may request the EU Centre to perform an analysis of
Amendment 312 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise
Amendment 313 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise
Amendment 314 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 3
Amendment 315 #
Proposal for a regulation Article 3 – paragraph 3 a (new) 3 a. Providers of hosting services and providers of interpersonal communication services shall put forward specific age assurance verification systems that meet the following criteria: (a) effectively protect the privacy of users and do not disclose data gathered for the purposes of age assurance for any other purpose; (b) do not collect data that is not strictly necessary for the purposes of age assurance; (c) be proportionate to the risks associated with the product or service that presents a risk of misuse for child sexual abuse; (d) provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
Amendment 316 #
Proposal for a regulation Article 3 – paragraph 3 a (new) 3 a. The provider may also voluntarily use the measures specified in Article 10 to detect online child sexual abuse on a specific service. In that case, the provider must notify the Coordinating Authority and include the results of its analyses in a separate section of the risk assessment.
Amendment 317 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – point a (a) for a service which is subject to a
Amendment 318 #
Proposal for a regulation Article 3 – paragraph 6 6. The Commission, in cooperation with Coordinating Authorities
Amendment 320 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services
Amendment 321 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services, excluding cloud computing services, and providers of interpersonal communications services shall take reasonable, proportionate and targeted mitigation measures, tailored to the risk identified pursuant to Article 3 and the type of service offered, to minimise that risk. Such measures shall include some or all of the following:
Amendment 322 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall take the reasonable mitigation measures set out in Article 35 of Regulation (EU) 2022/2065, tailored to the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:
Amendment 323 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures, tailored to the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include, but need not to be limited to, some or all of the following:
Amendment 324 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications
Amendment 325 #
Proposal for a regulation Article 4 – paragraph 1 – point a (a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes, the operation or functionalities of the service,
Amendment 326 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (a a) providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, and that are set to the most secure levels by default;
Amendment 327 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (a a) introducing parental control features and functionalities that allow the parents or the legal guardians to exercise oversight and control over the child's activity;
Amendment 328 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (a a) adapting privacy and safety by design and by default for children, including age appropriate parental control tools;
Amendment 329 #
Proposal for a regulation Article 4 – paragraph 1 – point a b (new) (a b) informing users about external resources and services in the user’s region on preventing child sexual abuse, counselling by help-lines, victim support and educational resources by hotlines and child protection organisations;
Amendment 330 #
Proposal for a regulation Article 4 – paragraph 1 – point a b (new) (a b) implementing measures to prevent and combat the dissemination of online child sexual abuse material;
Amendment 331 #
Proposal for a regulation Article 4 – paragraph 1 – point a c (new) (a c) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line
Amendment 332 #
Proposal for a regulation Article 4 – paragraph 1 – point a d (new) (a d) automatic mechanisms and interface design elements to inform users about external preventive intervention programmes
Amendment 333 #
Proposal for a regulation Article 4 – paragraph 1 – point b (b)
Amendment 334 #
Proposal for a regulation Article 4 – paragraph 1 – point b a (new) (b a) processing metadata;
Amendment 335 #
Proposal for a regulation Article 4 – paragraph 1 – point c Amendment 336 #
Proposal for a regulation Article 4 – paragraph 1 – point c (c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU)
Amendment 337 #
Proposal for a regulation Article 4 – paragraph 1 – subparagraph 1 (new) Amendment 338 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) (c a) foreseeing awareness-raising measures;
Amendment 339 #
Proposal for a regulation Article 4 – paragraph 1 – point c b (new) (c b) using any other measures in accordance with the current or future state of the art that are fit to mitigate the identified risk;
Amendment 340 #
Proposal for a regulation Article 4 – paragraph 2 – introductory part 2. The
Amendment 341 #
Proposal for a regulation Article 4 – paragraph 2 – point a (a) effective and proportionate in mitigating the identified serious risk;
Amendment 342 #
Proposal for a regulation Article 4 – paragraph 2 – point a (a) effective and efficient in mitigating the identified risk;
Amendment 343 #
Proposal for a regulation Article 4 – paragraph 2 – point a a (new) (a a) subject to an implementation plan with clear objectives and methodologies for identifying and quantifying impacts on the identified serious risk and on the exercise of the fundamental rights of all affected parties. The implementation plan shall be reviewed every six months.
Amendment 344 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) applied in line with the right to privacy and the safety of individuals, targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk as well as the provider’s financial and technological capabilities and the number of users;
Amendment 345 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk , any impact on the functionality of the service as well as the provider’s financial and technological capabilities and the number of users;
Amendment 346 #
Proposal for a regulation Article 4 – paragraph 2 – point d a (new) (d a) Providers of hosting services and providers of interpersonal communications services are encouraged to put in place voluntary measures to detect and report online child sexual abuse for those services that have proven to pose a risk of misuse for child sexual abuse, or in cases where there is an imminent risk of misuse for child sexual abuse, including for the purpose of the solicitation of children;
Amendment 347 #
Proposal for a regulation Article 4 – paragraph 2 a (new) 2a. The requirement that the providers of interpersonal communications services take risk mitigation measures shall in no way constitute a requirement that they access the content of communications or make provision for methods to access these communications or to compromise their encryption.
Amendment 348 #
Proposal for a regulation Article 4 – paragraph 3 Amendment 349 #
Proposal for a regulation Article 4 – paragraph 3 Amendment 350 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures and to put in place effective measures to block the access of children to websites that fall under an age-restriction applicable under national law.
Amendment 351 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age
Amendment 352 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary
Amendment 353 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of number-independent interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary
Amendment 354 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall immediately take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures.
Amendment 355 #
Proposal for a regulation Article 4 – paragraph 3 a (new) 3 a. Any requirement to take specific measures shall be without prejudice to Article 8 of Regulation (EU) 2022/2065 and shall entail neither a general obligation for hosting services providers to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.
Amendment 356 #
Proposal for a regulation Article 4 – paragraph 4 4. Providers of hosting services and providers of interpersonal communications services shall clearly describe in their terms and conditions the mitigation measures that they have taken. That description shall not include information that may reduce the effectiveness of the mitigation measures, unless the measures impinge on the essence of the service underlying the contract of use, or unless they amend, derogate from or invalidate another clause in the provider's terms and conditions.
Amendment 357 #
Proposal for a regulation Article 4 – paragraph 4 4.
Amendment 358 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation with Coordinating Authorities and the EU
Amendment 359 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1
Amendment 360 #
Proposal for a regulation Article 4 a (new) Article 4 a Legal basis for risk mitigation through metadata processing
1. To the extent necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, providers of number-independent interpersonal communication services shall be allowed, as a mitigating measure under Article 4, to process metadata.
2. All relevant service providers shall process metadata when ordered to do so by the Coordinating Authority of establishment in accordance with Article 5bis(4). When assessing whether to require a provider to process metadata, the Coordinating Authority shall take into account the interference with the rights to privacy and data protection of the users of the service that such a processing entails and determine whether, in the case at hand, the processing of metadata would be effective in mitigating the risk of use of the service for the purpose of child sexual abuse, strictly necessary and proportionate.
3. If they process metadata as a risk mitigation measure, providers shall inform their users of such processing in their terms and conditions, including information on the possibility to submit complaints.
Amendment 361 #
Proposal for a regulation Article 4 a (new) Article 4 a Specific measures for platforms primarily used for the dissemination of pornographic content Where an online platform is primarily used for the dissemination of user generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure:
a. user-friendly reporting mechanisms to report alleged child sexual abuse material;
b. adequate professional human content moderation to rapidly process notices of alleged child sexual abuse material;
c. automatic mechanisms and interface design elements to inform users about external preventive intervention programmes in the user’s region.
Amendment 362 #
Proposal for a regulation Article 4 b (new) Article 4 b Specific measures for number-independent interpersonal communications services within games Providers of online games that operate a number-independent interpersonal communications service within their games shall take the necessary technical and organisational measures:
a) preventing users from initiating unsolicited contact with other users;
b) facilitating user-friendly reporting of alleged child sexual abuse material;
c) providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, and that are set to the most secure levels by default;
d) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line.
Amendment 363 #
Proposal for a regulation Article 5 – paragraph 1 – introductory part 1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
Amendment 364 #
Proposal for a regulation Article 5 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall transmit,
Amendment 365 #
Proposal for a regulation Article 5 – paragraph 1 – point a (a)
Amendment 366 #
Proposal for a regulation Article 5 – paragraph 1 – point b (b) any
Amendment 367 #
Proposal for a regulation Article 5 – paragraph 2 2. Within three months after receiving the report, the Coordinating Authority of establishment shall assess it and determine, on that basis and taking into account any other relevant information available to it, whether the risk assessment has been carried out or updated and the
Amendment 368 #
Proposal for a regulation Article 5 – paragraph 2 2. Within
Amendment 369 #
Proposal for a regulation Article 5 – paragraph 3 – subparagraph 1 Where necessary for that assessment, that Coordinating Authority may require further information from the provider,
Amendment 370 #
Proposal for a regulation Article 5 – paragraph 3 – subparagraph 2 Amendment 371 #
Proposal for a regulation Article 5 – paragraph 4 4. Without prejudice to Articles 7 and 27 to 29, where the requirements of Articles 3 and 4 have not been met, that Coordinating Authority shall require the provider to
Amendment 372 #
Proposal for a regulation Article 5 – paragraph 6 Amendment 373 #
Proposal for a regulation Article 5 – paragraph 6 a (new) 6 a. Providers of hosting services and providers of interpersonal communications services that qualify as micro (or small) enterprises within the meaning of Article 3 of Directive 2013/34/EU shall transmit a simplified version of the report under paragraph 1 of this Article.
Amendment 374 #
Proposal for a regulation Article 5 a (new) Amendment 375 #
Proposal for a regulation Article 6 Amendment 376 #
Amendment 377 #
Proposal for a regulation Article 6 – paragraph 1 – point a (a) make reasonable efforts to
Amendment 378 #
Proposal for a regulation Article 6 – paragraph 1 – point b Amendment 379 #
Proposal for a regulation Article 6 – paragraph 1 – point b (b) take reasonable and effective measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children leading to online child sexual abuse;
Amendment 380 #
Proposal for a regulation Article 6 – paragraph 1 – point c Amendment 381 #
Proposal for a regulation Article 6 – paragraph 1 – point c (c) take the necessary age
Amendment 382 #
Proposal for a regulation Article 6 – paragraph 2 Amendment 383 #
Proposal for a regulation Article 6 – paragraph 3 Amendment 384 #
Proposal for a regulation Article 6 – paragraph 4 Amendment 385 #
Proposal for a regulation Article 6 a (new) Article 6 a Security and confidentiality of communications Nothing in this Regulation shall be construed as prohibiting, restricting or undermining the provision or the use of encrypted services. Member States shall not prevent or discourage providers of relevant information society services from offering encrypted services.
Amendment 386 #
Proposal for a regulation Article 6 a (new) Article 6 a Encrypted services Nothing in this Regulation shall be construed as prohibiting, restricting or undermining the provision or the use of encrypted services. Providers of information society services shall be neither deterred nor prevented by relevant public authorities from offering encrypted services.
Amendment 387 #
Proposal for a regulation Article 6 a (new) Article 6 a Encrypted services Member States shall not prevent providers of relevant information society services from offering encrypted services. However, when offering such services, providers shall ensure that they process metadata in order to detect known child sexual abuse material.
Amendment 388 #
Proposal for a regulation Article 6 a (new) Article 6 a Security of communications and services Nothing in this regulation shall be construed as encouraging the prohibition, restriction, circumvention or undermining of the provision or the use of encrypted services.
Amendment 389 #
Proposal for a regulation Article 6 b (new) Article 6 b Support for micro, small and medium-sized enterprises The Commission shall be empowered to adopt delegated acts in accordance with Article 86 in order to supplement this Regulation with guidelines that foresee practical support for micro, small and medium-sized enterprises in order for them to be able to fulfil the obligations of this Regulation.
Amendment 391 #
Proposal for a regulation Chapter II – Section 2 – title 2
Amendment 394 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power, as a last resort, when all the measures in Articles 3, 4 and 5 have been exhausted, to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue
Amendment 395 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power, as a last resort, when all the measures in Articles 3, 4 and 5 have been exhausted, to
Amendment 396 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 397 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 398 #
Proposal for a regulation Article 7 – paragraph 1 1.
Amendment 399 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 The Coordinating Authority of establishment shall, before requesting the issuance of a detection order, carry out the investigations and assessments necessary to determine whether the conditions of paragraph 4 have been met. Detection orders issued by the Coordinating Authorities shall serve as a measure of last resort, only enacted when all mitigating measures, including voluntary ones, have proven unsuccessful.
Amendment 400 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 Amendment 401 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 The Coordinating Authority of establishment shall, before requesting the issuance of a
Amendment 402 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – introductory part Where the Coordinating Authority of establishment takes the preliminary view that the conditions of paragraph 4 have been met and the measures envisaged in the detection order are proportionate, it shall:
Amendment 403 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – introductory part Where the Coordinating Authority of establishment takes the
Amendment 404 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point a (a) establish a draft request for the issuance of a
Amendment 405 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point b (b) submit the draft request to the concerned provider and the EU Centre;
Amendment 406 #
Amendment 407 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point d (d) invite the EU Centre to provide its opinion on the draft request, within a time period of
Amendment 408 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – introductory part Where, having regard to the
Amendment 409 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a (a) draft an implementation plan setting out the measures it envisages taking to execute the intended detection order, including detailed information regarding the envisaged technologies and safeguards and their negative impacts on the rights of all parties involved, including the users of the service;
Amendment 410 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a (a) draft an implementation plan setting out the incident that the authority intends to investigate, the measures it envisages taking to execute the intended
Amendment 411 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b)
Amendment 412 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b) where the draft implementation plan concerns an intended
Amendment 413 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b)
Amendment 414 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the data protection impact assessment and in order to
Amendment 415 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point d (d) submit to that Coordinating Authority the implementation plan, where applicable attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted
Amendment 416 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall submit the request for the issuance of the detection order, adjusted where appropriate, to the competent judicial
Amendment 417 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and taking utmost account of the opinion of the data protection authority, that Coordinating Authority
Amendment 418 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall submit the request for the issuance of the detection order, adjusted where appropriate, to the competent judicial
Amendment 419 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority
Amendment 420 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority
Amendment 421 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part Based on a reasoned justification, the Coordinating Authority of establishment shall request the issuance of the
Amendment 422 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a (a) there is evidence of a s
Amendment 423 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b (b) the reasons for issuing the detection order outweigh the negative consequences for the rights and legitimate interests of all parties affected, including all users where the implementation plan would undermine the structure processing the interpersonal communications, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties.
Amendment 424 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b (b) the
Amendment 425 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b a (new) (b a) the voluntary measures applied as mitigating measures have not proven successful in preventing the misuse of the service for child sexual abuse.
Amendment 426 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 Amendment 427 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point a Amendment 428 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point b Amendment 429 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point c Amendment 430 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point d Amendment 431 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 3 Amendment 432 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 3 As regards the second subparagraph, point (d), where that Coordinating Authority substantially deviates from the opinion of the EU Centre, it shall inform the EU Centre and the Commission thereof, specifying in detail the points at which it deviated and the main reasons for the deviation.
Amendment 433 #
Proposal for a regulation Article 7 – paragraph 5 – introductory part 5. As regards
Amendment 434 #
Proposal for a regulation Article 7 – paragraph 5 – introductory part 5. As regards detection orders concerning the dissemination of known child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall
Amendment 435 #
Proposal for a regulation Article 7 – paragraph 5 – point a (a) it is likely, despite any mitigation measures that the provider may have taken
Amendment 436 #
Proposal for a regulation Article 7 – paragraph 5 – point b (b) there is evidence of the service
Amendment 437 #
Proposal for a regulation Article 7 – paragraph 5 – point b (b) there is evidence of the service,
Amendment 438 #
Proposal for a regulation Article 7 – paragraph 6 Amendment 439 #
Proposal for a regulation Article 7 – paragraph 6 – introductory part 6. As regards detection orders concerning the dissemination of new child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall
Amendment 440 #
Proposal for a regulation Article 7 – paragraph 6 – point a (a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the dissemination of new child sexual abuse material, including live streaming and live transmission;
Amendment 441 #
Proposal for a regulation Article 7 – paragraph 6 – point b (b) there is evidence of the service
Amendment 442 #
Proposal for a regulation Article 7 – paragraph 7 Amendment 443 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – introductory part As regards detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall
Amendment 444 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – point c (c) there is evidence of the service
Amendment 445 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary to effectively address the significant risk referred to in point (a) thereof. To the greatest extent possible, the detection order should be targeted against users who can be reasonably suspected of distributing child sexual abuse material.
Amendment 446 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the judicial validation and the issuance of
Amendment 447 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial
Amendment 448 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent
Amendment 449 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 2 To that
Amendment 450 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point b (b) where necessary, in particular to limit such negative consequences, effective
Amendment 451 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point b a (new) (ba) under no circumstances, shall the detection order require providers of interpersonal communications services to access the content of communications or make provision for methods to access these communications or to compromise their encryption;
Amendment 452 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority or independent administrative authority shall specify in the detection order the period during which it applies, indicating the start date and the end date, within which the providers of hosting services and providers of interpersonal communications services shall prove that their service is no longer used for child sexual abuse.
Amendment 453 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority
Amendment 454 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority
Amendment 455 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority
Amendment 456 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 2 The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the
Amendment 457 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 The period of application of
Amendment 458 #
Amendment 459 #
Proposal for a regulation Article 8 – title Additional rules regarding
Amendment 460 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 461 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 462 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 463 #
Proposal for a regulation Article 8 – paragraph 1 – point a (a) information regarding the measures to be taken to execute the
Amendment 464 #
Proposal for a regulation Article 8 – paragraph 1 – point b (b) identification details of the competent judicial authority
Amendment 465 #
Proposal for a regulation Article 8 – paragraph 1 – point b (b) identification details of the competent judicial authority
Amendment 466 #
Proposal for a regulation Article 8 – paragraph 1 – point b Amendment 467 #
Proposal for a regulation Article 8 – paragraph 1 – point c (c) the name of the provider and, where applicable, its legal representative, without prejudice to the issuance of detection orders where the legal name of the provider is not readily ascertained;
Amendment 468 #
Proposal for a regulation Article 8 – paragraph 1 – point d (d) the specific service in respect of which the
Amendment 469 #
Proposal for a regulation Article 8 – paragraph 1 – point e (e) whether the
Amendment 470 #
Proposal for a regulation Article 8 – paragraph 1 – point f (f) the start date and the end date of the
Amendment 471 #
Proposal for a regulation Article 8 – paragraph 1 – point g (g) a
Amendment 472 #
Proposal for a regulation Article 8 – paragraph 1 – point h (h) the factual and legal grounds justifying the issuing of the order, and a reference to this Regulation as the legal basis for the detection order;
Amendment 473 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 474 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 475 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 476 #
Proposal for a regulation Article 8 – paragraph 1 – point j (j) easily understandable information about the redress available to the addressee of the
Amendment 477 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 478 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 479 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 480 #
The
Amendment 481 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 3 The
Amendment 482 #
Proposal for a regulation Article 8 – paragraph 3 3. If the provider cannot execute the
Amendment 483 #
Proposal for a regulation Article 8 a (new) Amendment 484 #
Proposal for a regulation Article 8 c (new) Article 8 c Notification mechanism 1. Providers of hosting services and providers of interpersonal communication services shall establish mechanisms that allow users to notify them of the presence on their service of specific items or activities that the user considers to be potential child sexual abuse material, in particular previously unknown child sexual abuse material and solicitation of children. Those mechanisms shall be easy to access, user-friendly and child-friendly, and shall allow for the submission of notices exclusively by electronic means. 2. Where the notice contains the electronic contact information of the user who submitted it, the provider shall without undue delay send a confirmation of receipt to the user. 3. Providers shall ensure that such notices are processed without undue delay.
Amendment 486 #
Proposal for a regulation Article 9 – title Redress, information, reporting and modification of
Amendment 487 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services that have received a
Amendment 488 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of interpersonal communications services that have received a detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the detection order before the courts of the Member State of the competent judicial authority
Amendment 489 #
Proposal for a regulation Article 9 – paragraph 1 a (new) 1a. Exercising the right to recourse under paragraph 1 shall suspend execution of the detection order.
Amendment 490 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the detection order becomes final, the competent judicial authority
Amendment 491 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the detection order becomes final, the competent judicial authority
Amendment 492 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the
Amendment 493 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 2 For the purpose of the first subparagraph, a
Amendment 494 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 1 Where the period of application of the
Amendment 495 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 2 Those reports shall include a detailed description of the measures taken to execute the
Amendment 496 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the
Amendment 497 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the detection orders that the competent judicial authority
Amendment 498 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the detection orders that the competent judicial authority
Amendment 499 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 500 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 501 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 504 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of publicly available number-independent interpersonal communication services that have received a
Amendment 505 #
Proposal for a regulation Article 10 – paragraph 2 2. The provider shall be entitled to acquire, install and operate, free of charge, technologies specified in the orders and made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the
Amendment 506 #
Proposal for a regulation Article 10 – paragraph 3 – introductory part 3. The technologies specified in the investigation orders shall be:
Amendment 507 #
Proposal for a regulation Article 10 – paragraph 3 – point a (a) effective in
Amendment 508 #
Proposal for a regulation Article 10 – paragraph 3 – point b (b) not be able to extract any other information from the relevant communications than the information strictly necessary to
Amendment 509 #
Proposal for a regulation Article 10 – paragraph 3 – point b a (new) (ba) respect the confidentiality of communications enshrined in Article 7 of the Charter of Fundamental Rights of the European Union and Article 8 of the Convention for the Protection of Human Rights and Fundamental Freedoms;
Amendment 510 #
Proposal for a regulation Article 10 – paragraph 3 – point c (c) in accordance with the technological state of the art
Amendment 511 #
Proposal for a regulation Article 10 – paragraph 3 – point d (d) sufficiently reliable, in that they limit to the maximum extent possible the rate of errors regarding the
Amendment 512 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (d a) effective in setting up a reliable age-based filter that verifies the age of users and effectively prevents the access of child users to websites subject to online child sexual abuse, and child sexual abuse offences.
Amendment 513 #
Proposal for a regulation Article 10 – paragraph 4 – introductory part 4. The
Amendment 514 #
Proposal for a regulation Article 10 – paragraph 4 – point a (a) take all the necessary measures to ensure that the technologies
Amendment 515 #
Proposal for a regulation Article 10 – paragraph 4 – point a (a) take all the necessary and proportionate measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly limited to what is necessary to execute the detection orders addressed to them;
Amendment 516 #
Proposal for a regulation Article 10 – paragraph 4 – point b (b)
Amendment 517 #
Proposal for a regulation Article 10 – paragraph 4 – point c (c) include in investigation orders specific obligations on providers to ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, human intervention;
Amendment 518 #
Proposal for a regulation Article 10 – paragraph 4 – point c (c) ensure regular human oversight as necessary to ensure that the technologies operate
Amendment 519 #
Proposal for a regulation Article 10 – paragraph 4 – point c a (new) (c a) ensure privacy and safety by design and by default and, where applicable, the protection of encryption.
Amendment 520 #
Proposal for a regulation Article 10 – paragraph 4 – point d (d) establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of
Amendment 521 #
Proposal for a regulation Article 10 – paragraph 4 – point d (d) establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, the refusal of removal, especially of self-generated CSAM, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
Amendment 522 #
Proposal for a regulation Article 10 – paragraph 4 – point e (e) inform the Coordinating Authority, as appropriate, at the latest one month before the start date specified in the
Amendment 523 #
Proposal for a regulation Article 10 – paragraph 4 – point e a (new) (e a) ensure safety-by-design tools such as parental control tools and effective age verification tools.
Amendment 524 #
Proposal for a regulation Article 10 – paragraph 4 – point e a (new) (e a) ensure privacy and safety by design and by default and, where applicable, the protection of encryption;
Amendment 525 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point a Amendment 526 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point b Amendment 527 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 2 Amendment 528 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 2 The provider shall not provide information to users that may reduce the effectiveness of the measures to execute the
Amendment 529 #
Proposal for a regulation Article 10 – paragraph 6 Amendment 530 #
Proposal for a regulation Article 11 Amendment 531 #
Proposal for a regulation Article 11 – title Guidelines regarding
Amendment 532 #
Proposal for a regulation Article 11 – paragraph 1 The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue
Amendment 533 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of
Amendment 534 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of any information indicating potential online child sexual abuse on its services, it shall promptly submit a report thereon to the
Amendment 535 #
Proposal for a regulation Article 12 – paragraph 1 a (new) 1 a. Where a provider of hosting services or a provider of interpersonal communications services receives a report by the public through, among others, a trusted hotline, it shall process and analyse the report in a timely and effective manner so as to assess an imminent risk of misuse of the service for child sexual abuse, without prejudice to the obligation to report to the EU Centre pursuant to paragraph 1.
Amendment 536 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 1 Where the provider submits a report pursuant to paragraph 1, it shall
Amendment 537 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 2 Amendment 538 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 3 Amendment 539 #
Proposal for a regulation Article 12 – paragraph 2 a (new) 2 a. The report submitted by the provider pursuant to paragraph 2 shall never contain information about the source of the report, especially when it stems from the person to whom the material relates.
Amendment 540 #
Proposal for a regulation Article 12 – paragraph 3 3.
Amendment 541 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to flag to the provider potential online child sexual abuse on the service, including child-friendly mechanisms for self-reporting of self-generated content.
Amendment 542 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible, effective, age- appropriate and user-friendly mechanism that allows users to flag to the provider potential online child sexual abuse on the service.
Amendment 543 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to easily flag to the provider potential online child sexual abuse on the service.
Amendment 544 #
Proposal for a regulation Article 13 – paragraph 1 – introductory part 1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 545 #
Proposal for a regulation Article 13 – paragraph 1 – point c (c) all
Amendment 546 #
Proposal for a regulation Article 13 – paragraph 1 – point d (d) a list of all available data other than content data related to the potential online child sexual abuse preserved in line with the preservation order in Article 8a;
Amendment 547 #
Proposal for a regulation Article 13 – paragraph 1 – point d a (new) (d a) a list of all traffic data and metadata retained in relation to the potential online child sexual abuse, which could be made available to law enforcement authorities, together with information concerning default storage periods.
Amendment 548 #
Proposal for a regulation Article 13 – paragraph 1 – point e Amendment 549 #
Proposal for a regulation Article 13 – paragraph 1 – point f Amendment 550 #
Proposal for a regulation Article 13 – paragraph 1 – point g Amendment 551 #
Proposal for a regulation Article 13 – paragraph 1 – point i (i) where the
Amendment 552 #
Proposal for a regulation Article 13 – paragraph 1 – point j (j)
Amendment 553 #
Proposal for a regulation Article 14 – paragraph 1 1.
Amendment 554 #
Proposal for a regulation Article 14 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 555 #
Proposal for a regulation Article 14 – paragraph 1 a (new) 1 a. Before issuing a removal order, the Coordinating Authority of establishment shall take all reasonable steps to ensure that implementing the order will not interfere with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
Amendment 556 #
Proposal for a regulation Article 14 – paragraph 2 Amendment 557 #
Proposal for a regulation Article 14 – paragraph 2 2. The provider shall execute the removal order as soon as possible and in any event within no more than 24 hours of receipt thereof.
Amendment 558 #
Proposal for a regulation Article 14 – paragraph 3 – introductory part 3. The competent judicial authority
Amendment 559 #
Proposal for a regulation Article 14 – paragraph 3 – point a (a) identification details of the judicial
Amendment 560 #
Proposal for a regulation Article 14 – paragraph 3 – point b (b) the name of the provider and, where applicable, of its legal representative, without prejudice to the issuance of removal orders where the legal name of the provider is not readily ascertained;
Amendment 561 #
Proposal for a regulation Article 14 – paragraph 3 – point c Amendment 562 #
Proposal for a regulation Article 14 – paragraph 3 – point h (h) the date, time stamp and electronic signature of the judicial
Amendment 563 #
Proposal for a regulation Article 14 – paragraph 3 a (new) 3 a. Providers of hosting services or providers of interpersonal communication services shall be encouraged to extend the effect of the order regarding one or more specific items of material, referred to in paragraph 1, to any provider or services under their control and promptly inform the Coordinating Authority of establishment of this specific measure.
Amendment 564 #
Proposal for a regulation Article 15 – paragraph 1 1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to an effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority
Amendment 565 #
Proposal for a regulation Article 15 – paragraph 1 a (new) Amendment 566 #
Proposal for a regulation Article 15 – paragraph 2 – subparagraph 1 When the removal order becomes final, the competent judicial authority
Amendment 567 #
Proposal for a regulation Article 15 – paragraph 3 – point b (b) the reasons for the removal or disabling, providing a copy of the removal order
Amendment 568 #
Proposal for a regulation Article 15 – paragraph 4 Amendment 569 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 1 The Coordinating Authority of establishment may request, when requesting the judicial authority
Amendment 570 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 2 – point a (a) the judicial authority
Amendment 571 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 2 – point c (c) that judicial authority
Amendment 572 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 3 That judicial authority
Amendment 573 #
Proposal for a regulation Article 15 a (new) Article 15 a Delisting orders 1. The competent authority shall have the power to issue an order requiring a provider of online search engines under the jurisdiction of that Member State to take reasonable measures to delist a Uniform Resource Locator corresponding to online locations where child sexual abuse material can be found from appearing in search results. 2. The provider shall execute the delisting order without undue delay. The provider shall take the necessary measures to ensure that it is capable of reinstating the Uniform Resource Locator to appear in search results. 3. Before issuing a delisting order, the issuing authority shall inform the provider, if necessary via the Coordinating Authority, of its intention to do so, specifying the main elements of the content of the intended delisting order and the reasons for its intention. It shall afford the provider an opportunity to comment on that information, within a reasonable time period set by that authority. 4. A delisting order shall be issued where the following conditions are met: (a) the delisting is necessary to prevent the dissemination of the child sexual abuse material in the Union, having regard in particular to the need to protect the rights of the victims; (b) all necessary investigations and assessments, including of search results, have been carried out to ensure that the Uniform Resource Locator to be delisted corresponds, in a sufficiently reliable manner, to online locations where child sexual abuse material can be found. 5. The issuing authority shall specify in the delisting order the period during which it applies, indicating the start date and the end date. The period of application of delisting orders shall not exceed five years. 6. 
The Coordinating Authority or the issuing authority shall, where necessary and at least once every year, assess whether any substantial changes to the grounds for issuing the delisting orders have occurred and whether the conditions of paragraph 4 continue to be met.
Amendment 574 #
Proposal for a regulation Article 15 b (new) Article 15 b Redress and provision of information 1. Providers of online search engines that have received a delisting order shall have a right to effective redress. That right shall include the right to challenge the delisting order before the courts of the Member State of the authority that issued the delisting order. 2. If the order is modified or repealed as a result of a redress procedure, the provider shall immediately reinstate the delisted Uniform Resource Locator to appear in search results. 3. When the delisting order becomes final, the issuing authority shall, without undue delay, transmit a copy thereof to the Coordinating Authority. The Coordinating Authority shall then, without undue delay, transmit copies thereof to all other Coordinating Authorities and the EU Centre through the system established in accordance with Article 39(2). For the purpose of the first subparagraph, a delisting order shall become final upon the expiry of the time period for appeal where no appeal has been lodged in accordance with national law or upon confirmation of the delisting order following an appeal. 4. Where a provider prevents users from obtaining search results for child sexual abuse material corresponding to a Uniform Resource Locator pursuant to a delisting order, it shall take reasonable measures to inform those users of the following: (a) the fact that it does so pursuant to a delisting order; (b) the right of providers of delisted Uniform Resource Locators corresponding to blocked online locations to judicial redress referred to in paragraph 1 and the users’ right to submit complaints to the Coordinating Authority in accordance with Article 34.
Amendment 575 #
Proposal for a regulation Article 19 Amendment 576 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation
Amendment 577 #
Proposal for a regulation Article 19 – paragraph 1 Amendment 578 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services and, where applicable, cloud computing services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 579 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services and, where applicable, cloud computing services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask for and receive any information relating to such support in a manner accessible to them.
Amendment 580 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters
Amendment 581 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to efficiently handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available and communicate it to the EU Centre. They shall keep that information updated.
Amendment 582 #
Proposal for a regulation Article 25 – paragraph 7 – introductory part 7. Coordinating Authorities may, where necessary for the performance of their tasks under this Regulation, request the assistance of the EU Centre in carrying out those tasks
Amendment 583 #
Proposal for a regulation Article 25 – paragraph 7 – point a Amendment 584 #
Proposal for a regulation Article 25 – paragraph 7 – point b Amendment 585 #
Proposal for a regulation Article 25 – paragraph 7 – point c Amendment 586 #
Proposal for a regulation Article 25 – paragraph 7 – point d Amendment 587 #
Proposal for a regulation Article 25 – paragraph 8 8. The EU Centre shall provide such assistance without undue delay, free of charge and in accordance with its tasks and obligations under this Regulation and insofar as its resources and priorities allow.
Amendment 588 #
Proposal for a regulation Article 25 – paragraph 8 8. The EU Centre shall provide such assistance free of charge and in accordance with its tasks and obligations under this Regulation
Amendment 589 #
Proposal for a regulation Article 26 – paragraph 1 1. Member States shall ensure that the Coordinating Authorities that they designated perform their tasks under this Regulation in an objective, impartial, transparent and timely manner, while fully respecting the fundamental rights of all parties affected. Member States shall
Amendment 590 #
Proposal for a regulation Article 26 – paragraph 2 – introductory part 2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the
Amendment 591 #
Proposal for a regulation Article 26 – paragraph 2 – point a Amendment 592 #
Proposal for a regulation Article 26 – paragraph 2 – point a Amendment 593 #
Proposal for a regulation Article 26 – paragraph 2 – point d (d) neither seek nor take instructions from any
Amendment 594 #
Proposal for a regulation Article 26 – paragraph 2 – point e Amendment 595 #
Proposal for a regulation Article 26 – paragraph 2 – point e Amendment 596 #
Proposal for a regulation Article 26 – paragraph 3 Amendment 597 #
Proposal for a regulation Article 26 – paragraph 3 3. Paragraph 2 shall not prevent supervision of the Coordinating Authorities in accordance with national constitutional law
Amendment 598 #
Proposal for a regulation Article 26 – paragraph 4 4. The Coordinating Authorities shall ensure that relevant members of staff have the required qualifications, experience, integrity and technical skills to perform their duties.
Amendment 599 #
Proposal for a regulation Article 26 – paragraph 5 5. Without prejudice to national or Union legislation on whistleblower protection, the management and other staff of the Coordinating Authorities shall, in accordance with Union or national law, be subject to a duty of professional secrecy both during and after their term of office, with regard to any confidential information which has come to their knowledge in the course of the performance of their tasks. Member States shall ensure that the management and other staff are subject to rules guaranteeing that they can carry out their tasks in an objective, impartial and independent manner, in particular as regards their appointment, dismissal, remuneration and career prospects.
Amendment 600 #
Proposal for a regulation Article 27 – paragraph 1 – point a (a) the power to require those providers, as well as any other persons
Amendment 601 #
Proposal for a regulation Article 27 – paragraph 1 – point b (b) the power to carry out on-site inspections of any premises that those providers or the other persons referred to in point (a) use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement of this Regulation in any form, irrespective of the storage medium, excluding content protected by confidentiality of correspondence for which authorisation by a judicial authority is required;
Amendment 602 #
Proposal for a regulation Article 27 – paragraph 1 – point b (b) the power to carry out remote or on-site
Amendment 603 #
Proposal for a regulation Article 27 – paragraph 1 – point d (d) the power to request information,
Amendment 604 #
Proposal for a regulation Article 28 – paragraph 1 – point b (b) the power to order specific measures to bring about the cessation of infringements of this Regulation and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end;
Amendment 605 #
Proposal for a regulation Article 29 – paragraph 1 – point b (b) the infringement persists and;
Amendment 606 #
Proposal for a regulation Article 29 – paragraph 2 – point a – point i (i) adopt and submit an action plan setting out the necessary measures to terminate the infringement, subject to the approval of the Coordinating Authority;
Amendment 607 #
Proposal for a regulation Article 29 – paragraph 2 – point b – introductory part (b) request the competent judicial authority
Amendment 608 #
Proposal for a regulation Article 29 – paragraph 2 – point b – point ii (ii) the infringement persists and causes serious harm that is greater than the likely harm to users relying on the service for legal purposes and;
Amendment 609 #
Proposal for a regulation Article 29 – paragraph 4 – subparagraph 3 – point a (a) the provider has failed to take
Amendment 610 #
Proposal for a regulation Article 30 – paragraph 2 2. Member States shall ensure that any exercise of the investigatory and enforcement powers referred to in Articles
Amendment 611 #
Proposal for a regulation Article 31 – paragraph 1 Coordinating Authorities shall have the power to carry out searches on publicly accessible material on hosting services to detect the dissemination of known or new child sexual abuse material, using the indicators contained in the databases referred to in Article 44(1), points (a) and (b)
Amendment 612 #
Proposal for a regulation Article 32 Amendment 613 #
Proposal for a regulation Article 32 a (new) Article 32 a Public awareness campaigns Coordinating Authorities shall, in cooperation with the EU Centre, regularly carry out public awareness campaigns to inform the public about measures to prevent and combat child sexual abuse online and offline, about how to seek child-friendly and age-appropriate reporting and assistance, and about victims’ rights.
Amendment 614 #
Proposal for a regulation Article 34 – paragraph 2 a (new) 2a. Where national law does not grant a minor the legal capacity to lodge a complaint, his or her legal representative may do so on his or her behalf.
Amendment 615 #
Proposal for a regulation Article 35 – paragraph 2 2. Member States shall ensure that the maximum amount of penalties imposed for an infringement of this Regulation shall not exceed 6 % of the annual
Amendment 616 #
Proposal for a regulation Article 35 – paragraph 3 3. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify
Amendment 617 #
Proposal for a regulation Article 35 – paragraph 4 4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily
Amendment 618 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – introductory part Coordinating Authorities shall submit to the EU Centre, without undue delay and through the system established in accordance with Article 39(2), the evidence gathered through the procedures provided for in this Regulation:
Amendment 619 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point a (a) anonymised specific items of material and transcripts of conversations related to a specific person, specific group of people, or specific incident that Coordinating Authorities or that the competent judicial authorities or other independent administrative authorities of a Member State have identified, after a diligent assessment, as constituting child sexual abuse material or the solicitation of children, as applicable, for the EU Centre to generate indicators in accordance with Article 44(3);
Amendment 620 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point b (b) exact uniform resource locators indicating specific items of material related to a specific person, specific group of people, or specific incident that Coordinating Authorities or that competent judicial authorities or other independent administrative authorities of a Member State have identified, after a diligent assessment, as constituting child sexual abuse material, hosted by providers of hosting services not offering services in the Union, that cannot be removed due to those providers’ refusal to remove or disable access thereto and to the lack of cooperation by the competent authorities of the third country having jurisdiction, for
Amendment 621 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 2 Member States shall take the necessary measures to ensure that the Coordinating Authorities that they designated receive, without undue delay, the encrypted copies of the material identified as child sexual abuse material, the transcripts of conversations related to a specific person, specific group of people, or specific incident identified as the solicitation of children, and the uniform resource locators, identified by a competent judicial authority or other independent administrative authority than the Coordinating Authority, for submission to the EU Centre in accordance with the first subparagraph.
Amendment 622 #
Proposal for a regulation Article 37 – paragraph 1 – subparagraph 2 Where
Amendment 623 #
Proposal for a regulation Article 37 – paragraph 2 – point c (c) any other information that the Coordinating Authority that sent the request, or the Commission, considers relevant, including, where appropriate, information gathered on its own initiative
Amendment 624 #
Proposal for a regulation Article 37 – paragraph 3 – subparagraph 1 The Coordinating Authority of establishment shall assess the suspected infringement, taking into utmost account the request
Amendment 625 #
Proposal for a regulation Article 37 – paragraph 3 – subparagraph 2 Where it considers that it has insufficient information to assess the suspected infringement or to act upon the request
Amendment 626 #
Proposal for a regulation Article 37 – paragraph 4 4. The Coordinating Authority of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation referred to in paragraph 1, communicate to the Coordinating Authority that sent the request, or the Commission, the outcome of its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and, where applicable,
Amendment 627 #
Proposal for a regulation Article 38 – paragraph 2 a (new) 2 a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of online child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to efficiently detect, remove and block content and coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
Amendment 628 #
Proposal for a regulation Article 39 – paragraph 1 1. Coordinating Authorities shall efficiently cooperate with each other, any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre
Amendment 629 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall establish and maintain one or more reliable and secure information sharing systems supporting communications between Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services.
Amendment 630 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall establish and maintain one or more reliable and secure information sharing systems supporting communications between Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services.
Amendment 631 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall establish and maintain one or more reliable and secure information sharing systems supporting communications between Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies, hotlines and providers of relevant information society services.
Amendment 632 #
Proposal for a regulation Article 39 – paragraph 3 3. The Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation.
Amendment 633 #
Proposal for a regulation Article 39 – paragraph 3 3. The Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation.
Amendment 634 #
Proposal for a regulation Article 39 – paragraph 3 3. The Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies, hotlines and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation.
Amendment 635 #
Proposal for a regulation Article 39 – paragraph 3 a (new) 3 a. Where the EU Centre receives a report from a hotline, or where a provider that submitted the report to the EU Centre has indicated that the report is based on the information received from a hotline, the EU Centre shall coordinate with the relevant Coordinating Authorities in order to avoid duplicated reporting on the same material that has already been reported to the national law enforcement authorities by the hotlines and monitor the removal of the child sexual abuse material or cooperate with the relevant hotline to track the status.
Amendment 636 #
Proposal for a regulation Article 55 – paragraph 1 – point d a (new) (d a) a Survivors’ Advisory Board as an advisory group, which shall exercise the tasks set out in Article 66a (new).
Amendment 637 #
Proposal for a regulation Article 57 – paragraph 1 – point c (c) adopt rules for the prevention and management of conflicts of interest in respect of its members, as well as for the members of the Technological Committee and of
Amendment 638 #
Proposal for a regulation Article 57 – paragraph 1 – point f (f) appoint the members of the Technology Committee
Amendment 639 #
Proposal for a regulation Article 57 – paragraph 1 – point h a (new) (h a) consult the Survivors’ Advisory Board as regards the obligations referred to in points (a) and (h) of this Article.
Amendment 640 #
Proposal for a regulation Article 66 a (new) Amendment 641 #
Proposal for a regulation Article 83 – paragraph 1 – introductory part 1. Providers of hosting services, providers of publicly available number- independent interpersonal communications services and
Amendment 642 #
Proposal for a regulation Article 83 – paragraph 1 – point a – introductory part (a) where the provider has been subject to a
Amendment 643 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 1 — the measures taken to comply with the order,
Amendment 644 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 2 — the
Amendment 645 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 3 — in relation to complaints and cases submitted by users in connection to the measures taken to comply with the order, the number of complaints submitted directly to the provider, the number of cases brought before a judicial authority, the basis for those complaints and cases, the decisions taken in respect of those complaints and in those cases, the
Amendment 646 #
Proposal for a regulation Article 83 – paragraph 1 – point b (b) the number of removal orders issued to the provider in accordance with Article 14 and the average time
Amendment 647 #
Proposal for a regulation Article 83 – paragraph 1 – point b (b) the number of removal orders issued to the provider in accordance with Article 14 and the
Amendment 648 #
Proposal for a regulation Article 83 – paragraph 1 – point b a (new) (b a) the number and duration of delays to removals as a result of requests from competent authorities or law enforcement authorities;
Amendment 649 #
Proposal for a regulation Article 83 – paragraph 1 – point c (c) the total number of items of child sexual abuse material that the provider removed or to which it disabled access, broken down by whether the items were removed or access thereto was disabled pursuant to a removal order or to a notice submitted by a judicial authority, Competent Authority, the EU Centre
Amendment 650 #
Proposal for a regulation Article 83 – paragraph 1 – point c a (new) (c a) the number of instances the provider was asked to provide additional support to law enforcement authorities in relation to content that was removed;
Amendment 651 #
Proposal for a regulation Article 83 – paragraph 1 – point d Amendment 652 #
Proposal for a regulation Article 83 – paragraph 2 – introductory part 2. The Coordinating Authorities shall collect data on the following topics and make that information publicly available, redacting operationally sensitive data as appropriate and providing an unredacted version to the EU Centre
Amendment 653 #
Proposal for a regulation Article 83 – paragraph 2 – point a – indent 4 a (new) - the nature of the report and its key characteristics, such as whether the security of the hosting service was allegedly breached;
Amendment 654 #
Proposal for a regulation Article 83 – paragraph 2 – point b (b) the most important and recurrent risks of online child sexual abuse encountered, as reported by providers of hosting services and providers of publicly available number-independent interpersonal communications services in accordance with Article 3 or identified through other information available to the Coordinating Authority;
Amendment 655 #
Proposal for a regulation Article 83 – paragraph 2 – point c (c) a list of the providers of hosting services and providers of interpersonal communications services to which the Coordinating Authority addressed a
Amendment 656 #
Proposal for a regulation Article 83 – paragraph 2 – point d (d) the number of
Amendment 657 #
Proposal for a regulation Article 83 – paragraph 2 – point f (f) the number of removal orders issued in accordance with Article 14, broken down by provider, the time needed to remove or disable access to the item or items of child sexual abuse material concerned, including the time it took the Coordinating Authority to process the order and the number of instances in which the provider invoked Article 14(5) and (6);
Amendment 658 #
Proposal for a regulation Article 83 – paragraph 2 – point g Amendment 659 #
Proposal for a regulation Article 83 – paragraph 3 – introductory part 3. The EU Centre shall collect data and generate statistics on the
Amendment 660 #
Proposal for a regulation Article 83 – paragraph 3 – point a (a) the number of indicators in the databases of indicators referred to in Article 44 and the
Amendment 661 #
Proposal for a regulation Article 83 – paragraph 3 – point b (b) the number of submissions of child sexual abuse material and solicitation of children referred to in Article 36(1), broken down by Member State that designated the submitting Coordinating Authorities, and,
Amendment 662 #
Proposal for a regulation Article 83 – paragraph 3 – point c (c) the total number of reports submitted to the EU Centre in accordance with Article 12, broken down by provider of hosting services and provider of publicly available number-independent interpersonal communications services that submitted the report and by Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3);
Amendment 663 #
Proposal for a regulation Article 83 – paragraph 3 – point d (d) the online child sexual abuse to which the reports relate, including the number of items of potential
Amendment 664 #
Proposal for a regulation Article 83 – paragraph 3 – point e (e) the number of reports that the EU Centre considered unfounded or manifestly unfounded, as referred to in Article 48(2);
Amendment 665 #
Proposal for a regulation Article 83 – paragraph 3 – point f (f) the number of reports relating to potential
Amendment 666 #
Proposal for a regulation Article 83 – paragraph 3 – point h (h) where materially the same item of potential child sexual abuse material was reported more than once to the EU Centre in accordance with Article 12 or detected more than once through the searches in accordance with Article 49(1), the number of times that that item was reported or detected in that manner.
Amendment 667 #
Proposal for a regulation Article 83 – paragraph 4 4. The providers of hosting services, providers of interpersonal communications services and providers of internet access services, the Coordinating Authorities and the EU Centre shall ensure that the data
Amendment 668 #
Proposal for a regulation Article 83 – paragraph 5 5. They shall ensure that the data is stored in a secure manner and that the storage is subject to appropriate technical
Amendment 669 #
Proposal for a regulation Article 84 – paragraph 1 a (new) 1 a. The annual report shall also include the number of users affected by detection and removal orders.
Amendment 670 #
Proposal for a regulation Article 85 – paragraph 1 1. By [five years after the entry into force of this Regulation], and every five years thereafter, the Commission shall evaluate this Regulation and submit a report on its application to the European Parliament and the Council. This report shall address in particular the possible use of new technologies for a safe and trusted processing of personal and other data and for the purpose of combating online child sexual abuse and in particular to detect, report and remove online child sexual abuse. The report shall be accompanied, where appropriate, by a legislative proposal.
source: 745.291
2023/03/28
BUDG
37 amendments...
Amendment 57 #
Proposal for a regulation Recital 1 a (new) (1 a) The role of prevention should be emphasized by vesting children, parents and caregivers with the necessary instruments in order to develop situational awareness of the online environment, evaluate potential risks and support children in being safe online. In this regard, education facilities should have a greater role in contributing to this scope, which is why civic education classes should also provide for the attainment of safe internet skills for children.
Amendment 58 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat online child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology- neutral and future-
Amendment 59 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question in a timely manner. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities.
Amendment 60 #
Proposal for a regulation Recital 1 a (new) (1 a) The role of prevention should be emphasized by vesting children, parents and caregivers with the necessary instruments in order to develop situational awareness of the online environment, evaluate potential risks and support children in being safe online. In this regard, education facilities should have a greater role in contributing to this scope, which is why civic education classes should also provide for the attainment of safe internet skills for children.
Amendment 61 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat online child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology- neutral and future-
Amendment 62 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question in a timely manner. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities.
Amendment 63 #
Proposal for a regulation Recital 59 (59) To support the implementation of this Regulation and contribute to the achievement of its objectives, the EU Centre should serve as a central facilitator, carrying out a range of specific tasks. The performance of those tasks requires strong guarantees of independence, in particular from law enforcement authorities,
Amendment 64 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the
Amendment 65 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned. To this end, the EU Centre can also aid in the implementation of awareness campaigns and contribute to the establishment and improvement of specific guidelines and proposals for mitigation measures respectively, so as to ensure accurate and up-to-date solutions in tackling online child sexual abuse.
Amendment 66 #
Proposal for a regulation Recital 67 a (new) (67 a) In carrying out its mission, the EU Centre should also ensure transversal cooperation with education facilities, where appropriate, and digital education hubs, to also integrate this dimension of the prevention component, in order for children to become aware of the potential risks posed by the online environment.
Amendment 67 #
Proposal for a regulation Recital 67 b (new) (67 b) Considering the essential role teachers can play in guiding children on safely using information society services and detecting potentially malicious behaviour online, teacher training should be organized and implemented across the Union, in a coherent manner, benefitting from the knowledge and expertise of the EU Centre.
Amendment 68 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage them to work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Furthermore, a special green line with a call centre assistance service will be constituted at EU level in order for victims and their families to receive support in a timely manner.
Amendment 69 #
Proposal for a regulation Recital 70 a (new) Amendment 70 #
Proposal for a regulation Recital 72 a (new) (72 a) In view of ensuring an adequate degree of expertise and skills for investigative purposes, specialized training of law enforcement officers will be introduced with the support of the EU Centre, especially considering rapid technological advancements where new methods, techniques and instruments require adapting preventive and mitigation efforts regarding online child sexual abuse.
Amendment 71 #
Proposal for a regulation Recital 74 a (new) (74 a) In view of the need for a more effective EU Centre it is necessary to establish a Survivors' Advisory Board. Through the structured involvement of victims and former victims of sexualised violence, the EU Centre should serve as a platform to offer holistic support for the fight against child sexual abuse in all Member States. The Survivors’ Advisory Board may support the EU Centre’s activities to facilitate cross-border cooperation for existing national networks and the exchange of best practice. It may also raise awareness for child sexual abuse by serving as a knowledge platform through the coordination, collection and synthesis of research.
Amendment 72 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise, provided the request is reasonably necessary to support the risk assessment. The requests shall not be seen as either an administrative or an economic burden for these enterprises.
Amendment 73 #
The Commission shall be empowered to adopt delegated acts as soon as possible in accordance with Article 86 in order to supplement this Regulation with the necessary detailed rules on the determination and charging of those costs and the application of the exemption for micro, small and medium-
Amendment 74 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide reasonable assistance, on request, in a timely manner, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 75 #
Proposal for a regulation Article 21 – paragraph 2 – point 1 (new) Amendment 76 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point a (a) supporting the Commission in the preparation of the guidelines referred to in Article 3(8), Article 4(5), Article 6(4) and Article 11, including by collecting and providing relevant information, expertise and best practices, taking into account advice from the Technology Committee and the Survivors’ Advisory Board referred to in Articles 66 and 66a (new);
Amendment 77 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a (a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51, including in view of updating guidelines on prevention and mitigation methods for combatting child sexual abuse, especially for the digital dimension as per new technological developments;
Amendment 78 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (b a) contribute to the implementation of awareness campaigns as per the potential risks posed by the online environment to children, in order to equip them with adequate skills for detecting potential grooming and deceit, to ensure safe use of the internet by children and to better implement the prevention component of online child sexual abuse;
Amendment 79 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new) (b b) assisting with expertise and knowledge in the development and implementation of teacher training across the Union, in order to vest teachers with the necessary skills for guiding children on safely using information society services and detecting potentially malicious behaviour online;
Amendment 80 #
Proposal for a regulation Article 55 – paragraph 1 – point d a (new) (d a) a Survivors’ Advisory Board which shall exercise the tasks set out in Article 66a (new).
Amendment 81 #
Proposal for a regulation Article 56 – paragraph 4 4. Members of the Management Board and their alternates shall be appointed in the light of their knowledge in the field of combating child sexual abuse, taking into account relevant managerial, administrative and budgetary skills. Member States shall appoint a representative of their Coordinating Authority, within four months of [date of entry into force of this Regulation]. All parties represented in the Management Board shall make efforts to limit turnover of their representatives, in order to ensure continuity of its work. All parties sh
Amendment 82 #
Proposal for a regulation Article 57 – paragraph 1 – point c (c) adopt rules for the prevention and management of conflicts of interest in respect of its members, as well as for the members of the Technological Committee and of
Amendment 83 #
Proposal for a regulation Article 57 – paragraph 1 – point f (f) appoint the members of the Technology Committee, and of
Amendment 84 #
Proposal for a regulation Article 57 – paragraph 1 – point h a (new) (h a) consult the Survivors’ Advisory Board as regards the obligations referred to in points (a), (g) and (h) of this Article.
Amendment 85 #
Proposal for a regulation Article 64 – paragraph 4 – point f (f) preparing the Consolidated Annual Activity Report (CAAR) on the EU Centre’s activities, including the activities of the Technology Committee and the Survivors’ Advisory Board, and presenting it to the Executive Board for assessment and adoption;
Amendment 86 #
Proposal for a regulation Article 64 – paragraph 5 5. Where exceptional circumstances so require, the Executive Director may decide to locate one or more staff in another Member State for the purpose of carrying out the EU Centre’s tasks in a more efficient, effective and coherent manner according to principles of good governance. Before deciding to establish a local office, the Executive Director shall obtain the prior consent of the Commission, the Management Board and the Member State concerned. The decision shall be based on an appropriate cost-benefit analysis that demonstrates in particular the added value of such decision and specifies the scope of the activities to be carried out at the local office in a manner that avoids unnecessary costs and duplication of administrative functions of the EU Centre. A headquarters agreement with the Member State(s) concerned may be concluded.
Amendment 87 #
Proposal for a regulation Article 65 – paragraph 2 2. The Executive Director shall be appointed by the
Amendment 88 #
Proposal for a regulation Article 66 – paragraph 1 1. The Technology Committee shall consist of technical experts appointed by the Management Board in view of their excellence and their independence from corporate interests, following the publication of a call for expressions of interest in the Official Journal of the European Union.
Amendment 89 #
Proposal for a regulation Article 66 – paragraph 2 2. Procedures concerning the appointment of the members of the Technology Committee and its operation shall be further specified in the rules of procedure of the Management Board and shall be made public.
Amendment 90 #
Proposal for a regulation Article 66 – paragraph 4 4. When a member no longer meets the criteria of independence, he or she shall inform the Management Board. Alternatively, the Management Board may declare, on a proposal of at least one third of its members or of the Commission, a lack of independence and revoke the appointment of the person concerned. The Management Board shall appoint a new member for the remaining term of office in accordance with the procedure for ordinary members.
Amendment 91 #
Proposal for a regulation Article 66 – paragraph 6 – point b a (new) (b a) provide an annual activity report to the Executive Director as part of the Consolidated Annual Activity Report;
Amendment 92 #
Proposal for a regulation Article 66 a (new) Amendment 93 #
Proposal for a regulation Article 69 – paragraph 4 4. The EU Centre’s expenditure shall include staff remuneration, administrative and infrastructure expenses, and operating costs while following the appropriate EU budgetary rules.
source: 745.269
2023/05/08
FEMM
491 amendments...
Amendment 100 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available number-independent interpersonal communications services to execute
Amendment 101 #
Proposal for a regulation Recital 26 a (new) (26a) Detection of child sexual abuse in end-to-end encrypted communications is only possible by scanning those communications before they leave the abuser's device; however, this would allow abusers to interfere with the scanning process. Abusers often work in groups, allowing for rapid proliferation of technology to bypass scanning, rendering such scanning ineffective. Therefore, taking into account the limited efficacy, and the negative impact on citizens' fundamental rights, detection orders should not be applicable to end-to-end encrypted communications.
Amendment 102 #
Proposal for a regulation Recital 27 (27) In order to facilitate the providers’ compliance with the
Amendment 103 #
Proposal for a regulation Recital 27 (27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to
Amendment 104 #
Proposal for a regulation Recital 28 (28) With a view to constantly assess the performance of the
Amendment 105 #
Proposal for a regulation Recital 29 (29)
Amendment 106 #
Proposal for a regulation Recital 30 (30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the
Amendment 107 #
Proposal for a regulation Recital 32 (32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited. For legal and practical reasons, it may not be reasonably possible to have those providers remove or disable access to the material, not even through cooperation with the competent authorities of the third country where they are established. Therefore, in line with existing practices in several Member States, it should be possible to require providers of internet access services to take reasonable measures to block the access of users in the Union to the material. However, blocking measures are easily bypassed, and do not prevent access from outside of the Union, meaning victims have to live knowing that abuse material depicting them remains online, therefore every effort should be taken to remove material, even outside of the jurisdiction of the Union, before resorting to blocking.
Amendment 108 #
Proposal for a regulation Recital 32 (32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited.
Amendment 109 #
Proposal for a regulation Recital 33 (33) In the interest of consistency, efficiency and effectiveness and to minimise the risk of circumvention, such
Amendment 110 #
Proposal for a regulation Recital 34 Amendment 111 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available interpersonal communications services in accordance with this Regulation. Online service providers, including social network platforms, should adopt mandatory procedures in order to effectively prevent, detect and report child sexual abuse that occurs on their services and remove child sexual abuse material
Amendment 112 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims or their approved formal representative should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported or has been removed by providers of hosting services or providers of publicly available interpersonal communications services in accordance with this Regulation. This should both include the option for a singular information request, as the option to receive this information on a continuous and regular basis.
Amendment 113 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted, the vast majority of whom are girls. Victims should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant and age-appropriate information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available interpersonal communications services in
Amendment 114 #
Proposal for a regulation Recital 35 a (new) Amendment 115 #
Proposal for a regulation Recital 36 (36) In order to prevent children falling victim to online abuse, providers for which there is evidence that their service is routinely or systematically used for the purpose of online child sexual abuse in line with article 3, should provide reasonable assistance, by putting in place alert and alarm mechanisms in a prominent way on their platforms. The alert mechanism could consist of, for example, linking potential victims to the local services such as helplines, victims` rights and support organisations or hotlines. They should ensure adequate follow-up, when a report or alert is made, in the language chosen by the user for using their service. Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the
Amendment 116 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. The holders of parental responsibility for the victims or the legal guardians of the victims should have equal legal standing to exercise victims' rights when the victim is not able to do so, due to age or other limitations. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. Such assistance should be tailored to the specific vulnerabilities of the victims, such as age or disability, in a gender-sensitive way. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities.
Amendment 117 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities, taking into account vulnerabilities and psychological effects on victims.
Amendment 118 #
Proposal for a regulation Recital 37 (37) To ensure the efficient management of such victim support functions, victims should be allowed to contact and rely on the Coordinating Authority that is most accessible to them, which should channel all communications between victims and the EU Centre. Coordinating Authorities should provide gender- and age-sensitive support to victims, as well as psychological support. Under no circumstances should victims be blamed for what has happened to them.
Amendment 119 #
Proposal for a regulation Recital 37 a (new) (37a) Member States should ensure and safeguard the existence of effective mechanisms for reporting child sexual abuse and that such investigative tools are effectively used to identify victims and rescue them as quickly as possible from ongoing abuse.
Amendment 120 #
Proposal for a regulation Recital 44 (44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to the application of this Regulation,
Amendment 121 #
Proposal for a regulation Recital 50 Amendment 122 #
Proposal for a regulation Recital 52 (52) To ensure effective enforcement and the safeguarding of users’ rights under this Regulation, it is appropriate to facilitate the lodging of complaints about alleged non-compliance with obligations o
Amendment 123 #
Proposal for a regulation Recital 55 (55) It is essential for the proper functioning of
Amendment 124 #
Proposal for a regulation Recital 55 a (new) (55a) All communications containing illegal material should be encrypted to state-of-the-art standards; all access by staff to such content should be limited to what is necessary and thoroughly logged. All such logs should be stored for a minimum of ten years.
Amendment 125 #
Proposal for a regulation Recital 59 (59) To support the implementation of this Regulation and contribute to the achievement of its objectives, the EU
Amendment 126 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in
Amendment 127 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the
Amendment 128 #
Proposal for a regulation Recital 61 (61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material)
Amendment 129 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of hosting services for the dissemination of known child sexual abuse material that is publicly accessible, using the corresponding indicators. Where it identifies such material after having conducted such a search, the EU Centre should also be able to request the provider of the hosting service concerned to remove or disable access to the item or items in question, given that the provider may not be aware of their presence and may be willing to do so on a voluntary basis. The EU Centre must be able to work in collaboration with, and refer child victims to, relevant competent authorities and support services, such as victim protection centres, women’s shelters, children’s specialised services, social services, children’s rights organisations and family associations, as well as healthcare professionals in the Member States.
Amendment 130 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of hosting services for the dissemination of known child sexual abuse material that is publicly accessible, using the corresponding indicators. Where it identifies such material after having conducted such a search, the EU Centre should also be able to request the provider of the hosting service concerned to remove or disable access to the item or items in question, given that the provider may not be aware of their presence and may be willing to do so on a voluntary basis. The EU Centre should support Member States in conducting studies, with nationally representative samples, on child sexual abuse in their socialisation spaces, in order to structure preventive and multidisciplinary response measures.
Amendment 131 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as
Amendment 132 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and
Amendment 133 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned. The EU centre shall also provide knowledge, expertise and best practice on preventive measures targeted at abusers.
Amendment 134 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Child helplines are equally in the frontline in the fight against online child sexual abuse. Therefore, the EU Centre should also recognise the work of child helplines in victim response, and the existing referral mechanisms between child helplines and hotlines. The EU Centre should coordinate services for victims.
Amendment 135 #
Proposal for a regulation Recital 71 Amendment 136 #
Proposal for a regulation Recital 74 a (new) (74a) In view of the need for a more effective EU Centre it is necessary to establish a Survivors' Advisory Board. Through the structured involvement of victims and former victims of sexualised violence and experts on this matter, the EU Centre should serve as a platform to offer holistic support for the fight against child sexual abuse in all Member States. The Survivors' Advisory Board may support the EU Centre's activities to facilitate cross-border cooperation for existing national networks and the exchange of best practice. It may also raise awareness of child sexual abuse by serving as a knowledge platform through the coordination, collection and synthesis of research.
Amendment 137 #
Proposal for a regulation Recital 74 a (new) (74a) Given the purpose of this regulation, namely to combat and prevent child sexual abuse, the EU Centre should have a Children’s Rights and Survivors Advisory Board composed of experts, including specialist child psychiatrists and representatives of family associations, with an advisory function relating to children’s rights and the victims’ and survivors’ perspective. The Children’s Rights and Survivors Advisory Board may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate.
Amendment 138 #
Proposal for a regulation Recital 74 a (new) (74a) The Victims' Consultative Forum should be the EU Centre's advisory body and support its work. Its principal function should be to provide independent advice through expert knowledge, deriving from victims of sexual abuse online and taking into account the views of the children that will be consulted as well, in a child-friendly and child-sensitive manner on relevant issues.
Amendment 139 #
Proposal for a regulation Recital 75 (75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse data disaggregated by gender, age and social, cultural and economic background as well as information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and other relevant national authorities of the Member State that designated the Coordinating Authority in question in gathering that information.
Amendment 140 #
Proposal for a regulation Recital 75 (75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect gender-disaggregated and age-specific data, record and analyse information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and
Amendment 141 #
Proposal for a regulation Recital 77 (77) The evaluation should be based on the criteria of efficiency, necessity, effectiveness, proportionality, relevance, coherence and Union added value. It should assess the functioning of the different operational and technical measures provided for by this Regulation, including the effectiveness of measures to enhance the detection, reporting and removal of online child sexual abuse, the effectiveness of safeguard mechanisms as well as the impacts on potentially affected fundamental rights, children's rights, the freedom to conduct a business, the right to private life and the protection of personal data. The Commission should also assess the impact on potentially affected interests of third parties.
Amendment 142 #
Proposal for a regulation Recital 84 a (new) (84a) Recommends that companies operating social platforms and their security systems place greater emphasis on regulating the registration of children and minors on social media platforms, focusing particularly on the needs of people living in poverty, Roma and other minorities to combat differences in digital literacy and reduce the volume of violence in the online space.
Amendment 143 #
Proposal for a regulation Recital 84 b (new) (84b) Recommends that the EU Centre should develop specific action plans in the field of digital education, focusing on children facing disadvantages and multiple disadvantages, and specifically targeting solutions to the digital divide.
Amendment 144 #
Proposal for a regulation Article 2 – paragraph 1 – point b a (new) (ba) 'safety assistant' means a tool integrated into interpersonal communications services either voluntarily or following a preventative detection order, and active only for child users of the service, which assists children in learning about, identifying and avoiding risks online, including but not limited to self-generated abuse material and solicitation;
Amendment 145 #
Proposal for a regulation Article 2 – paragraph 1 – point j (j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 1
Amendment 146 #
Proposal for a regulation Article 2 – paragraph 1 – point j (j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 1
Amendment 147 #
- ‘victim’ means the child or person having suffered harm after being subjected to ‘child sexual abuse material’ or ‘solicitation of children’ or ‘online sexual abuse’ or ‘child sexual abuse offences’;
Amendment 148 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of
Amendment 149 #
Proposal for a regulation Article 3 – paragraph 1 a (new) Amendment 150 #
Proposal for a regulation Article 3 – paragraph 2 – point a Amendment 151 #
Proposal for a regulation Article 3 – paragraph 2 – point a a (new) (aa) any actual or foreseeable negative effects for the exercise of fundamental rights or possible breaches of EU law
Amendment 152 #
Proposal for a regulation Article 3 – paragraph 2 – point a b (new) (ab) the protection of end-to-end encryption if applicable to the service
Amendment 153 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 1 Amendment 154 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 2 Amendment 155 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 – functionalities enabling
Amendment 156 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 – functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily accessible and age-appropriate and that respect users’ privacy;
Amendment 157 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) - the integration of tools such as safety assistants to prevent child sexual abuse online;
Amendment 158 #
Proposal for a regulation Article 3 – paragraph 2 – point c Amendment 159 #
Proposal for a regulation Article 3 – paragraph 2 – point d (d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, the design of their recommender systems and any other relevant algorithmic system and the impact thereof on that risk;
Amendment 160 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point i Amendment 161 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii Amendment 162 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii (ii) where the service is used by children, the different age groups of the child users and the risk of solicitation of children in relation to those age groups, as well as the risk of adults using the service for the purpose of solicitation of children;
Amendment 163 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 1 – enabling users to publicly search for other
Amendment 164 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2 – enabling users to
Amendment 165 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 Amendment 166 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii a (new) Amendment 167 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 The provider may request the EU Centre to perform an analysis of
Amendment 168 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise
Amendment 169 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 3 Amendment 170 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – point a (a) for a service which is subject to a
Amendment 171 #
Proposal for a regulation Article 3 – paragraph 6 6. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments, trends reported by authorities, civil society organisations and victim support organisations, and to the manners in which the services covered by those provisions are offered and used.
Amendment 172 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of
Amendment 173 #
Proposal for a regulation Article 4 – paragraph 1 – point a (a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision- making processes, the operation or functionalities of the service
Amendment 174 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (aa) providing technical measures and tools that allow users, and in particular children, to manage their own privacy, visibility, reachability and safety , and that are set to the most secure levels by default;
Amendment 175 #
Proposal for a regulation Article 4 – paragraph 1 – point a b (new) (ab) informing users, keeping in mind children’s needs, about external resources and services in the user’s region on preventing child sexual abuse, counselling by help-lines, information on victim support and educational resources provided by hotlines and child protection organisations;
Amendment 176 #
Proposal for a regulation Article 4 – paragraph 1 – point a c (new) (ac) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line;
Amendment 177 #
Proposal for a regulation Article 4 – paragraph 1 – point a d (new) (ad) automatic mechanisms and interface design elements to inform users about external preventive intervention programmes without prejudice to the prohibition of profiling under Article 22 GDPR and the processing of sensitive data under Article 9 GDPR
Amendment 178 #
Proposal for a regulation Article 4 – paragraph 1 – point b (b)
Amendment 179 #
Proposal for a regulation Article 4 – paragraph 1 – point c (c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations, hotlines or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] .
Amendment 180 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) Amendment 181 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) (ca) without breaking, weakening, circumventing or otherwise undermining end-to-end encryption in the sense of people’s right to confidential communications;
Amendment 182 #
Proposal for a regulation Article 4 – paragraph 2 – point a (a) effective and proportionate in mitigating the identified serious risk;
Amendment 183 #
Proposal for a regulation Article 4 – paragraph 2 – point a a (new) (aa) subject to an implementation plan with clear objectives and methodologies for identifying and quantifying impacts on the identified serious risk and on the exercise of the fundamental rights of all affected parties. The implementation plan shall be reviewed every six months.
Amendment 184 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk, specific vulnerabilities of children online and offline, including age, gender and disability, as well as the provider’s financial and technological capabilities and the number of users;
Amendment 185 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk , any impact on the functionality of the service as well as the provider’s financial and
Amendment 186 #
Proposal for a regulation Article 4 – paragraph 2 – point c (c) applied in a diligent and non- discriminatory manner,
Amendment 187 #
Proposal for a regulation Article 4 – paragraph 2 – point d a (new) (da) only introduced following an assessment of the risks the mitigating measures themselves pose for users, in particular if these risks would disproportionately negatively affect persons on the basis of sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender or sexual orientation;
Amendment 188 #
Proposal for a regulation Article 4 – paragraph 2 – point d b (new) (db) developed in cooperation with children who use the service;
Amendment 189 #
Proposal for a regulation Article 4 – paragraph 3
Amendment 190 #
Proposal for a regulation Article 4 – paragraph 3 a (new) 3a. Any requirement to take specific measures shall be without prejudice to Article 8 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] and shall entail neither a general obligation for hosting services providers to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.
Amendment 191 #
Proposal for a regulation Article 4 – paragraph 4 4.
Amendment 192 #
Proposal for a regulation Article 4 – paragraph 4 a (new) 4a. Specific measures for platforms primarily used for the dissemination of pornographic content
Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure:
a. user-friendly reporting mechanisms to report alleged child sexual abuse material;
b. adequate professional human content moderation to rapidly process notices of alleged child sexual abuse material;
c. automatic mechanisms and interface design elements to inform users about external preventive intervention programmes in the user’s region.
Amendment 193 #
Proposal for a regulation Article 4 – paragraph 4 b (new)
Amendment 194 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments, trends and evidence reported by law enforcement, hotlines, civil society organisations, EIGE and technology companies, in combating child sexual abuse online and in the manners in which the services covered by those provisions are offered and used.
Amendment 195 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation
Amendment 196 #
Proposal for a regulation Article 4 – paragraph 5 a (new) 5a. To complement the risk mitigation measures taken by the providers, gender- sensitive and child-friendly education and prevention measures shall be implemented.
Amendment 197 #
Proposal for a regulation Article 6
Amendment 198 #
Proposal for a regulation Article 6 – paragraph 1 – point a (a) make reasonable efforts to
Amendment 199 #
Proposal for a regulation Article 6 – paragraph 1 – point b
Amendment 200 #
Proposal for a regulation Article 6 – paragraph 1 – point b (b) take reasonable measures to prevent child users from accessing the software applications not intended for their use or adapted to their safety needs in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
Amendment 201 #
Proposal for a regulation Article 6 – paragraph 1 – point c
Amendment 202 #
Proposal for a regulation Article 6 – paragraph 1 a (new) 1a. Security of communications and services
Nothing in this regulation shall be construed as requiring or encouraging the prohibition, restriction, circumvention or undermining of the provision or the use of encrypted services.
Amendment 203 #
Proposal for a regulation Article 6 – paragraph 2
Amendment 204 #
Proposal for a regulation Article 6 – paragraph 3
Amendment 205 #
Proposal for a regulation Article 6 – paragraph 4
Amendment 206 #
Proposal for a regulation Article 6 – paragraph 4 4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments, trends and evidence reported by law enforcement, hotlines, civil society organisations, EIGE and technology companies, in combating child sexual abuse online, and to the manners in which the services covered by those provisions are offered and used.
Amendment 207 #
Proposal for a regulation Article 6 a (new) Article 6a
Obligations concerning age verification and for software application stores
1. Providers of software application stores considered as gatekeepers under the Digital Markets Act (EU) 2022/1925 shall:
(a) indicate if applications contain features that could pose a risk to children;
(b) indicate if measures have been taken to mitigate risks for children, and which measures have been taken;
(c) provide guidance for parents on how to discuss risks with their children;
(d) provide application developers with an open-source software library that enables age verification requests from inside applications both to European Digital Identity Wallets and third-party services;
(e) provide, free of charge, an age-verification service that can respond to age verification requests from inside applications.
2. Providers of European Digital Identity Wallets under the Regulation (EU) No XXX/2023 establishing a framework for a European Digital Identity shall ensure European Digital Identity Wallets can respond to age verification requests from applications without revealing the identity of the user.
3. Third-party age verification services used to fulfil the obligations of this article shall:
(a) only retain user personal data for the purpose of fulfilling future requests, and with the explicit consent of the user;
(b) only retain data vital to process future verification requests, namely:
i. a pseudonymous means of authenticating the user; and
ii. the user’s previously verified date of birth;
(c) only use this data for the purpose of age verification;
(d) fulfil requests for the deletion of this data pursuant to the GDPR.
4. Where developers of applications have identified a significant risk of use of the service concerned for the purpose of the solicitation of children, they shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to put in place safeguards, namely:
(a) take reasonable measures to mitigate the risk, such as adapting the services to children, integrating a safety assistant or modifying or adding safeguards limiting access to certain features;
(b) provide children with guidance on risks that will help them identify dangers and make more informed decisions;
(c) where the application is manifestly unsuitable for children and cannot be adapted, prevent access.
5. Age verification mechanisms set out in this article shall not be used for the purposes of enabling or facilitating parental control technologies that give access to children’s private communications without their consent.
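The data-minimisation rules for third-party age verification services in the article above (retain only a pseudonymous authenticator and the verified date of birth, answer age requests without revealing identity, delete on request) can be illustrated with a minimal sketch. All class and method names here are hypothetical illustrations, not part of the proposal or of any real wallet API; a keyed hash stands in for whatever pseudonymisation scheme a real service would use.

```python
import hashlib
import hmac
import os
from datetime import date

class AgeVerificationService:
    """Illustrative sketch of the retention rules in the proposed Article 6a(3):
    the service keeps only (i) a pseudonymous means of authenticating the user
    and (ii) the previously verified date of birth, and uses them solely to
    answer age-threshold requests."""

    def __init__(self):
        self._secret = os.urandom(32)  # server-side pseudonymisation key
        self._records = {}             # pseudonym -> verified date of birth

    def _pseudonym(self, user_token: str) -> str:
        # Keyed hash: the stored identifier cannot be linked back to the
        # user's identity without the service's secret key.
        return hmac.new(self._secret, user_token.encode(), hashlib.sha256).hexdigest()

    def enroll(self, user_token: str, verified_dob: date, consent: bool) -> None:
        # Retention only with the user's explicit consent (point (a)).
        if not consent:
            return
        self._records[self._pseudonym(user_token)] = verified_dob

    def is_at_least(self, user_token: str, years: int, today: date):
        # Answers only "is the user at least N years old?" -- neither the
        # identity nor the exact date of birth is revealed to the requester.
        dob = self._records.get(self._pseudonym(user_token))
        if dob is None:
            return None  # no retained record; a fresh verification is needed
        age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
        return age >= years

    def delete(self, user_token: str) -> None:
        # Deletion on request, pursuant to the GDPR (point (d)).
        self._records.pop(self._pseudonym(user_token), None)
```

An application would receive only the boolean answer, consistent with paragraph 2's requirement that requests be answered "without revealing the identity of the user".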
Amendment 208 #
Proposal for a regulation Chapter II – Section 2 – title 2
Amendment 210 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent independent judicial authority of the Member State that designated it
Amendment 211 #
Proposal for a regulation Article 7 – paragraph 1 a (new) 1a. The Coordinating Authority of establishment shall choose one of the following types of detection order:
(a) proactive detection orders, which detect and report known child sexual abuse material under the measures specified in Article 10;
(b) preventative detection orders, which detect solicitation and attempts by children to share self-generated abuse material, and assist them in avoiding risks, under the measures specified in Article 10;
Amendment 212 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 The Coordinating Authority of establishment shall, before requesting the issuance of a
Amendment 213 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 2 To that end, it may, where appropriate,
Amendment 214 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point a (a) establish a draft request for the issuance of a
Amendment 215 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point c
Amendment 216 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – introductory part Where, having regard to the
Amendment 217 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a (a) draft an implementation plan setting out the incident that the authority intends to investigate, the measures it envisages taking to execute the intended
Amendment 218 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b)
Amendment 219 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b) where the draft implementation plan concerns an intended detection order concerning the solicitation of children other than the renewal of a previously issued detection order without any substantive changes, conduct
Amendment 220 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the
Amendment 221 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the data protection impact assessment and in order to
Amendment 222 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point d (d) submit to that Coordinating Authority the implementation plan, where applicable attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted
Amendment 223 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and taking utmost account of the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have met, it shall submit the request for the validation and issuance of the
Amendment 224 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part
Amendment 225 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a (a) there is evidence of a s
Amendment 226 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b (b) the reasons for issuing the
Amendment 227 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 a (new) (c) nothing in the investigation order can be construed as requiring or encouraging the provider to weaken, break, circumvent or otherwise undermine or limit the encryption, security, or other means of protecting the confidentiality of communications, of the platform or service of the provider as a whole.
Amendment 228 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2
Amendment 229 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point a
Amendment 230 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point b
Amendment 231 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point c
Amendment 232 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point d
Amendment 233 #
Proposal for a regulation Article 7 – paragraph 5 – introductory part 5. As regards
Amendment 234 #
Proposal for a regulation Article 7 – paragraph 5 – point a (a) it is likely, despite any mitigation measures that the provider may have taken or will take, that the service is
Amendment 235 #
Proposal for a regulation Article 7 – paragraph 5 – point b (b) there is evidence of the service
Amendment 236 #
Proposal for a regulation Article 7 – paragraph 6
Amendment 237 #
Proposal for a regulation Article 7 – paragraph 6 – introductory part 6. As regards
Amendment 238 #
Proposal for a regulation Article 7 – paragraph 6 – point a (a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used,
Amendment 239 #
Proposal for a regulation Article 7 – paragraph 6 – point b (b) there is evidence of the service,
Amendment 240 #
Proposal for a regulation Article 7 – paragraph 6 – point c – point 1 (1) a
Amendment 241 #
Amendment 242 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – introductory part As regards preventative detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met:
Amendment 243 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – introductory part As regards
Amendment 244 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – point a (a) the provider qualifies as a provider of publicly available number-independent interpersonal communication services;
Amendment 245 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – point b (b) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used,
Amendment 246 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – point c (c) there is evidence of the service,
Amendment 247 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 2
Amendment 248 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the judicial validation and the issuance of
Amendment 249 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 2 To that
Amendment 250 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point a (a) where th
Amendment 251 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point b (b) where necessary, in particular to limit such negative consequences, effective and proportionate safeguards additional to those listed in Article 10(4), (5)
Amendment 252 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 a (new) (d) nothing in the investigation order can be construed as requiring or encouraging the provider to weaken, break, circumvent or otherwise undermine or limit the encryption, security, or other means of protecting the confidentiality of communications, of the platform or service of the provider.
Amendment 253 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority
Amendment 254 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 2 The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the
Amendment 255 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 The period of application of
Amendment 256 #
Proposal for a regulation Article 8 – title 8 Additional rules regarding
Amendment 257 #
Proposal for a regulation Article 8 – title 8 Additional rules regarding
Amendment 258 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 259 #
Proposal for a regulation Article 8 – paragraph 1 – point a (a) information regarding the measures to be taken to execute the
Amendment 260 #
Proposal for a regulation Article 8 – paragraph 1 – point b (b) identification details of the competent judicial authority
Amendment 261 #
Proposal for a regulation Article 8 – paragraph 1 – point d (d) the specific service in respect of which the
Amendment 262 #
Proposal for a regulation Article 8 – paragraph 1 – point d a (new) (da) the type of detection order;
Amendment 263 #
Proposal for a regulation Article 8 – paragraph 1 – point e (e) whether the
Amendment 264 #
Proposal for a regulation Article 8 – paragraph 1 – point f (f) the start date and the end date of the
Amendment 265 #
Proposal for a regulation Article 8 – paragraph 1 – point g (g) a
Amendment 266 #
Proposal for a regulation Article 8 – paragraph 1 – point h (h) the factual and legal grounds justifying the issuing of the order, and a reference to this Regulation as the legal basis for the
Amendment 267 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 268 #
Proposal for a regulation Article 8 – paragraph 1 – point j (j) easily understandable information about the redress available to the addressee of the
Amendment 269 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 270 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 2 The
Amendment 271 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 3 The
Amendment 272 #
Proposal for a regulation Article 8 – paragraph 3 3. If the provider cannot execute the
Amendment 273 #
Proposal for a regulation Article 8 – paragraph 4 a (new)
Amendment 274 #
Proposal for a regulation Article 8 – paragraph 4 b (new) 4b. Notification mechanism
1. Providers of hosting services and providers of interpersonal communication services shall establish mechanisms that allow users to notify them of the presence on their service of specific items or activities that the user considers to be potential child sexual abuse material, in particular previously unknown child sexual abuse material and solicitation of children. Those mechanisms shall be easy to access, user-friendly and child-friendly, and shall allow for the submission of notices exclusively by electronic means.
2. Where the notice contains the electronic contact information of the user who submitted it, the provider shall without undue delay send a confirmation of receipt to the user.
3. Providers shall ensure that such notices are processed without undue delay.
Amendment 275 #
Proposal for a regulation Article 9 – title 9 Redress, information, reporting and modification of
Amendment 276 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services that have received a
Amendment 277 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the
Amendment 278 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 2 For the purpose of the first subparagraph, a
Amendment 279 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 1 Where the period of application of the
Amendment 280 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 2 Those reports shall include a detailed description of the measures taken to execute the
Amendment 281 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the
Amendment 282 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 283 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of interpersonal communication services that are not end-to-end encrypted, and that have received a proactive detection order shall execute it by installing and operating technologies to detect the dissemination of known
Amendment 284 #
Proposal for a regulation Article 10 – paragraph 1 a (new) Amendment 285 #
Proposal for a regulation Article 10 – paragraph 1 b (new) 1b. Technologies used in preventative detection orders to detect grooming shall only report detection in cases where the potential victim, trusted adult or moderator explicitly chooses to. Where end-to-end encryption is used, the detection should be done entirely on the user’s device.
Amendment 286 #
Proposal for a regulation Article 10 – paragraph 1 c (new) 1c. Technologies used in preventative detection orders to detect when children attempt to use their services to send intimate images shall not report these users in any way. Where end-to-end encryption is used, the detection should be done entirely on the user’s device.
Amendment 287 #
Proposal for a regulation Article 10 – paragraph 1 d (new) 1d. The Coordinating Authority shall be empowered to request that services take further preventative measures, so long as those measures do not involve reporting, and only after approval by the relevant Data Protection Authority.
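The amendments above require that, for end-to-end encrypted services, any detection of known material runs entirely on the user's device and that nothing is reported unless the user explicitly chooses to. A minimal sketch of that client-side flow follows. It is illustrative only: real deployments use perceptual hashes robust to re-encoding (the proposal does not mandate a specific technology), whereas this sketch uses an exact SHA-256 match purely for simplicity, and all names and the sample hash list are hypothetical.

```python
import hashlib

# Hash list distributed to the device: digests only, never the material
# itself. The single entry is sha256(b"test"), used as a stand-in.
KNOWN_MATERIAL_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_material(payload: bytes) -> bool:
    """Runs locally, before encryption; the plaintext never leaves the device."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_MATERIAL_HASHES

def handle_outgoing(payload: bytes, user_opts_in_to_report: bool) -> str:
    """Decide what the client does with an outgoing item."""
    if not matches_known_material(payload):
        return "send"
    # A match only ever triggers a local warning; reporting requires an
    # explicit choice by the user, as paragraph 1b would require.
    return "warn-and-report" if user_opts_in_to_report else "warn-only"
```

The design choice the amendments encode is visible in the last branch: the match result alone never leaves the device; the user's explicit opt-in is the only path to a report.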
Amendment 288 #
Proposal for a regulation Article 10 – paragraph 4 – point d (d) establish and operate an accessible, age-appropriate, gender-sensitive, and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
Amendment 289 #
Proposal for a regulation Article 11 – paragraph 1 The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of Articles 7 to 10, having due regard in particular to relevant technological developments, trends and evidence reported by law enforcement, hotlines, civil society organisations, EIGE and technology companies, in combating child sexual abuse online, and the manners in which the services covered by those provisions are offered and used.
Amendment 290 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of
Amendment 291 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 1 Where the provider submits a report pursuant to paragraph 1, it shall
Amendment 292 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 2 The provider shall
Amendment 293 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 3
Amendment 294 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible
Amendment 295 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible
Amendment 296 #
Proposal for a regulation Article 12 – paragraph 3 a (new) 3a. New possible child sexual abuse material reported by a user shall immediately be assessed to determine the probability that the material represents a risk or harm to a child. If the potential online child sexual abuse on the service is flagged by a user known to be a child, the provider shall provide the child with essential information on online safety and specialist child support services, such as helplines and hotlines, in addition to the reporting of the material.
Amendment 297 #
Proposal for a regulation Article 13 – paragraph 1 – introductory part 1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 298 #
Proposal for a regulation Article 13 – paragraph 1 – point c (c)
Amendment 299 #
Proposal for a regulation Article 13 – paragraph 1 – point d (d) a list of all available data other than content data related to the potential online child sexual abuse preserved in line with the preservation order in Article 8a;
Amendment 300 #
Proposal for a regulation Article 13 – paragraph 1 – point d a (new) (da) a list of all traffic data and metadata retained in relation to the potential online child sexual abuse, which could be made available to law enforcement authorities, together with information concerning default storage periods.
Amendment 301 #
Proposal for a regulation Article 13 – paragraph 1 – point f Amendment 302 #
Proposal for a regulation Article 13 – paragraph 1 – point g Amendment 303 #
Proposal for a regulation Article 13 – paragraph 1 – point i (i) where the
Amendment 304 #
Proposal for a regulation Article 13 – paragraph 1 – point j (j) an indication whether the provider considers that the report requires urgent action;
Amendment 305 #
Proposal for a regulation Article 14 – paragraph 1 1. The Coordinating Authority of
Amendment 306 #
Proposal for a regulation Article 14 – paragraph 1 a (new) 1a. Before requesting a removal order, the Coordinating Authority of establishment and competent judicial authority shall take all reasonable steps to ensure that implementing the order will not interfere with activities for the investigation and prosecution of child sexual abuse offences.
|