Progress: Awaiting Parliament's position in 1st reading
Role | Committee | Rapporteur | Shadows
---|---|---|---
Lead | LIBE | |
Former Responsible Committee | LIBE | ZARZALEJOS Javier (EPP) |
Committee Opinion | BUDG | |
Committee Opinion | IMCO | |
Committee Opinion | CULT | |
Committee Opinion | FEMM | |
Former Committee Opinion | IMCO | AGIUS SALIBA Alex (S&D) | Adam BIELAN (ECR), Kateřina KONEČNÁ (GUE/NGL), Marcel KOLAJA (Verts/ALE), Jean-Lin LACAPELLE (ID), Marion WALSMANN (PPE), Catharina RINZEMA (RE)
Former Committee Opinion | BUDG | HERBST Niclas (EPP) | Nils TORVALDS (RE), Olivier CHASTEL (RE), Alexandra GEESE (Verts/ALE), Silvia MODIG (GUE/NGL), Bogdan RZOŃCA (ECR), Nils UŠAKOVS (S&D)
Former Committee Opinion | CULT | KIZILYÜREK Niyazi (GUE/NGL) | Asim ADEMOV (PPE), Marcel KOLAJA (Verts/ALE), Lucia ĎURIŠ NICHOLSONOVÁ (RE), Catherine GRISET (ID), Andrey SLABAKOV (ECR), Marcos ROS SEMPERE (S&D)
Former Committee Opinion | FEMM | FRITZON Heléne (S&D) | Jadwiga WIŚNIEWSKA (ECR), Sandra PEREIRA (GUE/NGL), Pierrette HERZBERGER-FOFANA (Verts/ALE), Karen MELCHIOR (RE), Eleni STAVROU (PPE)
Lead committee dossier:
Legal Basis:
RoP 57_o, TFEU 114
Events
The Committee on Civil Liberties, Justice and Home Affairs adopted a report by Javier ZARZALEJOS (EPP, ES) on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse.
The committee responsible recommended that the European Parliament's position adopted at first reading under the ordinary legislative procedure should amend the proposal as follows:
Subject matter and scope
The proposed Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse, in order to contribute to the proper functioning of the internal market and to create a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter are effectively protected. It establishes, inter alia, obligations on providers of online games.
It should not apply to audio communications.
Detection obligations
Concerning detection orders and the detection obligations they entail, Members considered that these should cover not only the dissemination of material previously detected and confirmed as constituting child sexual abuse material (known material), but also material not previously detected that is likely to constitute child sexual abuse material but has not yet been confirmed as such (new material), as well as activities constituting the solicitation of children (grooming).
In the adopted text, Members excluded end-to-end encryption from the scope of the detection orders to guarantee that all users’ communications are secure and confidential. Providers would be able to choose which technologies to use as long as they comply with the strong safeguards foreseen in the law, and subject to an independent, public audit of these technologies.
To stress that detection orders are a mechanism of last resort, Members proposed reinforcing prevention as part of the mitigation measures to be taken by relevant information society services. Mitigation measures may include targeted measures to protect the rights of the child, including safety and security by design for children by default, functionalities enabling age assurance and age scoring, age-appropriate parental control tools, flagging and/or notifying mechanisms, self-reporting functionalities, or participation in codes of conduct for protecting children.
Detection orders should contain information about the right to appeal to a court of law according to the national legislation.
Reporting obligations
Providers of hosting services and providers of number-independent interpersonal communication services should establish and operate an easy-to-access, age-appropriate, child-friendly and user-friendly mechanism that allows any user or entity to flag or notify them of the presence on their service of specific items of information that the user or entity considers to be potential online child sexual abuse, including self-generated material.
EU centre for child protection
Under the amended text, the European Union Agency to prevent and combat child sexual abuse, the EU Centre for child protection, is established. It should gather and share anonymised information, gender- and age-disaggregated statistics, expertise, educational materials and best practices, and facilitate cooperation between relevant public and private parties in connection with the prevention and combating of child sexual abuse, in particular online. It should promote and ensure appropriate support and assistance to victims.
Victims’ Rights and Survivors Consultative Forum
Members proposed to create a Victims’ Rights and Survivors Consultative Forum to ensure that victims’ voices are heard.
Establishment of an online European Child Protection Platform
Members proposed that the EU Centre should create, maintain and operate an online platform for the presentation of information about Member States’ hotlines and helplines (‘Child Protection Platform’). That platform may also be used for the promotion of awareness-raising and prevention campaigns. The platform should be accessible 24 hours a day, seven days a week, in all Union languages, and should be child-friendly, age-appropriate and accessible.
Seat
The choice of the location of the seat of the EU Centre should be made in accordance with the ordinary legislative procedure, based on specific criteria. The Commission had initially proposed the Netherlands.
Review
Within three years from the entry into force of the Regulation, the Commission should submit a report to the European Parliament and to the Council on the necessity and feasibility of including the solicitation of children in the scope of the detection orders, taking into account in particular the reliability and accuracy of the state of the art of detection technologies. Where appropriate, the report should be accompanied by legislative proposals.
PURPOSE: to set out a clear and harmonised legal framework on preventing and combating child sexual abuse.
PROPOSED ACT: Regulation of the European Parliament and of the Council.
ROLE OF THE EUROPEAN PARLIAMENT: the European Parliament decides in accordance with the ordinary legislative procedure and on an equal footing with the Council.
BACKGROUND: information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union, and to protect society at large.
In the absence of harmonised rules at EU level, social media platforms, gaming services and other hosting and online service providers face divergent rules. Certain providers voluntarily use technology to detect, report and remove child sexual abuse material on their services. The measures taken, however, vary widely, and voluntary action has proven insufficient to address the issue.
The protection of children, both offline and online, is a Union priority.
CONTENT: in order to address the abovementioned challenges, the Commission proposed to establish a clear and harmonised legal framework on preventing and combating online child sexual abuse. It seeks to provide legal certainty to providers as to their responsibilities to assess and mitigate risks and, where necessary, to detect, report and remove such abuse on their services in a manner consistent with the fundamental rights laid down in the Charter and with general principles of EU law.
This proposal therefore lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the internal market. It establishes, in particular:
An EU Centre
The proposal seeks to establish the EU Centre on Child Sexual Abuse (EUCSA) as a decentralised agency to enable the implementation of the new Regulation. It aims to help remove obstacles to the internal market, especially in connection to the obligations of providers under this Regulation to detect online child sexual abuse, report it and remove child sexual abuse material. The Centre will create, maintain and operate databases of indicators of online child sexual abuse that providers will be required to use to comply with the detection obligations. These databases should therefore be ready before the Regulation enters into application. To ensure that, the Commission has already made funding available to Member States to help with the preparations of these databases.
Mandatory risk assessment and risk mitigation measures
Providers of hosting or interpersonal communication services will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.
Targeted detection obligations, based on a detection order
Member States will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new child sexual abuse material or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.
Strong safeguards on detection
Companies having received a detection order will only be able to detect content using indicators of child sexual abuse verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.
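For the "known material" limb, indicator-based detection amounts in practice to matching content fingerprints against the verified indicator database supplied by the EU Centre. The sketch below illustrates only the principle, using exact cryptographic hashes and invented sample values (the indicator set and function names are hypothetical); the Regulation prescribes no particular technology, and real deployments rely on perceptual hashing (PhotoDNA-style) to catch re-encoded copies, which is precisely where the false-positive rate mentioned above becomes a design constraint.

```python
import hashlib

# Hypothetical verified indicator list. In the scheme described above it
# would be compiled and vetted by the EU Centre; the value here is a
# placeholder for illustration only.
KNOWN_INDICATORS = {
    hashlib.sha256(b"known-material-sample").hexdigest(),
}

def matches_known_indicator(content: bytes) -> bool:
    """Return True if the content's fingerprint appears in the verified list.

    Exact cryptographic hashing flags only bit-identical copies and thus has
    essentially no false positives; perceptual hashing widens coverage to
    altered copies at the cost of a non-zero error rate.
    """
    return hashlib.sha256(content).hexdigest() in KNOWN_INDICATORS

print(matches_known_indicator(b"known-material-sample"))  # True
print(matches_known_indicator(b"unrelated upload"))       # False
```

The design point the safeguards turn on is visible even in this toy version: the provider never interprets content itself, it only compares fingerprints against a list it does not author, which is what makes independent verification of the indicator database meaningful.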
Clear reporting obligations
The proposal obliges providers that have detected online child sexual abuse to report it to the EU Centre.
Effective removal
National authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.
Reducing exposure to grooming
The rules require software application stores to ensure that children cannot download applications that may expose them to a high risk of solicitation of children.
Solid oversight mechanisms and judicial redress
Detection orders will be issued by courts or independent national authorities. To minimise the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in Court.
Documents
- Committee report tabled for plenary, 1st reading: A9-0364/2023
- Contribution: COM(2022)0209
- Document attached to the procedure: SEC(2022)0209
- Document attached to the procedure: SWD(2022)0209
- Document attached to the procedure: SWD(2022)0210
- Legislative proposal published: COM(2022)0209
Amendments | Dossier
---|---
2798 | 2022/0155(COD)

2022/11/30, CULT: 124 amendments
Amendment 100 #
Proposal for a regulation Article 6 a (new)

Article 6 a
Anonymous public reporting of online child sexual abuse

1. Member States shall take appropriate measures to promote and safeguard the role of formally recognized non-governmental organizations involved in anonymous public reporting of child sexual abuse material and the proactive search for such material.

2. Member States shall ensure that the public always has the possibility to anonymously report child sexual abuse material and child sexual exploitation activities to hotlines specialised in combatting online child sexual abuse material and shall safeguard the role of such hotlines in anonymous public reporting.

3. Member States shall ensure that the hotlines referred to in paragraph 2 operating in their territory are authorised to view, assess and process anonymous reports of child sexual abuse material.

4. Member States shall grant the hotlines referred to in paragraph 2 the authority to issue content removal notices for confirmed instances of child sexual abuse material.

5. Member States shall authorise the hotlines referred to in paragraph 2 to voluntarily conduct pro-active searching for child sexual abuse material online.
Amendment 101 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation or by the report submitted by the recognised hotline, which results in its voluntary and timely removal, of any information indicating potential online child sexual abuse on its services, it shall promptly submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
Amendment 102 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services, hotlines and organisations acting solely in the public interest against child sexual abuse shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with those requirements.
Amendment 103 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with those requirements, with the exception of subsequent non-cooperation with the judicial authorities.
Amendment 104 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where they reside, information regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12.
Amendment 105 #
Proposal for a regulation Article 21 – title Victims’ right of assistance and support
Amendment 106 #
Proposal for a regulation Article 21 – paragraph 1 1. The providers of very large online platforms that have identified the risk of use of their service for the purpose of online child sexual abuse in line with Article 3 shall provide reasonable assistance, on request, to persons residing in the Union that seek to report potential abuse, by putting in place reporting functions in a prominent way on their platform. Such providers shall ensure adequate follow-up, when a report or alert is made, in the language that the user has chosen for their service. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 107 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider, complemented in a timely manner and, if requested and appropriate, also included in the list of indicators used to prevent the further dissemination of these items.
Amendment 108 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide reasonable assistance, on request, in a timely manner, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 109 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 2 In this regard, a special green line with a call centre assistance service will be established, in order for victims and their families to receive support in a timely manner. That Coordinating Authority shall transmit the request to the EU Centre through the system established in accordance with Article 39(2) and shall communicate the results received from the EU Centre to the person making the request.
Amendment 110 #
Proposal for a regulation Article 21 – paragraph 4 a (new) 4 a. Member States shall establish and improve the functioning of child helpline and missing children hotline, including through funding and capacity building, in line with Article 96 of Directive (EU) 2018/1972.
Amendment 111 #
4 b. Member States shall ensure that law enforcement authorities have adequate technical, financial and human resources to carry out their tasks, including for the purpose of identification of victims.
Amendment 112 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 2 The Coordinating Authority shall be responsible for all matters related to application and enforcement of this Regulation in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities. The Coordinating Authority shall also be responsible for the coordination and adaptation of prevention techniques, elaborated by the EU Centre. The Coordinating Authority shall generate recommendations and good practices on improving digital literacy and skills amongst the population through the realization of awareness campaigns on a national level, targeting in particular parents and children on the detection and prevention of child sexual abuse online.
Amendment 113 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 2 The Coordinating Authority shall be responsible for all matters related to the application and enforcement of this Regulation, and to the achievement of the objective of this regulation in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities.
Amendment 114 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 3 The Coordinating Authority shall in any event be responsible for ensuring coordination at national level in respect of those matters, including matters related to prevention and for contributing to the effective, efficient and consistent application and enforcement of this Regulation throughout the Union.
Amendment 115 #
Proposal for a regulation Article 25 – paragraph 3 3.
Amendment 116 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available
Amendment 117 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to coordinate prevention within the Member State and to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available and communicate it to the EU Centre. They shall keep that information updated.
Amendment 118 #
Proposal for a regulation Article 25 – paragraph 7 – point d a (new) (d a) provide knowledge and experience on appropriate prevention techniques on grooming and the detection and dissemination of CSAM online;
Amendment 119 #
Proposal for a regulation Article 25 a (new) Article 25 a Cooperation with partner organisations Where necessary for the performance of its tasks under this Regulation, including the achievement of the objective of this Regulation, and in order to promote the generation and sharing of knowledge in line with Article 43 (6), the Coordinating Authority shall cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations and semi-public organisations and practitioners.
Amendment 120 #
Proposal for a regulation Article 26 – paragraph 2 – point c (c) are free from any undue external influence, whether direct or indirect; it being understood that the membership of the Coordinating Authority in a recognised international network shall not prejudice its independent character;
Amendment 121 #
Proposal for a regulation Article 26 – paragraph 4 4. The Coordinating Authorities shall ensure that
Amendment 122 #
Proposal for a regulation Article 34 – paragraph 2 2. Coordinating Authorities shall also provide child
Amendment 123 #
Proposal for a regulation Article 39 – paragraph 1 1. Coordinating Authorities shall cooperate with each other, with national hotlines and any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre and other relevant Union agencies, including Europol, to facilitate the performance of their respective tasks under this Regulation and ensure its effective, efficient and consistent application and enforcement. Coordinating Authorities shall exchange information and best practices on preventing and combatting grooming and child sexual abuse online.
Amendment 124 #
Proposal for a regulation Article 40 – paragraph 2 2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse and gather and share information, educational materials and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online.
Amendment 125 #
Proposal for a regulation Article 40 – paragraph 2 2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse and gather and share information, good practices and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online.
Amendment 126 #
Proposal for a regulation Article 40 – paragraph 2 a (new) 2 a. The EU Centre shall elaborate appropriate prevention techniques on grooming and child sexual abuse online, based on its knowledge, expertise and achievements, in close cooperation with relevant stakeholders and in line with the Communication of the Commission of 11 May “A Digital Decade for children and youth: the new European strategy for a better internet for kids” (BIK+).
Amendment 127 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – introductory part (6) facilitate the generation and sharing of knowledge with other Union institutions, bodies, offices and agencies, organisations acting in the public interest against child sexual abuse and hotlines, Coordinating Authorities or other relevant authorities of the Member States to contribute to the achievement of the objective of this Regulation, by:
Amendment 128 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a (a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51, including in view of updating guidelines on prevention and mitigation methods for combatting child sexual abuse, especially for the digital dimension as per new technological developments;
Amendment 129 #
(a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51 , including education and awareness raising programmes, and intervention programmes;
Amendment 130 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a a (new) (a a) gathering information about awareness and prevention campaigns carried out in the different Member States, as well as good practices carried out by public and private bodies, stakeholders and education systems and centres;
Amendment 131 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of research and expertise on
Amendment 132 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of research, educational materials and expertise on those matters and on assistance to victims, including by serving as a hub of expertise to support evidence-based policy;
Amendment 133 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (b a) contribute to the implementation of awareness campaigns as per the potential risks posed by the online environment to children, in order to equip them with adequate skills for detecting potential grooming and deceit, to ensure safe use of the internet by children and to better implement the prevention component of online child sexual abuse;
Amendment 134 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (b a) promoting age-differentiated awareness-raising campaigns in schools and information campaigns for parents, teachers and pupils;
Amendment 135 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new) (b b) assisting with expertise and knowledge in the development and implementation of teacher training across the Union, in order to equip teachers with the necessary skills for guiding children on safely using information society services and detecting potentially malicious behaviour online;
Amendment 136 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new)

Amendment 137 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 a (new) (6 a) supporting and promoting the regular exchange of best practices and lessons learned among Member States on raising awareness for the prevention of child sexual abuse, prevention programmes and non-formal and formal education on the risks of sexual abuse in the digital environment;
Amendment 138 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 b (new) (6 b) provide assistance with training on prevention of child sexual abuse online for officials from Member States;
Amendment 139 #
8 a. Where the EU Centre receives a report from a hotline, or where a provider that submitted the report to the EU Centre has indicated that the report is based on the information received from a hotline, the EU Centre shall refrain from forwarding the report to the competent law enforcement authority or authorities to avoid duplicated reporting on the same material that has already been reported to the national law enforcement by the hotlines, and shall monitor the removal of the child sexual abuse material or cooperate with the relevant hotline to track the status.
Amendment 140 #
Proposal for a regulation Article 50 – paragraph 3 3. Where necessary for the performance of its tasks under this Regulation, the EU Centre shall carry out, participate in or encourage research, surveys and studies, either on its own initiative or, where appropriate and compatible with its priorities and its annual work programme, at the request of the European Parliament, the Council or the Commission. The collected knowledge (resulting from research, surveys and studies) shall serve as a tool to elaborate prevention techniques on child sexual abuse online to be adapted and implemented by Coordinating Authorities in each Member State.
Amendment 141 #
Proposal for a regulation Article 50 – paragraph 3 3.
Amendment 142 #
Proposal for a regulation Article 50 – paragraph 4 4. The EU Centre shall provide the information referred to in paragraph 2 and the information resulting from the research, surveys and studies referred to in paragraph 3, including its analysis thereof, and its opinions on matters related to the prevention and combating of online child sexual abuse to other Union institutions, bodies, offices and agencies, Coordinating Authorities, Hotlines, other competent authorities
Amendment 143 #
Proposal for a regulation Article 50 – paragraph 5 5. The EU Centre shall develop prevention techniques on the detection of suspicious content and behaviour online and shall communicate them to the Coordinating Authorities of each Member State, so that they can adapt and initiate measures to improve digital literacy and raise awareness amongst parents and educators of the existing digital tools to ensure a safe digital environment for children. The EU Centre shall also establish a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness
Amendment 144 #
Proposal for a regulation Article 50 – paragraph 5 a (new) 5 a. The EU Centre should develop ambitious campaigns tailored for all age ranges, taking into account that they should reach out to young children, adolescents, parents, teachers and society at large. They should also take into account people with disabilities, who may be more vulnerable as they may not have full access to this information.
Amendment 145 #
Proposal for a regulation Article 54 – paragraph 1 1. Where necessary for the performance of its tasks under this Regulation, the EU Centre
Amendment 146 #
Proposal for a regulation Article 54 – paragraph 2 2. The EU Centre may conclude
Amendment 147 #
Proposal for a regulation Article 83 a (new) Article 83 a Data collection on prevention programmes Member States shall report on the anticipated number of children in primary education who have been informed through the awareness campaigns and through the education programmes about the risks of all forms of sexual exploitation of children, including in the online environment.
Amendment 148 #
Proposal for a regulation Article 85 – paragraph 1 1. By [
Amendment 149 #
Proposal for a regulation Article 85 – paragraph 2 2. By [
Amendment 26 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Digital services have become an irreplaceable tool for today’s children, as information, elements of formal education, social contact and entertainment are increasingly online; whereas digital services can also expose children to risks such as unsuitable content, grooming, and child sexual abuse. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), and to protect society at large. Users of such services offered in the Union should be able to trust that the services concerned can be used safely, especially by children. In order to ensure a safer online experience for children and prevent the above-mentioned offences, digital literacy should be recognised as a mandatory skill by Member States and should be included in the school curriculum across the EU.
Amendment 27 #
(1 a) The role of prevention should be emphasised by equipping children, parents and caregivers with the necessary instruments in order to develop situational awareness of the online environment, evaluate potential risks and support children in being safe online. In this regard, education facilities should play a greater role, which is why civic education classes should also provide for the attainment of safe internet skills for children.
Amendment 28 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by appropriate prevention techniques, improving digital literacy, and ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to
Amendment 29 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services. To this end, fundamental importance should be attached to ensuring the necessary funding for European programmes and projects which aim to improve digital skills and awareness of the risks linked to the digital world, such as “Media literacy for all”.
Amendment 30 #
Proposal for a regulation Recital 2 a (new) (2 a) For the purposes of this regulation, “digital skills” should be understood as skills relating to the web as a whole, consisting of both easily accessible surface web platforms and platforms accessible through the deep and dark web. The EU must therefore provide for effective awareness of the dangers also lurking in the deep and dark web.
Amendment 31 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, have a direct negative effect on the internal market and lead to a fragmentation in the Union’s approach towards this phenomenon. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market,
Amendment 32 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements and appropriate prevention techniques should be laid down at Union level.
Amendment 33 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should
Amendment 34 #
Proposal for a regulation Recital 4 a (new) (4 a) To ensure the full application of the objectives of this Regulation, Member States shall implement prevention strategies and awareness campaigns in their school curricula and within educational institutions. Taking into account the data collected by the EU Centre, Coordinating Authorities, relevant law enforcement agencies and existing hotlines across the EU, Member States should elaborate prevention techniques improving digital literacy, by educating children on how to safely surf online and how to recognise the signs of cyber grooming. Prevention techniques and awareness campaigns should also target parents. Parents and caregivers shall be informed of the existence and the functioning of digital tools to limit and direct their child’s/children’s experience online and limit access to age-inappropriate or harmful content online.
Amendment 35 #
Proposal for a regulation Recital 6 (6) Online child sexual abuse frequently involves the misuse of information society services offered in the Union by providers established in third countries.
Amendment 36 #
Proposal for a regulation Recital 11 (11) A substantial connection to the Union should be considered to exist where the relevant information society services has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards
Amendment 37 #
Proposal for a regulation Recital 11 a (new) (11 a) The UN Study on Violence against Children defines "child sexual abuse" as any type of sexual activity inflicted on children, especially by someone who is responsible for them, or who has power or control over them, and whom they should be able to trust. Sexual violence against children encompasses a wide range of acts, such as forced sexual intercourse in intimate partner relationships, rape by strangers, systematic rape, sexual harassment (including demanding sex in exchange for compensation of any kind), sexual abuse of children, child marriage and violent acts against the sexual integrity of women, including female genital mutilation and compulsory virginity inspections.
Amendment 38 #
Proposal for a regulation Recital 11 b (new) (11 b) UNICEF defines child sexual abuse as when a child is used for the sexual stimulation of the perpetrator or the gratification of an observer. It involves any interaction in which consent does not exist or cannot be given, regardless of whether the child understands the sexual nature of the activity and even when the child shows no signs of refusal.
Amendment 39 #
Proposal for a regulation Recital 12 (12) For reasons of consistency and technological neutrality, the term ‘child sexual abuse material’ should for the purpose of this Regulation be defined as referring to any type of material constituting child pornography or pornographic performance within the meaning of Directive 2011/93/EU, which is capable of being disseminated through the use of hosting or interpersonal communication services. At present, such material typically consists of images or videos, without it however being excluded that it takes other forms, especially in view of future technological developments. Close attention should be paid to the development of new technologies and platforms, such as the metaverse. In such platforms child sexual abuse material might be generated and exchanged or child sexual abuse perpetrated through the use of avatars or any other form of virtual identities.
Amendment 40 #
Proposal for a regulation Recital 13 a (new) (13 a) The term "online grooming" refers to the process by which an adult tries to manipulate a child in order to obtain sexual audiovisual material or to have some kind of in-person sexual relationship with the child. According to international studies to date, between 5% and 15% of minors have been sexually solicited by adults through ICTs. Within the prevention measures, we must consider the responsible use of ICTs as a fundamental part of awareness-raising and education, where it is crucial to raise awareness of the implications of online consent to the use and dissemination of personal data, images or other information.
Amendment 41 #
Proposal for a regulation Recital 13 b (new) (13 b) In order to minimise the risks of online child content made available by legal guardians being used for ‘grooming’ as ‘new’ child sexual abuse material, media and digital literacy programmes should be put in place to make citizens aware of their responsibility as content disseminators. In this sense, ‘digital literacy’ refers to skills, knowledge and understanding that allow users to gain awareness of the potential risks associated with the child content they generate, produce and share, in the context of the child’s fundamental rights, and the obligations set out in this Regulation and in other Union data-related Regulations. Consequently, the Union and its Member States should allocate more investment to education and training to spread digital literacy, and ensure that progress in that regard is closely followed.
Amendment 42 #
Proposal for a regulation Recital 17 a (new) (17 a) Member States continue to struggle with putting in place effective prevention programmes to combat child sexual abuse as required in Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children and child pornography, where frequently multiple types of stakeholders need to take action. As a result, children and the persons in their environment are insufficiently aware of the risks of sexual abuse and of the means of limiting such risks, while the online dimension represents a particular challenge, with a constantly growing tendency. As education plays a key role in the prevention of child sexual abuse, Member States should inform the public, by all means necessary, about the dangers and risks of sexual abuse for young people in the digital world, including by ensuring a close cooperation at European and international level and by strengthening work with organised civil society, in particular with schools and law enforcement representatives. Member States should take appropriate measures to include programmes to this effect in the early education curricula.
Amendment 43 #
Proposal for a regulation Recital 18 a (new) (18 a) Basic digital skills, including cyber hygiene, cyber safety, data protection and media literacy are essential for children and young people, as they enable them to make informed decisions, assess and overcome the risks associated with the internet. Therefore, it is important to strengthen media literacy efforts in Member States and at the Union level, through dedicated media literacy education, publicly available relevant materials adapted for different age groups and information campaigns for children and their guardians.
Amendment 44 #
Proposal for a regulation Recital 22 (22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighed, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the likelihood and seriousness of any potential negative consequences for other parties affected. With a view to avoiding the imposition of
Amendment 45 #
Proposal for a regulation Recital 24 (24) The competent judicial authority or the competent independent administrative authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should be in a position to take a well-informed decision on requests for the issuance of detection orders. That is of particular importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach, especially in connection to detection orders concerning the solicitation of children. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU Centre’) and, where so provided in this Regulation, the competent data protection authority designated under Regulation (EU) 2016/679 to provide their views on the measures in question. They should do so as soon as possible, having regard to the important public policy objective at stake and the need to act without undue delay to protect children, in view of the seriousness of the impact that such offences have on the physical and mental health of minors and in view of the difficulty of curbing the dissemination of material online. In particular, data protection authorities should do their utmost to avoid extending the time period set out in Regulation (EU) 2016/679 for
Amendment 46 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims
Amendment 47 #
Proposal for a regulation Recital 35 a (new)
Amendment 48 #
Proposal for a regulation Recital 36 (36) In order to prevent children from falling victim to abuse, providers of very large online platforms which have identified the risk of use of their service for the purpose of online child sexual abuse in line with Article 3 should provide reasonable assistance, by putting in place alert and alarm mechanisms in a prominent way on their platforms. The alert mechanism could consist of, for example, linking potential victims to local organisations such as helplines, victims’ rights organisations or hotlines. Providers of very large online platforms should ensure adequate follow-up, when a report or alert is made, in the language chosen by the user for using their service. Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to
Amendment 49 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to receive free and immediate psychological support or support from other professionals and to be assisted by the EU Centre in this regard, via the Coordinating Authorities.
Amendment 50 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist
Amendment 51 #
Proposal for a regulation Recital 37 (37) To ensure the efficient management of such victim support functions, victims should be well informed about the existence of such centres and be allowed to contact and rely on the Coordinating Authority that is most accessible to them, which should channel all communications between victims and the EU Centre.
Amendment 52 #
Proposal for a regulation Recital 38 (38) For the purpose of facilitating the exercise of the victims’ right to information and of assistance and support for fast removal or disabling of access,
Amendment 53 #
Proposal for a regulation Recital 44 (44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For
Amendment 54 #
Proposal for a regulation Recital 45 a (new) (45 a) Given the EU Centre’s particular expertise with regard to the generation and sharing of knowledge, Member States should ensure that such information is shared and promoted at national level. For this purpose, they should cooperate with partner organisations, including with semi-public organisations and hotlines, as well as with civil society. It is important to ensure that practitioners who get in close contact with child victims are adequately trained to deal with such victims, and that the situation of the victim is adequately mitigated. Therefore, the Coordinating Authority should ensure that officials such as law enforcement officers, judges, prosecutors, lawyers, forensic experts and social workers cooperate with civil society and semi-public organisations.
Amendment 55 #
Proposal for a regulation Recital 49 (49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders, removal orders or blocking orders that it issued, are effectively complied with in practice, each Coordinating Authority should be able to carry out searches, using the relevant indicators provided by the EU Centre, and reacting in a timely manner to the evolving trends of child sexual abuse material dissemination and monetisation, to detect the dissemination of known or new child sexual abuse material through publicly available material in the hosting services of the providers concerned.
Amendment 56 #
Proposal for a regulation Recital 50 (50) With a view to ensuring that providers of hosting services are aware of
Amendment 57 #
Proposal for a regulation Recital 56 (56) With a view to ensuring that the indicators generated by the EU Centre for the purpose of detection are as complete as possible, the submission of relevant material and transcripts should be done proactively by the Coordinating Authorities. However, the EU Centre should also be allowed to bring certain material or conversations to the attention of the Coordinating Authorities for those purposes
Amendment 58 #
Proposal for a regulation Recital 57 a (new) (57 a) According to the UN, one of the main factors influencing the increase in child sexual abuse in developing countries is the decline in sex education. Studies have shown that if a child receives good sex education, it can equip them with the necessary tools to identify situations in which they may be sexually abused. Therefore, the education sector and education and awareness programmes play a key role in preventing child sexual abuse.
Amendment 59 #
Proposal for a regulation Recital 57 b (new) (57 b) Some studies point to depression and loneliness and a history of physical or psychological harassment as some of the characteristics of Internet-initiated victims of sexual crimes. Other studies distinguish two types of victims: risky victims and vulnerable victims. Vulnerable victims are defined as those with a high need for affection due to feelings of loneliness and low self-esteem. This shows that bullying and cyberbullying problems can lead to some children being prone to physical and online sexual abuse.
Amendment 60 #
Proposal for a regulation Recital 58 (58) In particular, in order to facilitate the cooperation needed for the proper functioning of the mechanisms set up by this Regulation, the EU Centre should establish and maintain the necessary information-sharing systems. When establishing and maintaining such systems, the EU Centre should cooperate with the European Union Agency for Law Enforcement Cooperation (‘Europol’), national hotlines and national authorities to build on existing systems and best practices, where relevant.
Amendment 61 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this
Amendment 62 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access
Amendment 63 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services. However, for that same reason, the EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of
Amendment 64 #
Proposal for a regulation Recital 61 (61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities or, when appropriate, by the organisations acting in the public interest against child sexual abuse, in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material) or different child sexual abuse material (new material), or the solicitation of children, as applicable.
Amendment 65 #
Proposal for a regulation Recital 62 (62) For the system established by this Regulation to function properly, the EU Centre should be charged with creating databases for each of those three types of online child sexual abuse, and with maintaining, timely updating and operating those databases. For accountability purposes and to allow for corrections where needed, it should keep records of the submissions and the process used for the generation of the indicators.
Amendment 66 #
Proposal for a regulation Recital 62 (62) For the system established by this Regulation to function properly, the EU Centre should be charged with creating databases for each
Amendment 67 #
(65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks, reports should pass through the EU Centre. The EU Centre should immediately assess
Amendment 68 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of
Amendment 69 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse, including on the successful initiatives and good practices on the proactive search for online child sexual material, trends in its creation and monetisation, as well as the voluntary prevention, detection and mitigation of online child sexual abuse. In this connection, the EU Centre should cooperate on a regular basis with relevant stakeholders from both within and outside the Union, including law enforcement authorities with the relevant expertise, educators, civil society, service providers and industry representatives, and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned.
Amendment 70 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse
Amendment 71 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned. To this end, the EU Centre can also aid in the implementation of awareness campaigns and contribute to the establishment and improvement of specific guidelines and proposals for mitigation measures respectively, so as to ensure accuracy and up-to-date solutions in tackling online child sexual abuse.
Amendment 72 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also
Amendment 73 #
Proposal for a regulation Recital 67 a (new) (67 a) In carrying out its mission, the EU Centre should also ensure transversal cooperation with education facilities, where appropriate, and digital education hubs, to also integrate this dimension of the prevention component, in order for children to become aware of the potential risks posed by the online environment.
Amendment 74 #
Proposal for a regulation Recital 67 b (new) (67 b) Considering the essential role teachers can play in guiding children on safely using information society services and detecting potentially malicious behaviour online, teacher training should be organized and implemented across the Union, in a coherent manner, benefitting from the knowledge and expertise of the EU Centre.
Amendment 75 #
Proposal for a regulation Recital 69 (69) In order to allow for the effective and efficient performance of its tasks, the EU Centre should closely cooperate with Coordinating Authorities, Europol and relevant partner organisations, such as the US National Centre for Missing and Exploited Children or the International Association of Internet Hotlines (‘INHOPE’) network of hotlines for reporting child sexual abuse material, within the limits set by this Regulation and other legal instruments regulating their respective activities. To facilitate and support such cooperation and build on the best practices and expertise acquired, the necessary arrangements should be made, including the designation of contact officers by Coordinating Authorities and the conclusion of memoranda of understanding with Europol and, where appropriate, with one or more of the relevant partner organisations located in the Union and outside the Union.
Amendment 76 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Child helplines are equally in the frontline in the fight against online child sexual abuse. Therefore, the EU Centre should also recognise the work of child helplines in victim response, and the existing referral mechanisms between child helplines and hotlines. The EU Centre should coordinate services for victims.
Amendment 77 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines and organisations which act in the public interest against child sexual abuse and which proactively search for child sexual abuse material or which do research and gather information on the trends in the dissemination and monetisation of child sexual abuse material, are in the frontline
Amendment 78 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across
Amendment 79 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Furthermore, a special green line with a call centre assistance service will be constituted at EU level in order for victims and their families to receive support in a timely manner.
Amendment 80 #
Proposal for a regulation Recital 70 a (new) (70 a) In line with Directive 2011/93/EU of the European Parliament and of the Council, this Regulation recognises and safeguards the key role of hotlines in order to enhance the fight against child sexual abuse online in the European Union. Hotlines have a track record of proven capability since 1999 in the identification and removal of child sexual abuse material from the digital environment and have created a worldwide network and procedures for child sexual abuse identification and removal. Member States should therefore promote and safeguard the role of formally recognised non-governmental organisations involved in anonymous public reporting of child sexual abuse material, which are at the forefront of detecting new child sexual abuse material, which is an essential factor in finding new victims while also keeping the databases of indicators up to date.
Amendment 81 #
Proposal for a regulation Recital 72 a (new) (72 a) In view of ensuring an adequate degree of expertise and skills for investigative purposes, specialized training of law enforcement officers will be introduced with the support of the EU Centre, especially considering rapid technological advancements where new methods, techniques and instruments require adapting preventive and mitigation efforts regarding online child sexual abuse.
Amendment 82 #
Proposal for a regulation Recital 73 (73) To ensure its proper functioning, the necessary rules should be laid down regarding the EU Centre’s organisation. In the interest of consistency, those rules should be in line with the Common Approach of the European Parliament, the Council and the Commission on decentralised agencies. In order to complete its tasks, the EU Centre and Coordinating Authorities should have the necessary funds, human resources, investigative powers and technical capabilities to seriously and effectively pursue and investigate complaints and potential offenders, including appropriate training to build capacity in the judiciary and police units and to develop new high-tech capabilities to address the challenges of analysing vast amounts of child abuse imagery, including material hidden on the ‘dark web’.
Amendment 83 #
Proposal for a regulation Recital 74 (74) In view of the need for technical expertise in order to perform its tasks, in
Amendment 84 #
Proposal for a regulation Recital 74 (74) In view of the essential need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies, including software, that can be used for fast detection, the EU Centre should have a Technology Committee composed of experts with advisory function, which should take into account Member States' experience and their achievements. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to detection of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology.
Amendment 85 #
Proposal for a regulation Recital 74 (74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to prevention and detection of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology.
Amendment 86 #
Proposal for a regulation Recital 75 (75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal
Amendment 87 #
Proposal for a regulation Recital 76 (76) In the interest of good governance and drawing on the statistics and information gathered and transparency reporting mechanisms provided for in this Regulation, the Commission should carry out an evaluation of this Regulation within
Amendment 88 #
Proposal for a regulation Recital 76 (76) In the interest of good governance and drawing on the statistics and information gathered and transparency reporting mechanisms provided for in this Regulation, the Commission should carry out an evaluation of this Regulation within
Amendment 89 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point e a (new) (e a) Guidelines on the creation of appropriate prevention techniques on cyber grooming and the dissemination of CSAM online, targeting children and parents and empowering them to use digital technologies safely and responsibly.
Amendment 90 #
Proposal for a regulation Article 2 – paragraph 1 – point k a (new) (k a) "child sexual abuse" means any actual or threatened physical or virtual intrusion of a sexual nature, for the sexual stimulation of the offender or an observer, made towards minors, whether by force or under unequal or coercive conditions;
Amendment 91 #
Proposal for a regulation Article 2 – paragraph 1 – point o a (new) (o a) "online grooming" is the process by which an adult attempts to manipulate a minor via ICT in order to obtain sexual audiovisual material or to engage in some form of face-to-face sexual relationship with that minor;
Amendment 92 #
Proposal for a regulation Article 2 – paragraph 1 – point p (p) ‘online child sexual abuse’ means the online dissemination of child sexual abuse material and the solicitation of children with the intention of violence/sexual abuse;
Amendment 93 #
Proposal for a regulation Article 2 – paragraph 1 – point w a (new) (w a) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous information from the public about alleged child sexual abuse material and online child sexual exploitation, which meets all the following criteria: (a) is officially recognised by its home Member State as expressed in Directive 2011/93/EU of the European Parliament and of the Council; (b) has the mission of combatting child sexual abuse material in its articles of association; and (c) is part of a recognised and well-established international network of hotlines as referred to in this article.
Amendment 94 #
Proposal for a regulation Article 2 – paragraph 1 – point w a (new) (w a) ‘very large online platform’ means online platforms which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as very large online platforms pursuant to paragraph 4 of Article 33 of Regulation (EU) 2022/2065;
Amendment 95 #
Proposal for a regulation Chapter I a (new) Ia PREVENTION AND EDUCATION PROGRAMMES Article 2 a (new) 1. Member States shall take appropriate measures, such as education, awareness raising campaigns and training, to discourage and reduce the demand that fosters all forms of sexual exploitation of children in the online environment. 2. Member States shall take appropriate action, including through the Internet, such as information and awareness-raising campaigns, research and early-education programmes, where appropriate in cooperation with relevant civil society organisations acting in the public interest against child sexual abuse, law enforcement authorities and other stakeholders, aimed at raising awareness and reducing the risk of children becoming victims of sexual abuse or of exploitation online. 3. Member States shall promote regular training for officials likely to come into contact with child victims of sexual abuse or exploitation online, including the solicitation of children, aimed at enabling them to identify and deal with child victims and potential child victims. 4. Member States shall promote regular training for officials to inform them and update their knowledge on the latest trends in the creation, dissemination and monetisation of child sexual abuse materials and national data hosting of child sexual abuse material.
Amendment 96 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 Prior to preparing its risk assessment, the provider shall be advised upon the specific requirements in order to ensure that the risk assessment is thorough, accurate and detailed. The provider may request the EU Centre to perform an analysis of representative, anonymized data samples to identify potential online child sexual abuse, to support the risk assessment.
Amendment 97 #
Proposal for a regulation Article 3 – paragraph 6 a (new) 6 a. The EU Centre should use these risk assessment reports to prepare and adapt prevention techniques and bring them to the attention of Coordinating Authorities across the EU.
Amendment 98 #
Proposal for a regulation Article 4 – paragraph 1 – point b a (new) (b a) to provide, through appropriate technical and operational measures, readily accessible and easy-to-use parental tools to help parents or guardians support children and identify harmful behaviour;
Amendment 99 #
Proposal for a regulation Article 4 – paragraph 1 – point c (c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations, hotlines or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] .
source: 739.506
2023/03/09
IMCO
513 amendments...
Amendment 158 #
Proposal for a regulation Title 1
Amendment 159 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life,
Amendment 160 #
Proposal for a regulation Recital 1 a (new) (1 a) Regulatory measures to address the dissemination of child sexual abuse content online should be complemented by Member States strategies including increasing public awareness, how to seek child-friendly and age appropriate reporting and assistance and informing about victims rights. Additionally Member States should make sure they have a child-friendly justice system in place in order to avoid further victimisation of the abused children.
Amendment 161 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being
Amendment 162 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services for the digital single market, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, effective, carefully balanced and proportionate, so as to avoid any
Amendment 163 #
Proposal for a regulation Recital 3 (3)
Amendment 164 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, may have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.
Amendment 165 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is demonstrably and durably effective and that respects the fundamental rights of all parties concerned. In view of the fast- changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology-neutral and future-
Amendment 166 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform
Amendment 167 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available number- independent interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services a
Amendment 168 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting are equally at risk of misuse, they should also be covered by this
Amendment 169 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services
Amendment 170 #
Proposal for a regulation Recital 6 (6) Online child sexual abuse
Amendment 171 #
Proposal for a regulation Recital 7 (7) This Regulation should be without prejudice to the rules resulting from other Union acts, in particular Directive 2011/93 of the European Parliament and of the Council38, Directive 2000/31/EC of the European Parliament and of the Council39and Regulation (EU)
Amendment 172 #
Proposal for a regulation Recital 8 (8) This Regulation should be considered lex specialis in relation to the generally applicable framework set out in Regulation (EU)
Amendment 173 #
Proposal for a regulation Recital 9 (9) Article 15(1) of Directive 2002/58/EC allows Member States to adopt legislative measures to restrict the scope of the rights and obligations provided for in
Amendment 174 #
Proposal for a regulation Recital 10 (10) In the interest of clarity and consistency, the definitions provided for in this Regulation should, where possible and appropriate, be based on and aligned with the relevant definitions contained in other acts of Union law, such as Regulation (EU)
Amendment 175 #
Proposal for a regulation Recital 11 (11) A substantial connection to the Union should be considered to exist where the relevant information society services has an establishment in the Union or, in its absence, on the basis of the existence of a significant number, in relation to population size, of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of a software application in the relevant national software application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1), point (c), of Regulation (EU) 1215/2012 of the European Parliament and of the Council44. Mere technical accessibility of a website from the Union
Amendment 176 #
Proposal for a regulation Recital 13 (13)
Amendment 177 #
Proposal for a regulation Recital 13 a (new) (13 a) Member States should ensure that they additionally address the problem of solicitation of children by providing for efficient digital education. Children should be given at home and in school the necessary digital skills and tools they need to fully benefit from online access, whilst ensuring their safety.
Amendment 178 #
Proposal for a regulation Recital 14 (14) With a view to minimising the risk that their services are misused for the dissemination of
Amendment 179 #
Proposal for a regulation Recital 15 (15) Some of those providers of relevant information society services in scope of this Regulation may also be subject to an obligation to conduct a risk assessment under Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] with respect to information that they store and disseminate to the public , which should form the basis for the risk assessment under this instrument. For the purposes of the present Regulation, those providers may draw on such a risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse, as required by this Regulation.
Amendment 180 #
Proposal for a regulation Recital 15 (15) Some of those providers of relevant information society services in scope of this Regulation, including online search engines, may also be subject to an obligation to conduct a risk assessment under Regulation (EU)
Amendment 181 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers s
Amendment 182 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable specific measures to mitigate
Amendment 183 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU)
Amendment 184 #
Proposal for a regulation Recital 16 a (new) (16 a) To further prevent online child sexual abuse effectively, an emphasis should be placed on public awareness raising, including through easily understandable campaigns and in education, with a focus on empowerment of young people to use the internet safely and to address societal factors that enable child sexual abuse, including harmful gender norms about women and girls and broader issues of societal inequality. In addition, awareness raising should focus on hotlines where young people can report what has happened to them, as well as on improving access to institutional reporting by police and social services and other authorities.
Amendment 185 #
Proposal for a regulation Recital 16 a (new) (16 a) Any age assessment tools used should be able to verify age in an efficient, privacy-preserving and secure manner.
Amendment 186 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in
Amendment 187 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory
Amendment 188 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement also voluntary measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers
Amendment 189 #
Proposal for a regulation Recital 17 a (new) (17 a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any restrictions of encryption could potentially be abused by malicious third parties. In order to ensure effective consumer trust, nothing in this Regulation should be interpreted as the requirement to prevent, circumvent, compromise, undermine encryption in place, or prohibit providers of information society services from providing their services applying encryption, restricting or undermining such encryption in the sense of being detrimental to users’ expectations of confidential and secure communication services, for example by implementation of client side scanning or other device- related, server-side solutions or requirements to proactively forward electronic communications to third parties which may weaken or introduce vulnerabilities into the encryption. Member States should not deter nor prevent providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and security of the digital services, and effectively prevents unauthorised third- party access.
Amendment 190 #
Proposal for a regulation Recital 17 a (new) (17 a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any restrictions of encryption could potentially be abused by malicious third parties, including deployment of software on end- user devices which can inspect private messages before encryption and transfer this information to third parties . In order to ensure effective consumer trust, nothing in this Regulation should be interpreted as prohibiting providers of information society services from providing their services applying encryption, restricting or undermining such encryption in the sense of being detrimental to users’ expectations of confidential and secure communication services. Member States should not prevent or discourage providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and security of the digital services, and effectively prevents unauthorised third party access.
Amendment 191 #
Proposal for a regulation Recital 17 a (new) (17 a) Relying on providers for risk mitigation measures comes with inherent problems, as business models, technologies and crimes evolve continuously. As a result, clear targets, oversight, review and adaptation, led by national supervisory authorities are needed, to avoid measures becoming redundant, disproportionate, ineffective, counterproductive and outdated.
Amendment 192 #
Proposal for a regulation Recital 17 a (new) (17 a) Fundamental rights in the digital sphere have to be guaranteed to the same extent as in the offline world. Safety and privacy need to be ensured, amongst others through end-to-end encryption in private online communication and the protection of private content against any kind of general or targeted surveillance, be it by public or private actors.
Amendment 193 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available interpersonal communications services should, when designing and implementing the mitigation measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the financial and technological capabilities and the size of the provider concerned. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such
Amendment 194 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available interpersonal communications services should, when designing and implementing the mitigation measures, give importance
Amendment 195 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available number-independent interpersonal communications services should, when designing and implementing the
Amendment 196 #
Proposal for a regulation Recital 19 (19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take certain reasonable measures
Amendment 197 #
Proposal for a regulation Recital 19 (19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take
Amendment 198 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when
Amendment 199 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be a last resort measure and subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that in particular solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to
Amendment 200 #
Proposal for a regulation Recital 20 a (new) (20 a) Having regard to the need to take due account of the fundamental rights guaranteed under the Charter of all parties concerned, any action taken by a provider of relevant information society services should be strictly targeted, in the sense that it should serve to detect, remove or disable access to the specific items of information considered to constitute child sexual abuse online, without unduly affecting the freedom of expression and of information of recipients of the service. Orders should therefore, as a general rule, be directed to the entity acting as a data controller or where that is unfeasible, to the specific provider of relevant information society services that has the technical and operational ability to act against such specific items of child sexual abuse material, so as to prevent and minimise any possible negative effects on the availability and accessibility of information that is not illegal content. The providers of relevant information society services who receive an order on the basis of which they cannot, for technical or operational reasons, remove the specific item of information, should inform the person or entity who submitted the order.
Amendment 201 #
Proposal for a regulation Recital 21 (21) Furthermore, as parts of those limits and safeguards,
Amendment 202 #
Proposal for a regulation Recital 21 (21) Furthermore, as parts of those limits and safeguards, detection orders should only be issued after a diligent and objective assessment leading to the finding of a significant risk of the specific service concerned being misused for a given type of online child sexual abuse covered by this Regulation. Such assessments may include the voluntary use of detection technologies and the evidence they provide with regard to the risks of a service being misused. One of the elements to be
Amendment 203 #
Proposal for a regulation Recital 22 (22)
Amendment 204 #
Proposal for a regulation Recital 22 (22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority
Amendment 205 #
Proposal for a regulation Recital 23 (23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a
Amendment 206 #
Proposal for a regulation Recital 23 (23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted, justified, proportionate, limited in time and specified so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
Amendment 207 #
Proposal for a regulation Recital 23 a (new) (23 a) Monitoring private communications of all users of a number- independent interpersonal communications service in a general and indiscriminate manner is likely to infringe on the essence of their fundamental rights and the prohibition of general monitoring. To the greatest extent possible, and as the predominant rule, detection orders should be targeted against users for whom there is a reasonable suspicion that they have been sharing child sexual abuse material in the past or that they will share child sexual abuse material in the future.
Amendment 208 #
Proposal for a regulation Recital 24 (24) The competent judicial authority
Amendment 209 #
Proposal for a regulation Recital 24 (24) The competent judicial authority
Amendment 210 #
Proposal for a regulation Recital 25
Amendment 211 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of
Amendment 212 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available number-independent interpersonal communications services to execute
Amendment 213 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users,
Amendment 214 #
Proposal for a regulation Recital 27 (27) In order to facilitate the providers’ compliance with the
Amendment 215 #
Proposal for a regulation Recital 28 (28) With a view to constantly assess the performance of the
Amendment 216 #
Proposal for a regulation Recital 29 (29)
Amendment 217 #
Proposal for a regulation Recital 29 a (new) (29 a) In order to ensure effective prevention of and fight against online child sexual abuse, providers should be able to make voluntary use of detection technologies as part of their mitigation measures, if they assess this as necessary in order to limit the risk of misuse.
Amendment 218 #
Proposal for a regulation Recital 29 b (new) (29 b) All relevant providers should provide for easily accessible, child-friendly and age-appropriate notification mechanisms that allow for quick, efficient and privacy-preserving notification. Micro, small and medium-sized enterprises should receive support from the EU Centre to build up a corresponding mechanism.
Amendment 219 #
Proposal for a regulation Recital 31 (31) The rules of this Regulation should not be understood as affecting the
Amendment 220 #
Proposal for a regulation Recital 32 (32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited.
Amendment 221 #
Amendment 222 #
Proposal for a regulation Recital 34 (34)
Amendment 223 #
Proposal for a regulation Recital 40 (40) In order to facilitate smooth and efficient communications by electronic means, including, where relevant, by acknowledging the receipt of such communications, relating to matters covered by this Regulation, providers of relevant information society services should be required to designate a single point of contact and to publish relevant information relating to that point of contact, including the languages to be used in such communications. In contrast to the provider’s legal representative, the point of contact should serve operational purposes and should not be required to have a physical location. Suitable conditions should be set in relation to the languages of communication to be specified, so as to ensure that smooth communication is not unreasonably complicated. For providers subject to the obligation to establish a compliance function and nominate compliance officers in accordance with Regulation (EU)
Amendment 224 #
Proposal for a regulation Recital 42 (42) Where relevant and convenient, subject to the choice of the provider of relevant information society services and the need to meet the applicable legal requirements in this respect, it should be possible for those providers to designate a single point of contact and a single legal representative for the purposes of Regulation (EU)
Amendment 225 #
Proposal for a regulation Recital 44 (44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to
Amendment 226 #
Proposal for a regulation Recital 49 (49) In order to verify that the rules of this Regulation, in particular those on
Amendment 227 #
Proposal for a regulation Recital 50
Amendment 228 #
Proposal for a regulation Recital 55 (55) It is essential for the proper functioning of
Amendment 229 #
Proposal for a regulation Recital 55 a (new) (55 a) All communications containing illegal material should be encrypted to state-of-the-art standards; all access by staff to such content should be limited to what is necessary and thoroughly logged.
Amendment 230 #
Proposal for a regulation Recital 56 (56) With a view to ensuring that the indicators generated by the EU Centre
Amendment 231 #
Proposal for a regulation Recital 69 a (new) (69 a) Hotlines play an invaluable role in providing the public with a way to report suspected child sexual abuse material and in rapidly removing harmful content online, but they have different legal rights to process child sexual abuse material; Member States are therefore encouraged to aim for a harmonisation of the legal capacities of hotlines.
Amendment 232 #
Proposal for a regulation Recital 70 (70) This Regulation recognises and reinforces the key role of hotlines in optimising the fight against child sexual abuse online at the Union level. Hotlines are at the forefront of detecting new child sexual abuse material and have a track record of proven capability in the rapid identification and removal of child sexual abuse material from the digital environment. Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union.
Amendment 233 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines, concluding, when necessary, strategic and/or operational cooperation agreements with them and encourage that they
Amendment 234 #
Proposal for a regulation Recital 78 (78) Regulation (EU) 2021/1232 of the European Parliament and of the Council45provides for a temporary solution in respect of the use of technologies by certain providers of publicly available interpersonal communications services for the purpose of combating online child sexual abuse
Amendment 235 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in
Amendment 236 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to prevent and address the misuse of relevant information society services for online child sexual abuse
Amendment 237 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the
Amendment 238 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on relevant providers of hosting
Amendment 239 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services and providers of publicly available number-independent interpersonal communication services to ide
Amendment 240 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services
Amendment 241 #
(b) obligations on relevant providers of
Amendment 242 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point c (c) obligations on
Amendment 243 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point d a (new) (d a) obligations on providers of online search engines to delist websites indicating child sexual abuse material;
Amendment 244 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point e a (new) (e a) obligations on providers of online games;
Amendment 245 #
Proposal for a regulation Article 1 – paragraph 3 – point b (b) Directive 2000/31/EC and Regulation (EU)
Amendment 246 #
Proposal for a regulation Article 1 – paragraph 3 – point b a (new) (ba) Regulation (EU) .../... [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts], in particular Article 5;
Amendment 247 #
Proposal for a regulation Article 1 – paragraph 4 4. This Regulation limits the exercise of the rights and obligations provided for in Article 5(1) and (3) and Article 6(1) of Directive 2002/58/EC insofar as necessary for the execution of the
Amendment 248 #
Proposal for a regulation Article 2 – paragraph 1 – point a (a) ‘hosting service’ means an information society service as defined in Article 2, point (f), third indent, of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC], in so far as they allow the dissemination and sharing of images and videos;
Amendment 249 #
Proposal for a regulation Article 2 – paragraph 1 – point a (a) ‘hosting service’ means an information society service as defined in Article
Amendment 250 #
Proposal for a regulation Article 2 – paragraph 1 – point a (a) ‘hosting service’ means an information society service as defined in Article
Amendment 251 #
Proposal for a regulation Article 2 – paragraph 1 – point a a (new) (a a) 'cloud computing service' means a service as defined in Article 6, point (30), of Directive (EU) 2022/2555 of the European Parliament and of the Council.
Amendment 252 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘number-independent interpersonal communications service’ means a publicly available service as defined in Article 2, point
Amendment 253 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘interpersonal communications service’ means a publicly available service as defined in Article 2, point 5, of Directive (EU) 2018/1972, including services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, in so far as they allow the dissemination and sharing of images and videos;
Amendment 254 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘interpersonal communications service’ means a publicly available service as defined in Article 2, point 5, of
Amendment 255 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘interpersonal communications service’ means a publicly available service as defined in Article 2, point 5, of Directive (EU) 2018/1972,
Amendment 256 #
Proposal for a regulation Article 2 – paragraph 1 – point b a (new) (b a) ‘number-independent interpersonal communications service within games’ means any service defined in Article 2, point 7 of Directive (EU) 2018/1972 which is part of a game.
Amendment 257 #
Proposal for a regulation Article 2 – paragraph 1 – point c (c) ‘software application’ means a digital product or service as defined in Article 2, point 1
Amendment 258 #
Proposal for a regulation Article 2 – paragraph 1 – point d (d) ‘software application store’ means a service as defined in Article 2, point 1
Amendment 259 #
Proposal for a regulation Article 2 – paragraph 1 – point e
Amendment 260 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point ii
Amendment 261 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point ii (ii) a
Amendment 262 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv
Amendment 263 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv a (new) (iv a) online search engines;
Amendment 264 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv a (new) (iv a) online games;
Amendment 265 #
Proposal for a regulation Article 2 – paragraph 1 – point f a (new) (f a) ‘online search engine’ means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
Amendment 266 #
Proposal for a regulation Article 2 – paragraph 1 – point f b (new) (f b) ‘metadata’ means data processed for the purposes of transmitting, distributing or exchanging content data, including data used to trace and identify the source and destination of a communication, data on the location of the user, and the date, time, duration and type of communication;
Amendment 267 #
Proposal for a regulation Article 2 – paragraph 1 – point g (g) ‘to offer services in the Union’ means to offer services in the Union as defined in Article 2
Amendment 268 #
Proposal for a regulation Article 2 – paragraph 1 – point h a (new) (h a) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous reports from the public about alleged child sexual abuse material and online child sexual exploitation, which is officially recognised by the Member State of establishment and has the mission of combating child sexual abuse;
Amendment 269 #
Proposal for a regulation Article 2 – paragraph 1 – point h b (new) (h b) ‘help-line’ means an organisation providing services for children in need as recognised by the Member State of establishment;
Amendment 270 #
Proposal for a regulation Article 2 – paragraph 1 – point j (j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 1
Amendment 271 #
Proposal for a regulation Article 2 – paragraph 1 – point r (r) ‘recommender system’ means the system as defined in Article
Amendment 272 #
Proposal for a regulation Article 2 – paragraph 1 – point t (t) ‘content moderation’ means the activities as defined in Article
Amendment 273 #
Proposal for a regulation Article 2 – paragraph 1 – point v (v) ‘terms and conditions’ means terms
Amendment 274 #
Proposal for a regulation Article 2 a (new) Article 2 a End-to-End Encryption and Prohibition on General Monitoring 1. End-to-end encryption is essential to guarantee the security and confidentiality of the communications of users, including those of children. Any restriction of encryption could lead to abuse by malicious actors. Nothing in this Regulation should be interpreted as prohibiting providers of information society services from providing their services applying end-to-end encryption, or as restricting or undermining such encryption. Member States should not prevent providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and security of digital services, and effectively prevents unauthorised third-party access. 2. Nothing in this Regulation should undermine the prohibition of general monitoring under EU law.
Amendment 275 #
Proposal for a regulation Article 2 a (new) Article 2 a Voluntary own-initiative detection Providers of relevant information society services shall be deemed eligible to carry out own-initiative investigations into, or take other measures aimed at detecting, identifying, preventing the dissemination of, or removing child sexual abuse on their services, in addition to the mandatory requirements foreseen in this Regulation.
Amendment 276 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services
Amendment 277 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of number-independent interpersonal communications services shall identify, analyse and assess, for each such service that they offer, the risk of use of the service for the purpose of online child sexual abuse. This risk assessment shall be specific to their services and proportionate to the risks, taking into consideration their severity and probability and in full respect of the fundamental rights enshrined in the Charter.
Amendment 278 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of
Amendment 279 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of interpersonal communications services shall
Amendment 280 #
Proposal for a regulation Article 3 – paragraph 1 a (new) 1 a. A hosting service provider or publicly available number-independent interpersonal communication service is exposed to child sexual abuse material where the coordinating authority of the Member State of its main establishment or where its legal representative resides or is established has: (a) taken a decision, on the basis of objective factors, such as the provider having received two or more final removal orders in the previous 12 months, finding that the provider is exposed to child sexual abuse material; and (b) notified the decision referred to in point (a) to the provider.
Amendment 281 #
Proposal for a regulation Article 3 – paragraph 2 – point a
Amendment 282 #
Proposal for a regulation Article 3 – paragraph 2 – point a a (new) (a a) any actual or foreseeable negative effects for the exercise of fundamental rights
Amendment 283 #
Proposal for a regulation Article 3 – paragraph 2 – point b – introductory part (b) the existence and implementation by the provider of a policy and the availability and effectiveness of functionalities to address the
Amendment 284 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 1
Amendment 285 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 2
Amendment 286 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 2 — concrete measures taken to enforce such prohibitions and restrictions;
Amendment 287 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities enabling
Amendment 288 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities enabling
Amendment 289 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities enabling
Amendment 290 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities
Amendment 291 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 — functionalities enabling age
Amendment 292 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) — functionalities enabling scanning for known child sexual abuse material on upload;
Amendment 293 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) — functionalities enabling age-appropriate parental control;
Amendment 294 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 b (new) — functionalities preventing uploads from the dark web;
Amendment 295 #
Proposal for a regulation Article 3 – paragraph 2 – point b a (new) (b a) the capacity, in accordance with the state of the art, to deal with reports and notifications about child sexual abuse in a timely manner;
Amendment 296 #
Proposal for a regulation Article 3 – paragraph 2 – point c
Amendment 297 #
Proposal for a regulation Article 3 – paragraph 2 – point d (d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, the design of their recommender systems and any other relevant algorithmic system and the impact thereof on that risk;
Amendment 298 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point i
Amendment 299 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii
Amendment 300 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 1 — enabling users to publicly search for other users and, in particular, for adult users to search for child users;
Amendment 301 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2
Amendment 302 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2 — enabling users to
Amendment 303 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 — enabling users to establish direct contact and share images or videos with other users, in particular through private communications.
Amendment 304 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 a (new) — enabling users to create usernames that contain a representation about, or imply, the user’s age;
Amendment 305 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 a (new) — the extent to which children have access to age-restricted content;
Amendment 306 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 b (new) — enabling child users to create usernames that contain location information on child users;
Amendment 307 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 c (new) — enabling users to know or infer the location of child users.
Amendment 308 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii a (new) Amendment 309 #
Proposal for a regulation Article 3 – paragraph 2 a (new) 2a. The fact that a provider of interpersonal communications services ensures that interpersonal communications remain confidential or are encrypted cannot be considered a risk factor within the meaning of this Regulation.
Amendment 310 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 The provider may request the EU Centre to perform an analysis of representative, anonymised data samples to identify potential online child sexual abuse, to support the risk assessment. This request cannot serve the purpose of evading any of the provider’s obligations set up in this Regulation. The EU Centre shall perform the analysis in a timely manner.
Amendment 311 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 The provider may request the EU Centre to perform an analysis of
Amendment 312 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise
Amendment 313 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise
Amendment 314 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 3
Amendment 315 #
Proposal for a regulation Article 3 – paragraph 3 a (new) 3 a. Providers of hosting services and providers of interpersonal communication services shall put forward specific age assurance verification systems that meet the following criteria: (a) effectively protect the privacy of users and do not disclose data gathered for the purposes of age assurance for any other purpose; (b) do not collect data that is not strictly necessary for the purposes of age assurance; (c) be proportionate to the risks associated with the product or service that presents a risk of misuse for child sexual abuse; (d) provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
Amendment 316 #
Proposal for a regulation Article 3 – paragraph 3 a (new) 3 a. The provider may also voluntarily use the measures specified in Article 10 to detect online child sexual abuse on a specific service. In that case, it shall notify the Coordinating Authority and include the results of its analyses in a separate section of the risk assessment.
Amendment 317 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – point a (a) for a service which is subject to a
Amendment 318 #
Proposal for a regulation Article 3 – paragraph 6 6. The Commission, in cooperation with Coordinating Authorities
Amendment 320 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services
Amendment 321 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services, excluding cloud computing services, and providers of interpersonal communications services shall take reasonable, proportionate and targeted mitigation measures, tailored to the risk identified pursuant to Article 3 and the type of service offered, to minimise that risk. Such measures shall include some or all of the following:
Amendment 322 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall take the reasonable mitigation measures set out in Article 35 of Regulation (EU) 2022/2065, tailored to the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:
Amendment 323 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures, tailored to the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include, but need not be limited to, some or all of the following:
Amendment 324 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications
Amendment 325 #
Proposal for a regulation Article 4 – paragraph 1 – point a (a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes, the operation or functionalities of the service,
Amendment 326 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (a a) providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, and that are set to the most secure levels by default;
Amendment 327 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (a a) introducing parental control features and functionalities that allow the parents or the legal guardians to exercise oversight and control over the child's activity;
Amendment 328 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (a a) adapting privacy and safety by design and by default for children, including age appropriate parental control tools;
Amendment 329 #
Proposal for a regulation Article 4 – paragraph 1 – point a b (new) (a b) informing users about external resources and services in the user’s region on preventing child sexual abuse, counselling by help-lines, victim support and educational resources by hotlines and child protection organisations;
Amendment 330 #
Proposal for a regulation Article 4 – paragraph 1 – point a b (new) (a b) implementing measures to prevent and combat the dissemination of online child sexual abuse material;
Amendment 331 #
Proposal for a regulation Article 4 – paragraph 1 – point a c (new) (a c) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line
Amendment 332 #
Proposal for a regulation Article 4 – paragraph 1 – point a d (new) (a d) automatic mechanisms and interface design elements to inform users about external preventive intervention programmes
Amendment 333 #
Proposal for a regulation Article 4 – paragraph 1 – point b (b)
Amendment 334 #
Proposal for a regulation Article 4 – paragraph 1 – point b a (new) (b a) processing metadata;
Amendment 335 #
Proposal for a regulation Article 4 – paragraph 1 – point c
Amendment 336 #
Proposal for a regulation Article 4 – paragraph 1 – point c (c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU)
Amendment 337 #
Proposal for a regulation Article 4 – paragraph 1 – subparagraph 1 (new)
Amendment 338 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) (c a) foreseeing awareness-raising measures;
Amendment 339 #
Proposal for a regulation Article 4 – paragraph 1 – point c b (new) (c b) using any other measures in accordance with the current or future state of the art that are fit to mitigate the identified risk;
Amendment 340 #
Proposal for a regulation Article 4 – paragraph 2 – introductory part 2. The
Amendment 341 #
Proposal for a regulation Article 4 – paragraph 2 – point a (a) effective and proportionate in mitigating the identified serious risk;
Amendment 342 #
Proposal for a regulation Article 4 – paragraph 2 – point a (a) effective and efficient in mitigating the identified risk;
Amendment 343 #
Proposal for a regulation Article 4 – paragraph 2 – point a a (new) (a a) subject to an implementation plan with clear objectives and methodologies for identifying and quantifying impacts on the identified serious risk and on the exercise of the fundamental rights of all affected parties. The implementation plan shall be reviewed every six months.
Amendment 344 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) applied in line with the right to privacy and the safety of individuals, targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk as well as the provider’s financial and technological capabilities and the number of users;
Amendment 345 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk, any impact on the functionality of the service as well as the provider’s financial and technological capabilities and the number of users;
Amendment 346 #
Proposal for a regulation Article 4 – paragraph 2 – point d a (new) (d a) Providers of hosting services and providers of interpersonal communications services are encouraged to put in place voluntary measures to detect and report online child sexual abuse for those services that have proven to pose a risk of misuse for child sexual abuse, or in cases where there is an imminent risk of misuse for child sexual abuse, including for the purpose of the solicitation of children;
Amendment 347 #
Proposal for a regulation Article 4 – paragraph 2 a (new) 2a. The requirement that the providers of interpersonal communications services take risk mitigation measures shall in no way constitute a requirement that they access the content of communications or make provision for methods to access these communications or to compromise their encryption.
Amendment 348 #
Proposal for a regulation Article 4 – paragraph 3
Amendment 349 #
Proposal for a regulation Article 4 – paragraph 3
Amendment 350 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures and to put in place effective measures to block the access of children to websites that fall under an age-restriction applicable under national law.
Amendment 351 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age
Amendment 352 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary
Amendment 353 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of number-independent interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary
Amendment 354 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall immediately take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures.
Amendment 355 #
Proposal for a regulation Article 4 – paragraph 3 a (new) 3 a. Any requirement to take specific measures shall be without prejudice to Article 8 of Regulation (EU) 2022/2065 and shall entail neither a general obligation for hosting services providers to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.
Amendment 356 #
Proposal for a regulation Article 4 – paragraph 4 4. Providers of hosting services and providers of interpersonal communications services shall clearly describe in their terms and conditions the mitigation measures that they have taken. That description shall not include information that may reduce the effectiveness of the mitigation measures, unless the measures impinge on the essence of the service underlying the contract of use, or unless they amend, derogate from or invalidate another clause in the provider's terms and conditions.
Amendment 357 #
Proposal for a regulation Article 4 – paragraph 4 4.
Amendment 358 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation with Coordinating Authorities and the EU
Amendment 359 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1
Amendment 360 #
Proposal for a regulation Article 4 a (new) Article 4 a Legal basis for risk mitigation through metadata processing 1. To the extent necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, providers of number-independent interpersonal communications services shall be allowed, as a mitigating measure under Article 4, to process metadata. 2. All relevant service providers shall process metadata when ordered to do so by the Coordinating Authority of establishment in accordance with Article 5bis(4). When assessing whether to require a provider to process metadata, the Coordinating Authority shall take into account the interference with the rights to privacy and data protection of the users of the service that such processing entails and determine whether, in the case at hand, the processing of metadata would be effective in mitigating the risk of use of the service for the purpose of child sexual abuse, strictly necessary and proportionate. 3. If they process metadata as a risk mitigation measure, providers shall inform their users of such processing in their terms and conditions, including information on the possibility to submit complaints.
Amendment 361 #
Proposal for a regulation Article 4 a (new) Article 4 a Specific measures for platforms primarily used for the dissemination of pornographic content Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure: (a) user-friendly reporting mechanisms to report alleged child sexual abuse material; (b) adequate professional human content moderation to rapidly process notices of alleged child sexual abuse material; (c) automatic mechanisms and interface design elements to inform users about external preventive intervention programmes in the user’s region.
Amendment 362 #
Proposal for a regulation Article 4 b (new) Article 4 b Specific measures for number-independent interpersonal communications services within games Providers of online games that operate a number-independent interpersonal communications service within their games shall take the necessary technical and organisational measures: (a) preventing users from initiating unsolicited contact with other users; (b) facilitating user-friendly reporting of alleged child sexual abuse material; (c) providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, and that are set to the most secure levels by default; (d) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line.
Amendment 363 #
Proposal for a regulation Article 5 – paragraph 1 – introductory part 1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
Amendment 364 #
Proposal for a regulation Article 5 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall transmit,
Amendment 365 #
Proposal for a regulation Article 5 – paragraph 1 – point a (a)
Amendment 366 #
Proposal for a regulation Article 5 – paragraph 1 – point b (b) any
Amendment 367 #
Proposal for a regulation Article 5 – paragraph 2 2. Within three months after receiving the report, the Coordinating Authority of establishment shall assess it and determine, on that basis and taking into account any other relevant information available to it, whether the risk assessment has been carried out or updated and the
Amendment 368 #
Proposal for a regulation Article 5 – paragraph 2 2. Within
Amendment 369 #
Proposal for a regulation Article 5 – paragraph 3 – subparagraph 1 Where necessary for that assessment, that Coordinating Authority may require further information from the provider,
Amendment 370 #
Proposal for a regulation Article 5 – paragraph 3 – subparagraph 2
Amendment 371 #
Proposal for a regulation Article 5 – paragraph 4 4. Without prejudice to Articles 7 and 27 to 29, where the requirements of Articles 3 and 4 have not been met, that Coordinating Authority shall require the provider to
Amendment 372 #
Proposal for a regulation Article 5 – paragraph 6
Amendment 373 #
Proposal for a regulation Article 5 – paragraph 6 a (new) 6 a. Providers of hosting services and providers of interpersonal communications services that qualify as micro (or small) enterprises within the meaning of Article 3 of Directive 2013/34/EU shall transmit a simplified version of the report under paragraph 1 of this Article.
Amendment 374 #
Proposal for a regulation Article 5 a (new)
Amendment 375 #
Proposal for a regulation Article 6
Amendment 376 #
Amendment 377 #
Proposal for a regulation Article 6 – paragraph 1 – point a (a) make reasonable efforts to
Amendment 378 #
Proposal for a regulation Article 6 – paragraph 1 – point b
Amendment 379 #
Proposal for a regulation Article 6 – paragraph 1 – point b (b) take reasonable and effective measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children leading to online child sexual abuse;
Amendment 380 #
Proposal for a regulation Article 6 – paragraph 1 – point c
Amendment 381 #
Proposal for a regulation Article 6 – paragraph 1 – point c (c) take the necessary age
Amendment 382 #
Proposal for a regulation Article 6 – paragraph 2
Amendment 383 #
Proposal for a regulation Article 6 – paragraph 3
Amendment 384 #
Proposal for a regulation Article 6 – paragraph 4
Amendment 385 #
Proposal for a regulation Article 6 a (new) Article 6 a Security and confidentiality of communications Nothing in this Regulation shall be construed as prohibiting, restricting or undermining the provision or the use of encrypted services. Member States shall not prevent or discourage providers of relevant information society services from offering encrypted services.
Amendment 386 #
Proposal for a regulation Article 6 a (new) Article 6 a Encrypted services Nothing in this Regulation shall be construed as prohibiting, restricting or undermining the provision or the use of encrypted services. Providers of information society services shall be neither deterred nor prevented by relevant public authorities from offering encrypted services.
Amendment 387 #
Proposal for a regulation Article 6 a (new) Article 6 a Encrypted services Member States shall not prevent providers of relevant information society services from offering encrypted services. However, when offering such services, providers shall ensure that they process metadata in order to detect known child sexual abuse material.
Amendment 388 #
Proposal for a regulation Article 6 a (new) Article 6 a Security of communications and services Nothing in this regulation shall be construed as encouraging the prohibition, restriction, circumvention or undermining of the provision or the use of encrypted services.
Amendment 389 #
Proposal for a regulation Article 6 b (new) Article 6 b Support for micro, small and medium-sized enterprises The Commission shall be empowered to adopt delegated acts in accordance with Article 86 in order to supplement this Regulation with guidelines that foresee practical support for micro, small and medium-sized enterprises in order for them to be able to fulfil the obligations of this Regulation.
Amendment 391 #
Proposal for a regulation Chapter II – Section 2 – title 2
Amendment 394 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power, as a last resort, when all the measures in Articles 3, 4 and 5 have been exhausted, to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue
Amendment 395 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power, as a last resort, when all the measures in Articles 3, 4 and 5 have been exhausted, to
Amendment 396 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 397 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 398 #
Proposal for a regulation Article 7 – paragraph 1 1.
Amendment 399 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 The Coordinating Authority of establishment shall, before requesting the issuance of a detection order, carry out the investigations and assessments necessary to determine whether the conditions of paragraph 4 have been met. Detection orders issued by the coordinating authorities shall serve as a measure of last resort, only enacted when all mitigating measures, including voluntary ones, have proven unsuccessful.
Amendment 400 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1
Amendment 401 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 The Coordinating Authority of establishment shall, before requesting the issuance of a
Amendment 402 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – introductory part Where the Coordinating Authority of establishment takes the preliminary view that the conditions of paragraph 4 have been met and the measures envisaged in the detection order are proportionate, it shall:
Amendment 403 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – introductory part Where the Coordinating Authority of establishment takes the
Amendment 404 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point a (a) establish a draft request for the issuance of a
Amendment 405 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point b (b) submit the draft request to the concerned provider and the EU Centre;
Amendment 406 #
Amendment 407 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point d (d) invite the EU Centre to provide its opinion on the draft request, within a time period of
Amendment 408 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – introductory part Where, having regard to the
Amendment 409 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a (a) draft an implementation plan setting out the measures it envisages taking to execute the intended detection order, including detailed information regarding the envisaged technologies and safeguards and their negative impacts on the rights of all parties involved, including the users of the service;
Amendment 410 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a (a) draft an implementation plan setting out the incident that the authority intends to investigate, the measures it envisages taking to execute the intended
Amendment 411 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b)
Amendment 412 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b) where the draft implementation plan concerns an intended
Amendment 413 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b)
Amendment 414 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the data protection impact assessment and in order to
Amendment 415 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point d (d) submit to that Coordinating Authority the implementation plan, where applicable attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted
Amendment 416 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall submit the request for the issuance of the detection order, adjusted where appropriate, to the competent judicial
Amendment 417 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and taking utmost account of the opinion of the data protection authority, that Coordinating Authority
Amendment 418 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall submit the request for the issuance of the detection order, adjusted where appropriate, to the competent judicial
Amendment 419 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority
Amendment 420 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority
Amendment 421 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part Based on a reasoned justification, the Coordinating Authority of establishment shall request the issuance of the
Amendment 422 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a (a) there is evidence of a s
Amendment 423 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b (b) the reasons for issuing the detection order outweigh the negative consequences for the rights and legitimate interests of all parties affected, including all users where the implementation plan would undermine the structure processing the interpersonal communications, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties.
Amendment 424 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b (b) the
Amendment 425 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b a (new) (b a) the voluntary measures applied as mitigating measures have not proven successful in preventing the misuse of the service for child sexual abuse.
Amendment 426 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2
Amendment 427 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point a
Amendment 428 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point b
Amendment 429 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point c
Amendment 430 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point d
Amendment 431 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 3
Amendment 432 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 3 As regards the second subparagraph, point (d), where that Coordinating Authority substantially deviates from the opinion of the EU Centre, it shall inform the EU Centre and the Commission thereof, specifying in detail the points at which it deviated and the main reasons for the deviation.
Amendment 433 #
Proposal for a regulation Article 7 – paragraph 5 – introductory part 5. As regards
Amendment 434 #
Proposal for a regulation Article 7 – paragraph 5 – introductory part 5. As regards detection orders concerning the dissemination of known child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall
Amendment 435 #
Proposal for a regulation Article 7 – paragraph 5 – point a (a) it is likely, despite any mitigation measures that the provider may have taken
Amendment 436 #
Proposal for a regulation Article 7 – paragraph 5 – point b (b) there is evidence of the service
Amendment 437 #
Proposal for a regulation Article 7 – paragraph 5 – point b (b) there is evidence of the service,
Amendment 438 #
Proposal for a regulation Article 7 – paragraph 6
Amendment 439 #
Proposal for a regulation Article 7 – paragraph 6 – introductory part 6. As regards detection orders concerning the dissemination of new child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall
Amendment 440 #
Proposal for a regulation Article 7 – paragraph 6 – point a (a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the dissemination of new child sexual abuse material, including live stream and live transmission;
Amendment 441 #
Proposal for a regulation Article 7 – paragraph 6 – point b (b) there is evidence of the service
Amendment 442 #
Proposal for a regulation Article 7 – paragraph 7 Amendment 443 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – introductory part As regards detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall
Amendment 444 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – point c (c) there is evidence of the service
Amendment 445 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary to effectively address the significant risk referred to in point (a) thereof. To the greatest extent possible, the detection order should be targeted against users who can be reasonably suspected of distributing child sexual abuse material.
Amendment 446 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the judicial validation and the issuance of
Amendment 447 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial
Amendment 448 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent
Amendment 449 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 2 To that
Amendment 450 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point b (b) where necessary, in particular to limit such negative consequences, effective
Amendment 451 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point b a (new) (b a) under no circumstances shall the detection order require providers of interpersonal communications services to access the content of communications or make provision for methods to access these communications or to compromise their encryption;
Amendment 452 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority or independent administrative authority shall specify in the detection order the period during which it applies, indicating the start date and the end date, within which the providers of hosting services and providers of interpersonal communications services shall prove that their service is no longer used for child sexual abuse.
Amendment 453 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority
Amendment 454 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority
Amendment 455 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority
Amendment 456 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 2 The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the
Amendment 457 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 The period of application of
Amendment 458 #
Amendment 459 #
Proposal for a regulation Article 8 – title Additional rules regarding
Amendment 460 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 461 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 462 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 463 #
Proposal for a regulation Article 8 – paragraph 1 – point a (a) information regarding the measures to be taken to execute the
Amendment 464 #
Proposal for a regulation Article 8 – paragraph 1 – point b (b) identification details of the competent judicial authority
Amendment 465 #
Proposal for a regulation Article 8 – paragraph 1 – point b (b) identification details of the competent judicial authority
Amendment 466 #
Proposal for a regulation Article 8 – paragraph 1 – point b
Amendment 467 #
Proposal for a regulation Article 8 – paragraph 1 – point c (c) the name of the provider and, where applicable, its legal representative, without prejudice to the issuance of detection orders where the legal name of the provider is not readily ascertained;
Amendment 468 #
Proposal for a regulation Article 8 – paragraph 1 – point d (d) the specific service in respect of which the
Amendment 469 #
Proposal for a regulation Article 8 – paragraph 1 – point e (e) whether the
Amendment 470 #
Proposal for a regulation Article 8 – paragraph 1 – point f (f) the start date and the end date of the
Amendment 471 #
Proposal for a regulation Article 8 – paragraph 1 – point g (g) a
Amendment 472 #
Proposal for a regulation Article 8 – paragraph 1 – point h (h) the factual and legal grounds justifying the issuing of the order, and a reference to this Regulation as the legal basis for the detection order;
Amendment 473 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 474 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 475 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 476 #
Proposal for a regulation Article 8 – paragraph 1 – point j (j) easily understandable information about the redress available to the addressee of the
Amendment 477 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 478 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 479 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 480 #
The
Amendment 481 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 3 The
Amendment 482 #
Proposal for a regulation Article 8 – paragraph 3 3. If the provider cannot execute the
Amendment 483 #
Proposal for a regulation Article 8 a (new)
Amendment 484 #
Proposal for a regulation Article 8 c (new) Article 8 c Notification mechanism
1. Providers of hosting services and providers of interpersonal communication services shall establish mechanisms that allow users to notify them of the presence on their service of specific items or activities that the user considers to be potential child sexual abuse material, in particular previously unknown child sexual abuse material and solicitation of children. Those mechanisms shall be easy to access, user-friendly and child-friendly, and shall allow for the submission of notices exclusively by electronic means.
2. Where the notice contains the electronic contact information of the user who submitted it, the provider shall without undue delay send a confirmation of receipt to the user.
3. Providers shall ensure that such notices are processed without undue delay.
Amendment 486 #
Proposal for a regulation Article 9 – title Redress, information, reporting and modification of
Amendment 487 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services that have received a
Amendment 488 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of interpersonal communications services that have received a detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the detection order before the courts of the Member State of the competent judicial authority
Amendment 489 #
Proposal for a regulation Article 9 – paragraph 1 a (new) 1a. Exercising the right to recourse under paragraph 1 shall suspend execution of the detection order.
Amendment 490 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the detection order becomes final, the competent judicial authority
Amendment 491 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the detection order becomes final, the competent judicial authority
Amendment 492 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the
Amendment 493 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 2 For the purpose of the first subparagraph, a
Amendment 494 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 1 Where the period of application of the
Amendment 495 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 2 Those reports shall include a detailed description of the measures taken to execute the
Amendment 496 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the
Amendment 497 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the detection orders that the competent judicial authority
Amendment 498 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the detection orders that the competent judicial authority
Amendment 499 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 500 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 501 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 504 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of publicly available number- independent interpersonal communication services that have received a
Amendment 505 #
Proposal for a regulation Article 10 – paragraph 2 2. The provider shall be entitled to acquire, install and operate, free of charge, technologies specified in the orders and made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the
Amendment 506 #
Proposal for a regulation Article 10 – paragraph 3 – introductory part 3. The technologies specified in the investigation orders shall be:
Amendment 507 #
Proposal for a regulation Article 10 – paragraph 3 – point a (a) effective in
Amendment 508 #
Proposal for a regulation Article 10 – paragraph 3 – point b (b) not be able to extract any other information from the relevant communications than the information strictly necessary to
Amendment 509 #
Proposal for a regulation Article 10 – paragraph 3 – point b a (new) (ba) respect the confidentiality of communications enshrined in Article 7 of the Charter of Fundamental Rights of the European Union and Article 8 of the Convention for the Protection of Human Rights and Fundamental Freedoms;
Amendment 510 #
Proposal for a regulation Article 10 – paragraph 3 – point c (c) in accordance with the technological state of the art
Amendment 511 #
Proposal for a regulation Article 10 – paragraph 3 – point d (d) sufficiently reliable, in that they limit to the maximum extent possible the rate of errors regarding the
Amendment 512 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (d a) effective in setting up a reliable age-based filter that verifies the age of users and effectively prevents the access of child users to websites subject to online child sexual abuse and child sexual abuse offences.
Amendment 513 #
Proposal for a regulation Article 10 – paragraph 4 – introductory part 4. The
Amendment 514 #
Proposal for a regulation Article 10 – paragraph 4 – point a (a) take all the necessary measures to ensure that the technologies
Amendment 515 #
Proposal for a regulation Article 10 – paragraph 4 – point a (a) take all the necessary and proportionate measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly limited to what is necessary to execute the detection orders addressed to them;
Amendment 516 #
Proposal for a regulation Article 10 – paragraph 4 – point b (b)
Amendment 517 #
Proposal for a regulation Article 10 – paragraph 4 – point c (c) include in investigation orders specific obligations on providers to ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, human intervention;
Amendment 518 #
Proposal for a regulation Article 10 – paragraph 4 – point c (c) ensure regular human oversight as necessary to ensure that the technologies operate
Amendment 519 #
Proposal for a regulation Article 10 – paragraph 4 – point c a (new) (c a) ensure privacy and safety by design and by default and, where applicable, the protection of encryption.
Amendment 520 #
Proposal for a regulation Article 10 – paragraph 4 – point d (d) establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of
Amendment 521 #
Proposal for a regulation Article 10 – paragraph 4 – point d (d) establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, the refusal of removal, especially of self-generated CSAM, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
Amendment 522 #
Proposal for a regulation Article 10 – paragraph 4 – point e (e) inform the Coordinating Authority, as appropriate, at the latest one month before the start date specified in the
Amendment 523 #
Proposal for a regulation Article 10 – paragraph 4 – point e a (new) (e a) ensure safety-by-design tools such as parental control tools and effective age verification tools.
Amendment 524 #
Proposal for a regulation Article 10 – paragraph 4 – point e a (new) (e a) ensure privacy and safety by design and by default and, where applicable, the protection of encryption;
Amendment 525 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point a
Amendment 526 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point b
Amendment 527 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 2
Amendment 528 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 2 The provider shall not provide information to users that may reduce the effectiveness of the measures to execute the
Amendment 529 #
Proposal for a regulation Article 10 – paragraph 6
Amendment 530 #
Proposal for a regulation Article 11
Amendment 531 #
Proposal for a regulation Article 11 – title Guidelines regarding
Amendment 532 #
Proposal for a regulation Article 11 – paragraph 1 The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue
Amendment 533 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of
Amendment 534 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of any information indicating potential online child sexual abuse on its services, it shall promptly submit a report thereon to the
Amendment 535 #
Proposal for a regulation Article 12 – paragraph 1 a (new) 1 a. Where a provider of hosting services or a provider of interpersonal communications services receives a report by the public through, among others, a trusted hotline, it shall process and analyse the report in a timely and effective manner so as to assess an imminent risk of misuse of the service for child sexual abuse, without prejudice to the obligation to report to the EU Centre pursuant to paragraph 1.
Amendment 536 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 1 Where the provider submits a report pursuant to paragraph 1, it shall
Amendment 537 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 2
Amendment 538 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 3
Amendment 539 #
Proposal for a regulation Article 12 – paragraph 2 a (new) 2 a. The report submitted by the provider pursuant to paragraph 2 shall never contain information about the source of the report, especially when it stems from the person to whom the material relates.
Amendment 540 #
Proposal for a regulation Article 12 – paragraph 3 3.
Amendment 541 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to flag to the provider potential online child sexual abuse on the service, including child-friendly mechanisms of self- generated content self-reporting.
Amendment 542 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible, effective, age- appropriate and user-friendly mechanism that allows users to flag to the provider potential online child sexual abuse on the service.
Amendment 543 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to easily flag to the provider potential online child sexual abuse on the service.
Amendment 544 #
Proposal for a regulation Article 13 – paragraph 1 – introductory part 1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 545 #
Proposal for a regulation Article 13 – paragraph 1 – point c (c) all
Amendment 546 #
Proposal for a regulation Article 13 – paragraph 1 – point d (d) a list of all available data other than content data related to the potential online child sexual abuse preserved in line with the preservation order in Article 8a;
Amendment 547 #
Proposal for a regulation Article 13 – paragraph 1 – point d a (new) (d a) a list of all traffic data and metadata retained in relation to the potential online child sexual abuse, which could be made available to law enforcement authorities, together with information concerning default storage periods.
Amendment 548 #
Proposal for a regulation Article 13 – paragraph 1 – point e
Amendment 549 #
Proposal for a regulation Article 13 – paragraph 1 – point f
Amendment 550 #
Proposal for a regulation Article 13 – paragraph 1 – point g
Amendment 551 #
Proposal for a regulation Article 13 – paragraph 1 – point i (i) where the
Amendment 552 #
Proposal for a regulation Article 13 – paragraph 1 – point j (j)
Amendment 553 #
Proposal for a regulation Article 14 – paragraph 1 1.
Amendment 554 #
Proposal for a regulation Article 14 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 555 #
Proposal for a regulation Article 14 – paragraph 1 a (new) 1 a. Before issuing a removal order, the Coordinating Authority of establishment shall take all reasonable steps to ensure that implementing the order will not interfere with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
Amendment 556 #
Proposal for a regulation Article 14 – paragraph 2
Amendment 557 #
Proposal for a regulation Article 14 – paragraph 2 2. The provider shall execute the removal order as soon as possible and in any event within no more than 24 hours of receipt thereof.
Amendment 558 #
Proposal for a regulation Article 14 – paragraph 3 – introductory part 3. The competent judicial authority
Amendment 559 #
Proposal for a regulation Article 14 – paragraph 3 – point a (a) identification details of the judicial
Amendment 560 #
Proposal for a regulation Article 14 – paragraph 3 – point b (b) the name of the provider and, where applicable, of its legal representative, without prejudice to the issuance of removal orders where the legal name of the provider is not readily ascertained;
Amendment 561 #
Proposal for a regulation Article 14 – paragraph 3 – point c
Amendment 562 #
Proposal for a regulation Article 14 – paragraph 3 – point h (h) the date, time stamp and electronic signature of the judicial
Amendment 563 #
Proposal for a regulation Article 14 – paragraph 3 a (new) 3 a. Providers of hosting services or providers of interpersonal communication services shall be encouraged to extend the effect of the order regarding one or more specific items of material, referred to in paragraph 1, to any provider or services under their control and promptly inform the Coordinating Authority of establishment of this specific measure.
Amendment 564 #
Proposal for a regulation Article 15 – paragraph 1 1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to an effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority
Amendment 565 #
Proposal for a regulation Article 15 – paragraph 1 a (new)
Amendment 566 #
Proposal for a regulation Article 15 – paragraph 2 – subparagraph 1 When the removal order becomes final, the competent judicial authority
Amendment 567 #
Proposal for a regulation Article 15 – paragraph 3 – point b (b) the reasons for the removal or disabling, providing a copy of the removal order
Amendment 568 #
Proposal for a regulation Article 15 – paragraph 4
Amendment 569 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 1 The Coordinating Authority of establishment may request, when requesting the judicial authority
Amendment 570 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 2 – point a (a) the judicial authority
Amendment 571 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 2 – point c (c) that judicial authority
Amendment 572 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 3 That judicial authority
Amendment 573 #
Proposal for a regulation Article 15 a (new) Article 15 a Delisting orders
1. The competent authority shall have the power to issue an order requiring a provider of online search engines under the jurisdiction of that Member State to take reasonable measures to delist a Uniform Resource Locator corresponding to online locations where child sexual abuse material can be found from appearing in search results.
2. The provider shall execute the delisting order without undue delay. The provider shall take the necessary measures to ensure that it is capable of reinstating the Uniform Resource Locator to appear in search results.
3. Before issuing a delisting order, the issuing authority shall inform the provider, if necessary via the Coordinating Authority, of its intention to do so, specifying the main elements of the content of the intended delisting order and the reasons for its intention. It shall afford the provider an opportunity to comment on that information within a reasonable time period set by that authority.
4. A delisting order shall be issued where the following conditions are met:
(a) the delisting is necessary to prevent the dissemination of the child sexual abuse material in the Union, having regard in particular to the need to protect the rights of the victims;
(b) all necessary investigations and assessments, including of search results, have been carried out to ensure that the Uniform Resource Locator to be delisted corresponds, in a sufficiently reliable manner, to online locations where child sexual abuse material can be found.
5. The issuing authority shall specify in the delisting order the period during which it applies, indicating the start date and the end date. The period of application of delisting orders shall not exceed five years.
6. The Coordinating Authority or the issuing authority shall, where necessary and at least once every year, assess whether any substantial changes to the grounds for issuing the delisting orders have occurred and whether the conditions of paragraph 4 continue to be met.
Amendment 574 #
Proposal for a regulation Article 15 b (new) Article 15 b Redress and provision of information
1. Providers of online search engines that have received a delisting order shall have a right to effective redress. That right shall include the right to challenge the delisting order before the courts of the Member State of the authority that issued the delisting order.
2. If the order is modified or repealed as a result of a redress procedure, the provider shall immediately reinstate the delisted Uniform Resource Locator to appear in search results.
3. When the delisting order becomes final, the issuing authority shall, without undue delay, transmit a copy thereof to the Coordinating Authority. The Coordinating Authority shall then, without undue delay, transmit copies thereof to all other Coordinating Authorities and the EU Centre through the system established in accordance with Article 39(2). For the purpose of the first subparagraph, a delisting order shall become final upon the expiry of the time period for appeal where no appeal has been lodged in accordance with national law, or upon confirmation of the delisting order following an appeal.
4. Where a provider prevents users from obtaining search results for child sexual abuse material corresponding to a Uniform Resource Locator pursuant to a delisting order, it shall take reasonable measures to inform those users of the following:
(a) the fact that it does so pursuant to a delisting order;
(b) the right of providers of delisted Uniform Resource Locators corresponding to blocked online locations to judicial redress referred to in paragraph 1, and the users' right to submit complaints to the Coordinating Authority in accordance with Article 34.
Amendment 575 #
Proposal for a regulation Article 19
Amendment 576 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation
Amendment 577 #
Proposal for a regulation Article 19 – paragraph 1
Amendment 578 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services and, where applicable, cloud computing services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 579 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services and, where applicable, cloud computing services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask and receive any information relating to such support in a manner accessible to them.
Amendment 580 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters
Amendment 581 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to efficiently handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available and communicate it to the EU Centre. They shall keep that information updated.
Amendment 582 #
Proposal for a regulation Article 25 – paragraph 7 – introductory part 7. Coordinating Authorities may, where necessary for the performance of their tasks under this Regulation, request the assistance of the EU Centre in carrying out those tasks
Amendment 583 #
Proposal for a regulation Article 25 – paragraph 7 – point a
Amendment 584 #
Proposal for a regulation Article 25 – paragraph 7 – point b
Amendment 585 #
Proposal for a regulation Article 25 – paragraph 7 – point c
Amendment 586 #
Proposal for a regulation Article 25 – paragraph 7 – point d
Amendment 587 #
Proposal for a regulation Article 25 – paragraph 8 8. The EU Centre shall provide such assistance without undue delay, free of charge and in accordance with its tasks and obligations under this Regulation and insofar as its resources and priorities allow.
Amendment 588 #
Proposal for a regulation Article 25 – paragraph 8 8. The EU Centre shall provide such assistance free of charge and in accordance with its tasks and obligations under this Regulation
Amendment 589 #
Proposal for a regulation Article 26 – paragraph 1 1. Member States shall ensure that the Coordinating Authorities that they designated perform their tasks under this Regulation in an objective, impartial, transparent and timely manner, while fully respecting the fundamental rights of all parties affected. Member States shall
Amendment 590 #
Proposal for a regulation Article 26 – paragraph 2 – introductory part 2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the
Amendment 591 #
Proposal for a regulation Article 26 – paragraph 2 – point a
Amendment 592 #
Proposal for a regulation Article 26 – paragraph 2 – point a
Amendment 593 #
Proposal for a regulation Article 26 – paragraph 2 – point d (d) neither seek nor take instructions from any
Amendment 594 #
Proposal for a regulation Article 26 – paragraph 2 – point e
Amendment 595 #
Proposal for a regulation Article 26 – paragraph 2 – point e
Amendment 596 #
Proposal for a regulation Article 26 – paragraph 3
Amendment 597 #
Proposal for a regulation Article 26 – paragraph 3 3. Paragraph 2 shall not prevent supervision of the Coordinating Authorities in accordance with national constitutional law
Amendment 598 #
Proposal for a regulation Article 26 – paragraph 4 4. The Coordinating Authorities shall ensure that relevant members of staff have the required qualifications, experience, integrity and technical skills to perform their duties.
Amendment 599 #
Proposal for a regulation Article 26 – paragraph 5 5. Without prejudice to national or Union legislation on whistleblower protection, the management and other staff of the Coordinating Authorities shall, in accordance with Union or national law, be subject to a duty of professional secrecy both during and after their term of office, with regard to any confidential information which has come to their knowledge in the course of the performance of their tasks. Member States shall ensure that the management and other staff are subject to rules guaranteeing that they can carry out their tasks in an objective, impartial and independent manner, in particular as regards their appointment, dismissal, remuneration and career prospects.
Amendment 600 #
Proposal for a regulation Article 27 – paragraph 1 – point a (a) the power to require those providers, as well as any other persons
Amendment 601 #
Proposal for a regulation Article 27 – paragraph 1 – point b (b) the power to carry out on-site inspections of any premises that those providers or the other persons referred to in point (a) use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement of this Regulation in any form, irrespective of the storage medium, excluding content protected by confidentiality of correspondence for which authorisation by a judicial authority is required;
Amendment 602 #
Proposal for a regulation Article 27 – paragraph 1 – point b (b) the power to carry out remote or on-site
Amendment 603 #
Proposal for a regulation Article 27 – paragraph 1 – point d (d) the power to request information,
Amendment 604 #
Proposal for a regulation Article 28 – paragraph 1 – point b (b) the power to order specific measures to bring about the cessation of infringements of this Regulation and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end;
Amendment 605 #
Proposal for a regulation Article 29 – paragraph 1 – point b (b) the infringement persists; and
Amendment 606 #
Proposal for a regulation Article 29 – paragraph 2 – point a – point i (i) adopt and submit an action plan setting out the necessary measures to terminate the infringement, subject to the approval of the Coordinating Authority;
Amendment 607 #
Proposal for a regulation Article 29 – paragraph 2 – point b – introductory part (b) request the competent judicial authority
Amendment 608 #
Proposal for a regulation Article 29 – paragraph 2 – point b – point ii (ii) the infringement persists and causes serious harm that is greater than the likely harm to users relying on the service for legal purposes; and
Amendment 609 #
Proposal for a regulation Article 29 – paragraph 4 – subparagraph 3 – point a (a) the provider has failed to take
Amendment 610 #
Proposal for a regulation Article 30 – paragraph 2 2. Member States shall ensure that any exercise of the investigatory and enforcement powers referred to in Articles
Amendment 611 #
Proposal for a regulation Article 31 – paragraph 1 Coordinating Authorities shall have the power to carry out searches on publicly accessible material on hosting services to detect the dissemination of known or new child sexual abuse material, using the indicators contained in the databases referred to in Article 44(1), points (a) and (b)
Amendment 612 #
Proposal for a regulation Article 32
Amendment 613 #
Proposal for a regulation Article 32 a (new) Article 32 a Public awareness campaigns Coordinating Authorities shall, in cooperation with the EU Centre, regularly carry out public awareness campaigns to inform about measures to prevent and combat child sexual abuse online and offline, about how to seek child-friendly and age-appropriate reporting and assistance, and about victims' rights.
Amendment 614 #
Proposal for a regulation Article 34 – paragraph 2 a (new) 2a. Where national law does not grant a minor the legal capacity to lodge a complaint, his or her legal representative may do so on his or her behalf.
Amendment 615 #
Proposal for a regulation Article 35 – paragraph 2 2. Member States shall ensure that the maximum amount of penalties imposed for an infringement of this Regulation shall not exceed 6 % of the annual
Amendment 616 #
Proposal for a regulation Article 35 – paragraph 3 3. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify
Amendment 617 #
Proposal for a regulation Article 35 – paragraph 4 4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily
Amendment 618 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – introductory part Coordinating Authorities shall submit to the EU Centre, without undue delay and through the system established in accordance with Article 39(2), the evidence gathered through the procedures provided for in this Regulation:
Amendment 619 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point a (a) anonymised specific items of material and transcripts of conversations related to a specific person, specific group of people, or specific incident that Coordinating Authorities or that the competent judicial authorities or other independent administrative authorities of a Member State have identified, after a diligent assessment, as constituting child sexual abuse material or the solicitation of children, as applicable, for the EU Centre to generate indicators in accordance with Article 44(3);
Amendment 620 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point b (b) exact uniform resource locators indicating specific items of material related to a specific person, specific group of people, or specific incident that Coordinating Authorities or that competent judicial authorities or other independent administrative authorities of a Member State have identified, after a diligent assessment, as constituting child sexual abuse material, hosted by providers of hosting services not offering services in the Union, that cannot be removed due to those providers’ refusal to remove or disable access thereto and to the lack of cooperation by the competent authorities of the third country having jurisdiction, for
Amendment 621 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 2 Member States shall take the necessary measures to ensure that the Coordinating Authorities that they designated receive, without undue delay, the encrypted copies of the material identified as child sexual abuse material, the transcripts of conversations related to a specific person, specific group of people, or specific incident identified as the solicitation of children, and the uniform resource locators, identified by a competent judicial authority or other independent administrative authority than the Coordinating Authority, for submission to the EU Centre in accordance with the first subparagraph.
Amendment 622 #
Proposal for a regulation Article 37 – paragraph 1 – subparagraph 2 Where
Amendment 623 #
Proposal for a regulation Article 37 – paragraph 2 – point c (c) any other information that the Coordinating Authority that sent the request, or the Commission, considers relevant, including, where appropriate, information gathered on its own initiative
Amendment 624 #
Proposal for a regulation Article 37 – paragraph 3 – subparagraph 1 The Coordinating Authority of establishment shall assess the suspected infringement, taking into utmost account the request
Amendment 625 #
Proposal for a regulation Article 37 – paragraph 3 – subparagraph 2 Where it considers that it has insufficient information to assess the suspected infringement or to act upon the request
Amendment 626 #
Proposal for a regulation Article 37 – paragraph 4 4. The Coordinating Authority of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation referred to in paragraph 1, communicate to the Coordinating Authority that sent the request, or the Commission, the outcome of its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and, where applicable,
Amendment 627 #
Proposal for a regulation Article 38 – paragraph 2 a (new) 2 a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of online child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to efficiently detect, remove and block content and coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
Amendment 628 #
Proposal for a regulation Article 39 – paragraph 1 1. Coordinating Authorities shall efficiently cooperate with each other, any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre
Amendment 629 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall establish and maintain one or more reliable and secure information sharing systems supporting communications between Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services.
Amendment 630 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall establish and maintain one or more reliable and secure information sharing systems supporting communications between Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services.
Amendment 631 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall establish and maintain one or more reliable and secure information sharing systems supporting communications between Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies, hotlines and providers of relevant information society services.
Amendment 632 #
Proposal for a regulation Article 39 – paragraph 3 3. The Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation.
Amendment 633 #
Proposal for a regulation Article 39 – paragraph 3 3. The Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation.
Amendment 634 #
Proposal for a regulation Article 39 – paragraph 3 3. The Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies, hotlines and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation.
Amendment 635 #
Proposal for a regulation Article 39 – paragraph 3 a (new) 3 a. Where the EU Centre receives a report from a hotline, or where a provider that submitted the report to the EU Centre has indicated that the report is based on the information received from a hotline, the EU Centre shall coordinate with the relevant Coordinating Authorities in order to avoid duplicated reporting on the same material that has already been reported to the national law enforcement authorities by the hotlines, and shall monitor the removal of the child sexual abuse material or cooperate with the relevant hotline to track the status.
Amendment 636 #
Proposal for a regulation Article 55 – paragraph 1 – point d a (new) (d a) a Survivors’ Advisory Board as an advisory group, which shall exercise the tasks set out in Article 66a (new).
Amendment 637 #
Proposal for a regulation Article 57 – paragraph 1 – point c (c) adopt rules for the prevention and management of conflicts of interest in respect of its members, as well as for the members of the Technological Committee and of
Amendment 638 #
Proposal for a regulation Article 57 – paragraph 1 – point f (f) appoint the members of the Technology Committee
Amendment 639 #
Proposal for a regulation Article 57 – paragraph 1 – point h a (new) (h a) consult the Survivors’ Advisory Board as regards the obligations referred to in points (a) and (h) of this Article.
Amendment 640 #
Proposal for a regulation Article 66 a (new)
Amendment 641 #
Proposal for a regulation Article 83 – paragraph 1 – introductory part 1. Providers of hosting services, providers of publicly available number- independent interpersonal communications services and
Amendment 642 #
Proposal for a regulation Article 83 – paragraph 1 – point a – introductory part (a) where the provider has been subject to a
Amendment 643 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 1 — the measures taken to comply with the order,
Amendment 644 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 2 — the
Amendment 645 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 3 — in relation to complaints and cases submitted by users in connection to the measures taken to comply with the order, the number of complaints submitted directly to the provider, the number of cases brought before a judicial authority, the basis for those complaints and cases, the decisions taken in respect of those complaints and in those cases, the
Amendment 646 #
Proposal for a regulation Article 83 – paragraph 1 – point b (b) the number of removal orders issued to the provider in accordance with Article 14 and the average time
Amendment 647 #
Proposal for a regulation Article 83 – paragraph 1 – point b (b) the number of removal orders issued to the provider in accordance with Article 14 and the
Amendment 648 #
Proposal for a regulation Article 83 – paragraph 1 – point b a (new) (b a) the number and duration of delays to removals as a result of requests from competent authorities or law enforcement authorities;
Amendment 649 #
Proposal for a regulation Article 83 – paragraph 1 – point c (c) the total number of items of child sexual abuse material that the provider removed or to which it disabled access, broken down by whether the items were removed or access thereto was disabled pursuant to a removal order or to a notice submitted by a judicial authority, Competent Authority, the EU Centre
Amendment 650 #
Proposal for a regulation Article 83 – paragraph 1 – point c a (new) (c a) the number of instances the provider was asked to provide additional support to law enforcement authorities in relation to content that was removed;
Amendment 651 #
Proposal for a regulation Article 83 – paragraph 1 – point d
Amendment 652 #
Proposal for a regulation Article 83 – paragraph 2 – introductory part 2. The Coordinating Authorities shall collect data on the following topics and make that information publicly available, redacting operationally sensitive data as appropriate and providing an unredacted version to the EU Centre
Amendment 653 #
Proposal for a regulation Article 83 – paragraph 2 – point a – indent 4 a (new) - the nature of the report and its key characteristics, such as whether the security of the hosting service was allegedly breached;
Amendment 654 #
Proposal for a regulation Article 83 – paragraph 2 – point b (b) the most important and recurrent risks of online child sexual abuse encountered, as reported by providers of hosting services and providers of publicly available number-independent interpersonal communications services in accordance with Article 3 or identified through other information available to the Coordinating Authority;
Amendment 655 #
Proposal for a regulation Article 83 – paragraph 2 – point c (c) a list of the providers of hosting services and providers of interpersonal communications services to which the Coordinating Authority addressed a
Amendment 656 #
Proposal for a regulation Article 83 – paragraph 2 – point d (d) the number of
Amendment 657 #
Proposal for a regulation Article 83 – paragraph 2 – point f (f) the number of removal orders issued in accordance with Article 14, broken down by provider, the time needed to remove or disable access to the item or items of child sexual abuse material concerned, including the time it took the Coordinating Authority to process the order, and the number of instances in which the provider invoked Article 14(5) and (6);
Amendment 658 #
Proposal for a regulation Article 83 – paragraph 2 – point g
Amendment 659 #
Proposal for a regulation Article 83 – paragraph 3 – introductory part 3. The EU Centre shall collect data and generate statistics on the
Amendment 660 #
Proposal for a regulation Article 83 – paragraph 3 – point a (a) the number of indicators in the databases of indicators referred to in Article 44 and the
Amendment 661 #
Proposal for a regulation Article 83 – paragraph 3 – point b (b) the number of submissions of child sexual abuse material and solicitation of children referred to in Article 36(1), broken down by Member State that designated the submitting Coordinating Authorities, and,
Amendment 662 #
Proposal for a regulation Article 83 – paragraph 3 – point c (c) the total number of reports submitted to the EU Centre in accordance with Article 12, broken down by provider of hosting services and provider of publicly available number-independent interpersonal communications services that submitted the report and by Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3);
Amendment 663 #
Proposal for a regulation Article 83 – paragraph 3 – point d (d) the online child sexual abuse to which the reports relate, including the number of items of potential
Amendment 664 #
Proposal for a regulation Article 83 – paragraph 3 – point e (e) the number of reports that the EU Centre considered unfounded or manifestly unfounded, as referred to in Article 48(2);
Amendment 665 #
Proposal for a regulation Article 83 – paragraph 3 – point f (f) the number of reports relating to potential
Amendment 666 #
Proposal for a regulation Article 83 – paragraph 3 – point h (h) where materially the same item of potential child sexual abuse material was reported more than once to the EU Centre in accordance with Article 12 or detected more than once through the searches in accordance with Article 49(1), the number of times that that item was reported or detected in that manner.
Amendment 667 #
Proposal for a regulation Article 83 – paragraph 4 4. The providers of hosting services, providers of interpersonal communications services and providers of internet access services, the Coordinating Authorities and the EU Centre shall ensure that the data
Amendment 668 #
Proposal for a regulation Article 83 – paragraph 5 5. They shall ensure that the data is stored in a secure manner and that the storage is subject to appropriate technical
Amendment 669 #
Proposal for a regulation Article 84 – paragraph 1 a (new) 1 a. The annual report shall also include the number of users affected by detection and removal orders.
Amendment 670 #
Proposal for a regulation Article 85 – paragraph 1 1. By [five years after the entry into force of this Regulation], and every five years thereafter, the Commission shall evaluate this Regulation and submit a report on its application to the European Parliament and the Council. This report shall address in particular the possible use of new technologies for a safe and trusted processing of personal and other data and for the purpose of combating online child sexual abuse and in particular to detect, report and remove online child sexual abuse. The report shall be accompanied, where appropriate, by a legislative proposal.
source: 745.291
2023/03/28
BUDG
37 amendments...
Amendment 57 #
Proposal for a regulation Recital 1 a (new) (1 a) The role of prevention should be emphasised by equipping children, parents and caregivers with the necessary instruments to develop situational awareness of the online environment, evaluate potential risks and support children in staying safe online. In this regard, education facilities should play a greater role, which is why civic education classes should also provide for the attainment of safe internet skills for children.
Amendment 58 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat online child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology- neutral and future-
Amendment 59 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question in a timely manner. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities.
Amendment 60 #
Proposal for a regulation Recital 1 a (new) (1 a) The role of prevention should be emphasised by equipping children, parents and caregivers with the necessary instruments to develop situational awareness of the online environment, evaluate potential risks and support children in staying safe online. In this regard, education facilities should play a greater role, which is why civic education classes should also provide for the attainment of safe internet skills for children.
Amendment 61 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat online child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology- neutral and future-
Amendment 62 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question in a timely manner. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities.
Amendment 63 #
Proposal for a regulation Recital 59 (59) To support the implementation of this Regulation and contribute to the achievement of its objectives, the EU Centre should serve as a central facilitator, carrying out a range of specific tasks. The performance of those tasks requires strong guarantees of independence, in particular from law enforcement authorities,
Amendment 64 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the
Amendment 65 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned. To this end, the EU Centre can also aid in the implementation of awareness campaigns and contribute to the establishment and improvement of specific guidelines and proposals for mitigation measures respectively, so as to ensure accurate and up-to-date solutions in tackling online child sexual abuse.
Amendment 66 #
Proposal for a regulation Recital 67 a (new) (67 a) In carrying out its mission, the EU Centre should also ensure transversal cooperation with education facilities, where appropriate, and digital education hubs, to also integrate this dimension of the prevention component, in order for children to become aware of the potential risks posed by the online environment.
Amendment 67 #
Proposal for a regulation Recital 67 b (new) (67 b) Considering the essential role teachers can play in guiding children on safely using information society services and detecting potentially malicious behaviour online, teacher training should be organized and implemented across the Union, in a coherent manner, benefitting from the knowledge and expertise of the EU Centre.
Amendment 68 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Furthermore, a special green line with a call centre assistance service will be constituted at EU level in order for victims and their families to receive support in a timely manner.
Amendment 69 #
Proposal for a regulation Recital 70 a (new)
Amendment 70 #
Proposal for a regulation Recital 72 a (new) (72 a) In view of ensuring an adequate degree of expertise and skills for investigative purposes, specialized training of law enforcement officers will be introduced with the support of the EU Centre, especially considering rapid technological advancements where new methods, techniques and instruments require adapting preventive and mitigation efforts regarding online child sexual abuse.
Amendment 71 #
Proposal for a regulation Recital 74 a (new) (74 a) In view of the need for a more effective EU Centre it is necessary to establish a Survivors' Advisory Board. Through the structured involvement of victims and former victims of sexualised violence, the EU Centre should serve as a platform to offer holistic support for the fight against child sexual abuse in all Member States. The Survivors’ Advisory Board may support the EU Centre’s activities to facilitate cross-border cooperation for existing national networks and the exchange of best practice. It may also raise awareness of child sexual abuse by serving as a knowledge platform through the coordination, collection and synthesis of research.
Amendment 72 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise, provided the request is reasonably necessary to support the risk assessment. The requests shall not constitute an administrative or economic burden for these enterprises.
Amendment 73 #
The Commission shall be empowered to adopt delegated acts as soon as possible in accordance with Article 86 in order to supplement this Regulation with the necessary detailed rules on the determination and charging of those costs and the application of the exemption for micro, small and medium-
Amendment 74 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide reasonable assistance, on request, in a timely manner, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 75 #
Proposal for a regulation Article 21 – paragraph 2 – point 1 (new)
Amendment 76 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point a (a) supporting the Commission in the preparation of the guidelines referred to in Article 3(8), Article 4(5), Article 6(4) and Article 11, including by collecting and providing relevant information, expertise and best practices, taking into account advice from the Technology Committee and the Survivors’ Advisory Board referred to in Articles 66 and 66a (new);
Amendment 77 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a (a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51, including in view of updating guidelines on prevention and mitigation methods for combatting child sexual abuse, especially for the digital dimension as per new technological developments;
Amendment 78 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (b a) contribute to the implementation of awareness campaigns as per the potential risks posed by the online environment to children, in order to equip them with adequate skills for detecting potential grooming and deceit, to ensure safe use of the internet by children and to better implement the prevention component of online child sexual abuse;
Amendment 79 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new) (b b) assisting with expertise and knowledge in the development and implementation of teacher training across the Union, in order to vest teachers with the necessary skills for guiding children on safely using information society services and detecting potentially malicious behaviour online;
Amendment 80 #
Proposal for a regulation Article 55 – paragraph 1 – point d a (new) (d a) a Survivors’ Advisory Board which shall exercise the tasks set out in Article 66a (new).
Amendment 81 #
Proposal for a regulation Article 56 – paragraph 4 4. Members of the Management Board and their alternates shall be appointed in the light of their knowledge in the field of combating child sexual abuse, taking into account relevant managerial, administrative and budgetary skills. Member States shall appoint a representative of their Coordinating Authority, within four months of [date of entry into force of this Regulation]. All parties represented in the Management Board shall make efforts to limit turnover of their representatives, in order to ensure continuity of its work. All parties sh
Amendment 82 #
Proposal for a regulation Article 57 – paragraph 1 – point c (c) adopt rules for the prevention and management of conflicts of interest in respect of its members, as well as for the members of the Technological Committee and of
Amendment 83 #
Proposal for a regulation Article 57 – paragraph 1 – point f (f) appoint the members of the Technology Committee, and of
Amendment 84 #
Proposal for a regulation Article 57 – paragraph 1 – point h a (new) (h a) consult the Survivors’ Advisory Board as regards the obligations referred to in points (a), (g) and (h) of this Article.
Amendment 85 #
Proposal for a regulation Article 64 – paragraph 4 – point f (f) preparing the Consolidated Annual Activity Report (CAAR) on the EU Centre’s activities, including the activities of the Technology Committee and the Survivors’ Advisory Board, and presenting it to the Executive Board for assessment and adoption;
Amendment 86 #
Proposal for a regulation Article 64 – paragraph 5 5. Where exceptional circumstances so require, the Executive Director may decide to locate one or more staff in another Member State for the purpose of carrying out the EU Centre’s tasks in a more efficient, effective and coherent manner according to principles of good governance. Before deciding to establish a local office, the Executive Director shall obtain the prior consent of the Commission, the Management Board and the Member State concerned. The decision shall be based on an appropriate cost-benefit analysis that demonstrates in particular the added value of such decision and specify the scope of the activities to be carried out at the local office in a manner that avoids unnecessary costs and duplication of administrative functions of the EU Centre. A headquarters agreement with the Member State(s) concerned may be concluded.
Amendment 87 #
Proposal for a regulation Article 65 – paragraph 2 2. The Executive Director shall be appointed by the
Amendment 88 #
Proposal for a regulation Article 66 – paragraph 1 1. The Technology Committee shall consist of technical experts appointed by the Management Board in view of their excellence and their independence from corporate interests, following the publication of a call for expressions of interest in the Official Journal of the European Union.
Amendment 89 #
Proposal for a regulation Article 66 – paragraph 2 2. Procedures concerning the appointment of the members of the Technology Committee and its operation shall be further specified in the rules of procedure of the Management Board and shall be made public.
Amendment 90 #
Proposal for a regulation Article 66 – paragraph 4 4. When a member no longer meets the criteria of independence, he or she shall inform the Management Board. Alternatively, the Management Board may declare, on a proposal of at least one third of its members or of the Commission, a lack of independence and revoke the appointment of the person concerned. The Management Board shall appoint a new member for the remaining term of office in accordance with the procedure for ordinary members.
Amendment 91 #
Proposal for a regulation Article 66 – paragraph 6 – point b a (new) (b a) provide an annual activity report to the Executive Director as part of the Consolidated Annual Activity Report;
Amendment 92 #
Proposal for a regulation Article 66 a (new)
Amendment 93 #
Proposal for a regulation Article 69 – paragraph 4 4. The EU Centre’s expenditure shall include staff remuneration, administrative and infrastructure expenses, and operating costs while following the appropriate EU budgetary rules.
source: 745.269
2023/05/08
FEMM
491 amendments...
Amendment 100 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available number-independent interpersonal communications services to execute
Amendment 101 #
Proposal for a regulation Recital 26 a (new) (26a) Detection of child sexual abuse in end-to-end encrypted communications is only possible by scanning those communications before they leave the abuser's device; however, this would allow abusers to interfere with the scanning process. Abusers often work in groups, allowing for rapid proliferation of technology to bypass scanning, rendering such scanning ineffective. Therefore, taking into account the limited efficacy, and the negative impact on citizens' fundamental rights, detection orders should not be applicable to end-to-end encrypted communications.
Amendment 102 #
Proposal for a regulation Recital 27 (27) In order to facilitate the providers’ compliance with the
Amendment 103 #
Proposal for a regulation Recital 27 (27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to
Amendment 104 #
Proposal for a regulation Recital 28 (28) With a view to constantly assess the performance of the
Amendment 105 #
Proposal for a regulation Recital 29 (29)
Amendment 106 #
Proposal for a regulation Recital 30 (30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the
Amendment 107 #
Proposal for a regulation Recital 32 (32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited. For legal and practical reasons, it may not be reasonably possible to have those providers remove or disable access to the material, not even through cooperation with the competent authorities of the third country where they are established. Therefore, in line with existing practices in several Member States, it should be possible to require providers of internet access services to take reasonable measures to block the access of users in the Union to the material. However, blocking measures are easily bypassed, and do not prevent access from outside of the Union, meaning victims have to live knowing that abuse material depicting them remains online, therefore every effort should be taken to remove material, even outside of the jurisdiction of the Union, before resorting to blocking.
Amendment 108 #
Proposal for a regulation Recital 32 (32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited.
Amendment 109 #
Proposal for a regulation Recital 33 (33) In the interest of consistency, efficiency and effectiveness and to minimise the risk of circumvention, such
Amendment 110 #
Proposal for a regulation Recital 34
Amendment 111 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available interpersonal communications services in accordance with this Regulation. Online service providers, including social network platforms, should adopt mandatory procedures in order to effectively prevent, detect and report child sexual abuse that occurs on their services and remove child sexual abuse material
Amendment 112 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims or their approved formal representative should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported or has been removed by providers of hosting services or providers of publicly available interpersonal communications services in accordance with this Regulation. This should include both the option for a singular information request and the option to receive this information on a continuous and regular basis.
Amendment 113 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted, the vast majority of whom are girls. Victims should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant and age-appropriate information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available interpersonal communications services in
Amendment 114 #
Proposal for a regulation Recital 35 a (new)
Amendment 115 #
Proposal for a regulation Recital 36 (36) In order to prevent children falling victim to online abuse, providers for which there is evidence that their service is routinely or systematically used for the purpose of online child sexual abuse in line with article 3, should provide reasonable assistance, by putting in place alert and alarm mechanisms in a prominent way on their platforms. The alert mechanism could consist of, for example, linking potential victims to the local services such as helplines, victims` rights and support organisations or hotlines. They should ensure adequate follow-up, when a report or alert is made, in the language chosen by the user for using their service. Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the
Amendment 116 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. The holders of parental responsibility for the victims or the legal guardians of the victims should have equal legal standing to exercise the victim's rights when the victim is not able to do so, due to age or other limitations. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. Such assistance should be tailored to the specific vulnerabilities of the victims, such as age or disability, in a gender-sensitive way. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities.
Amendment 117 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities, taking into account vulnerabilities and psychological effects on victims.
Amendment 118 #
Proposal for a regulation Recital 37 (37) To ensure the efficient management of such victim support functions, victims should be allowed to contact and rely on the Coordinating Authority that is most accessible to them, which should channel all communications between victims and the EU Centre. Coordinating Authorities should provide gender- and age-sensitive support to victims, as well as psychological support. Under no circumstances should victims be blamed for what has happened to them.
Amendment 119 #
Proposal for a regulation Recital 37 a (new) (37a) Member States should ensure and safeguard the existence of effective mechanisms for reporting child sexual abuse and that such investigative tools are effectively used to identify victims and rescue them as quickly as possible from ongoing abuse;
Amendment 120 #
Proposal for a regulation Recital 44 (44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to the application of this Regulation,
Amendment 121 #
Proposal for a regulation Recital 50
Amendment 122 #
Proposal for a regulation Recital 52 (52) To ensure effective enforcement and the safeguarding of users’ rights under this Regulation, it is appropriate to facilitate the lodging of complaints about alleged non-compliance with obligations o
Amendment 123 #
Proposal for a regulation Recital 55 (55) It is essential for the proper functioning of
Amendment 124 #
Proposal for a regulation Recital 55 a (new) (55a) All communications containing illegal material should be encrypted to state-of-the-art standards, and all access by staff to such content should be limited to what is necessary and thoroughly logged. All such logs should be stored for a minimum of ten years.
Amendment 125 #
Proposal for a regulation Recital 59 (59) To support the implementation of this Regulation and contribute to the achievement of its objectives, the EU
Amendment 126 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in
Amendment 127 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the
Amendment 128 #
Proposal for a regulation Recital 61 (61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material)
Amendment 129 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of hosting services for the dissemination of known child sexual abuse material that is publicly accessible, using the corresponding indicators. Where it identifies such material after having conducted such a search, the EU Centre should also be able to request the provider of the hosting service concerned to remove or disable access to the item or items in question, given that the provider may not be aware of their presence and may be willing to do so on a voluntary basis. The EU Centre must be able to work in collaboration with, and refer child victims to, relevant competent authorities and support services, such as victim protection centres, women’s shelters, children’s specialised services, social services, children’s rights organisations and family associations, as well as healthcare professionals in the Member States.
Amendment 130 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of hosting services for the dissemination of known child sexual abuse material that is publicly accessible, using the corresponding indicators. Where it identifies such material after having conducted such a search, the EU Centre should also be able to request the provider of the hosting service concerned to remove or disable access to the item or items in question, given that the provider may not be aware of their presence and may be willing to do so on a voluntary basis. The EU Centre should support Member States in conducting studies, with nationally representative samples, on child sexual abuse in their socialisation spaces, in order to structure preventive and multidisciplinary response measures.
Amendment 131 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as
Amendment 132 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and
Amendment 133 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned. The EU centre shall also provide knowledge, expertise and best practice on preventive measures targeted at abusers.
Amendment 134 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Child helplines are equally in the frontline in the fight against online child sexual abuse. Therefore, the EU Centre should also recognise the work of child helplines in victim response, and the existing referral mechanisms between child helplines and hotlines. The EU Centre should coordinate services for victims.
Amendment 135 #
Proposal for a regulation Recital 71
Amendment 136 #
Proposal for a regulation Recital 74 a (new) (74a) In view of the need for a more effective EU Centre, it is necessary to establish a Survivors' Advisory Board. Through the structured involvement of victims and former victims of sexualised violence and of experts on this matter, the EU Centre should serve as a platform to offer holistic support for the fight against child sexual abuse in all Member States. The Survivors' Advisory Board may support the EU Centre's activities to facilitate cross-border cooperation for existing national networks and the exchange of best practice. It may also raise awareness of child sexual abuse by serving as a knowledge platform through the coordination, collection and synthesis of research.
Amendment 137 #
Proposal for a regulation Recital 74 a (new) (74a) Given the purpose of this regulation, namely to combat and prevent child sexual abuse, the EU Centre should have a Children’s Rights and Survivors Advisory Board composed of experts, including specialist child psychiatrists and representatives of family associations, with an advisory function relating to children’s rights and the victims’ and survivors’ perspective. The Children’s Rights and Survivors Advisory Board may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate.
Amendment 138 #
Proposal for a regulation Recital 74 a (new) (74a) The Victims' Consultative Forum should be the EU Centre's advisory body and support its work. Its principal function should be to provide independent advice through expert knowledge derived from victims of online sexual abuse, taking into account the views of the children who will be consulted as well, in a child-friendly and child-sensitive manner on relevant issues.
Amendment 139 #
Proposal for a regulation Recital 75 (75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse data disaggregated by gender, age and social, cultural and economic background as well as information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and other relevant national authorities of the Member State that designated the Coordinating Authority in question in gathering that information.
Amendment 140 #
Proposal for a regulation Recital 75 (75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect gender-disaggregated and age specific data, record and analyse information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and
Amendment 141 #
Proposal for a regulation Recital 77 (77) The evaluation should be based on the criteria of efficiency, necessity, effectiveness, proportionality, relevance, coherence and Union added value. It should assess the functioning of the different operational and technical measures provided for by this Regulation, including the effectiveness of measures to enhance the detection, reporting and removal of online child sexual abuse, the effectiveness of safeguard mechanisms as well as the impacts on potentially affected fundamental rights, children's rights, the freedom to conduct a business, the right to private life and the protection of personal data. The Commission should also assess the impact on potentially affected interests of third parties.
Amendment 142 #
Proposal for a regulation Recital 84 a (new) (84a) Recommends that companies operating social platforms and their security systems place greater emphasis on regulating the registration of children and minors on social media platforms, focusing particularly on the needs of people living in poverty, Roma and other minorities to combat differences in digital literacy and reduce the volume of violence in the online space.
Amendment 143 #
Proposal for a regulation Recital 84 b (new) (84b) Recommends that the EU centre should develop specific action plans in the field of digital education, focusing on children facing disadvantages and multiple disadvantages, and specifically targeting solutions to the digital divide.
Amendment 144 #
Proposal for a regulation Article 2 – paragraph 1 – point b a (new) (ba) 'safety assistant' means a tool integrated into interpersonal communications services either voluntarily or following a preventative detection order, and active only for child users of the service, which assists children in learning about, identifying and avoiding risks online, including but not limited to self-generated abuse material and solicitation;
Amendment 145 #
Proposal for a regulation Article 2 – paragraph 1 – point j (j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 1
Amendment 146 #
Proposal for a regulation Article 2 – paragraph 1 – point j (j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 1
Amendment 147 #
- ‘victim’ means the child or person having suffered harm caused after being subject to ‘child sexual abuse material’ or ‘solicitation of children’ or ‘online sexual abuse’ or ‘child sexual abuse offences’;
Amendment 148 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of
Amendment 149 #
Proposal for a regulation Article 3 – paragraph 1 a (new)
Amendment 150 #
Proposal for a regulation Article 3 – paragraph 2 – point a
Amendment 151 #
Proposal for a regulation Article 3 – paragraph 2 – point a a (new) (aa) any actual or foreseeable negative effects for the exercise of fundamental rights or possible breaches of EU law
Amendment 152 #
Proposal for a regulation Article 3 – paragraph 2 – point a b (new) (ab) the protection of end-to-end encryption if applicable to the service
Amendment 153 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 1
Amendment 154 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 2
Amendment 155 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 – functionalities enabling
Amendment 156 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 – functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily accessible and age-appropriate and that respect users’ privacy;
Amendment 157 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) - the integration of tools such as safety assistants to prevent child sexual abuse online;
Amendment 158 #
Proposal for a regulation Article 3 – paragraph 2 – point c
Amendment 159 #
Proposal for a regulation Article 3 – paragraph 2 – point d (d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, the design of their recommender systems and any other relevant algorithmic system and the impact thereof on that risk;
Amendment 160 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point i
Amendment 161 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii
Amendment 162 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii (ii) where the service is used by children, the different age groups of the child users and the risk of solicitation of children in relation to those age groups, as well as the risk of adults using the service for the purpose of solicitation of children;
Amendment 163 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 1 – enabling users to publicly search for other
Amendment 164 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2 – enabling users to
Amendment 165 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3
Amendment 166 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii a (new)
Amendment 167 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 The provider may request the EU Centre to perform an analysis of
Amendment 168 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise
Amendment 169 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 3
Amendment 170 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – point a (a) for a service which is subject to a
Amendment 171 #
Proposal for a regulation Article 3 – paragraph 6 6. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments, trends reported by authorities, civil society organisations and victim support organisations, and to the manners in which the services covered by those provisions are offered and used.
Amendment 172 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of
Amendment 173 #
Proposal for a regulation Article 4 – paragraph 1 – point a (a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision- making processes, the operation or functionalities of the service
Amendment 174 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (aa) providing technical measures and tools that allow users, and in particular children, to manage their own privacy, visibility, reachability and safety, and that are set to the most secure levels by default;
Amendment 175 #
Proposal for a regulation Article 4 – paragraph 1 – point a b (new) (ab) informing users, keeping in mind children's needs, about external resources and services in the user's region on preventing child sexual abuse, counselling by help-lines, information on victim support and educational resources provided by hotlines and child protection organisations;
Amendment 176 #
Proposal for a regulation Article 4 – paragraph 1 – point a c (new) (ac) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line
Amendment 177 #
Proposal for a regulation Article 4 – paragraph 1 – point a d (new) (ad) automatic mechanisms and interface design elements to inform users about external preventive intervention programmes without prejudice to the prohibition of profiling under Article 22 GDPR and the processing of sensitive data under Article 9 GDPR
Amendment 178 #
Proposal for a regulation Article 4 – paragraph 1 – point b (b)
Amendment 179 #
Proposal for a regulation Article 4 – paragraph 1 – point c (c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations, hotlines or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].
Amendment 180 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new)
Amendment 181 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) (ca) without breaking, weakening, circumventing or otherwise undermining end-to-end encryption in the sense of people's right to confidential communications;
Amendment 182 #
Proposal for a regulation Article 4 – paragraph 2 – point a (a) effective and proportionate in mitigating the identified serious risk;
Amendment 183 #
Proposal for a regulation Article 4 – paragraph 2 – point a a (new) (aa) subject to an implementation plan with clear objectives and methodologies for identifying and quantifying impacts on the identified serious risk and on the exercise of the fundamental rights of all affected parties. The implementation plan shall be reviewed every six months.
Amendment 184 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk, specific vulnerabilities of children online and offline including age, gender and disability, as well as the provider’s financial and technological capabilities and the number of users;
Amendment 185 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk , any impact on the functionality of the service as well as the provider’s financial and
Amendment 186 #
Proposal for a regulation Article 4 – paragraph 2 – point c (c) applied in a diligent and non- discriminatory manner,
Amendment 187 #
Proposal for a regulation Article 4 – paragraph 2 – point d a (new) (da) only introduced following an assessment of the risks the mitigating measures themselves pose for users, in particular if these risks would disproportionately negatively affect persons on the basis of sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender or sexual orientation;
Amendment 188 #
Proposal for a regulation Article 4 – paragraph 2 – point d b (new) (db) developed in cooperation with children who use the service;
Amendment 189 #
Proposal for a regulation Article 4 – paragraph 3
Amendment 190 #
Proposal for a regulation Article 4 – paragraph 3 a (new) 3a. Any requirement to take specific measures shall be without prejudice to Article 8 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] and shall entail neither a general obligation for hosting services providers to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.
Amendment 191 #
Proposal for a regulation Article 4 – paragraph 4 4.
Amendment 192 #
Proposal for a regulation Article 4 – paragraph 4 a (new) 4a. Specific measures for platforms primarily used for the dissemination of pornographic content Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure: a. user-friendly reporting mechanisms to report alleged child sexual abuse material; b. adequate professional human content moderation to rapidly process notices of alleged child sexual abuse material; c. automatic mechanisms and interface design elements to inform users about external preventive intervention programmes in the user’s region.
Amendment 193 #
Proposal for a regulation Article 4 – paragraph 4 b (new)
Amendment 194 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments, trends and evidence reported by law enforcement, hotlines, civil society organisations, EIGE and technology companies, in combating child sexual abuse online and in the manners in which the services covered by those provisions are offered and used.
Amendment 195 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation
Amendment 196 #
Proposal for a regulation Article 4 – paragraph 5 a (new) 5a. To complement the risk mitigation measures taken by the providers, gender- sensitive and child-friendly education and prevention measures shall be implemented.
Amendment 197 #
Proposal for a regulation Article 6
Amendment 198 #
Proposal for a regulation Article 6 – paragraph 1 – point a (a) make reasonable efforts to
Amendment 199 #
Proposal for a regulation Article 6 – paragraph 1 – point b
Amendment 200 #
Proposal for a regulation Article 6 – paragraph 1 – point b (b) take reasonable measures to prevent child users from accessing the software applications not intended for their use or adapted to their safety needs in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
Amendment 201 #
Proposal for a regulation Article 6 – paragraph 1 – point c
Amendment 202 #
Proposal for a regulation Article 6 – paragraph 1 a (new) 1a. Security of communications and services Nothing in this regulation shall be construed as requiring or encouraging the prohibition, restriction, circumvention or undermining of the provision or the use of encrypted services.
Amendment 203 #
Proposal for a regulation Article 6 – paragraph 2
Amendment 204 #
Proposal for a regulation Article 6 – paragraph 3
Amendment 205 #
Proposal for a regulation Article 6 – paragraph 4
Amendment 206 #
Proposal for a regulation Article 6 – paragraph 4 4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments, trends and evidence reported by law enforcement, hotlines, civil society organisations, EIGE and technology companies, in combating child sexual abuse online, and to the manners in which the services covered by those provisions are offered and used.
Amendment 207 #
Proposal for a regulation Article 6 a (new) Article 6a Obligations concerning age verification and for software application stores 1. Providers of software application stores considered as gatekeepers under the Digital Markets Act (EU) 2022/1925 shall: (a) indicate if applications contain features that could pose a risk to children; (b) indicate if measures have been taken to mitigate risks for children, and which measures have been taken; (c) provide guidance for parents on how to discuss risks with their children; (d) provide application developers with an open-source software library that enables age verification requests from inside applications both to European Digital Identity Wallets and third-party services; (e) provide, free of charge, an age-verification service that can respond to age verification requests from inside applications. 2. Providers of European Digital Identity Wallets under the Regulation (EU) No XXX/2023 establishing a framework for a European Digital Identity shall ensure European Digital Identity Wallets can respond to age verification requests from applications without revealing the identity of the user. 3. Third-party age verification services used to fulfil the obligations of this article shall: (a) only retain user personal data for the purpose of fulfilling future requests, and with the explicit consent of the user; (b) only retain data vital to processing future verification requests, namely: i. a pseudonymous means of authenticating the user; and ii. the user’s previously verified date of birth; (c) only use this data for the purpose of age verification; (d) fulfil requests for the deletion of this data pursuant to the GDPR.
4. Where developers of applications have identified a significant risk of use of the service concerned for the purpose of the solicitation of children, they shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to put in place safeguards, namely: (a) take reasonable measures to mitigate the risk, such as adapting the services to children, integrating a safety assistant or modifying or adding safeguards limiting access to certain features; (b) provide children with guidance on risks that will help them identify dangers and make more informed decisions; (c) where the application is manifestly unsuitable for children and cannot be adapted, prevent access. 5. Age verification mechanisms set out in this article shall not be used for the purposes of enabling or facilitating parental control technologies that give access to children’s private communications without their consent.
Amendment 208 #
Proposal for a regulation Chapter II – Section 2 – title 2
Amendment 210 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent independent judicial authority of the Member State that designated it
Amendment 211 #
Proposal for a regulation Article 7 – paragraph 1 a (new) 1a. The Coordinating Authority of establishment shall choose one of the following types of detection order: (a) proactive detection orders, which detect and report known child sexual abuse material under the measures specified in Article 10; (b) preventative detection orders, which detect solicitation and attempts by children to share self-generated abuse material, and assist them in avoiding risks, under the measures specified in Article 10;
Amendment 212 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 The Coordinating Authority of establishment shall, before requesting the issuance of a
Amendment 213 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 2 To that end, it may, where appropriate,
Amendment 214 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point a (a) establish a draft request for the issuance of a
Amendment 215 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point c
Amendment 216 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – introductory part Where, having regard to the
Amendment 217 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a (a) draft an implementation plan setting out the incident that the authority intends to investigate, the measures it envisages taking to execute the intended
Amendment 218 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b)
Amendment 219 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b) where the draft implementation plan concerns an intended detection order concerning the solicitation of children other than the renewal of a previously issued detection order without any substantive changes, conduct
Amendment 220 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the
Amendment 221 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the data protection impact assessment and in order to
Amendment 222 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point d (d) submit to that Coordinating Authority the implementation plan, where applicable attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted
Amendment 223 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and taking utmost account of the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall submit the request for the validation and issuance of the
Amendment 224 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part
Amendment 225 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a (a) there is evidence of a s
Amendment 226 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b (b) the reasons for issuing the
Amendment 227 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 a (new) (c) nothing in the investigation order can be construed as requiring or encouraging the provider to weaken, break, circumvent or otherwise undermine or limit the encryption, security, or other means of protecting the confidentiality of communications, of the platform or service of the provider as a whole.
Amendment 228 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2
Amendment 229 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point a
Amendment 230 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point b
Amendment 231 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point c
Amendment 232 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point d
Amendment 233 #
Proposal for a regulation Article 7 – paragraph 5 – introductory part 5. As regards
Amendment 234 #
Proposal for a regulation Article 7 – paragraph 5 – point a (a) it is likely, despite any mitigation measures that the provider may have taken or will take, that the service is
Amendment 235 #
Proposal for a regulation Article 7 – paragraph 5 – point b (b) there is evidence of the service
Amendment 236 #
Proposal for a regulation Article 7 – paragraph 6
Amendment 237 #
Proposal for a regulation Article 7 – paragraph 6 – introductory part 6. As regards
Amendment 238 #
Proposal for a regulation Article 7 – paragraph 6 – point a (a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used,
Amendment 239 #
Proposal for a regulation Article 7 – paragraph 6 – point b (b) there is evidence of the service,
Amendment 240 #
Proposal for a regulation Article 7 – paragraph 6 – point c – point 1 (1) a
Amendment 241 #
Amendment 242 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – introductory part As regards preventative detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met:
Amendment 243 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – introductory part As regards
Amendment 244 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – point a (a) the provider qualifies as a provider of publicly available number-independent interpersonal communication services;
Amendment 245 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – point b (b) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used,
Amendment 246 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 – point c (c) there is evidence of the service,
Amendment 247 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 2
Amendment 248 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the judicial validation and the issuance of
Amendment 249 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 2 To that
Amendment 250 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point a (a) where th
Amendment 251 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point b (b) where necessary, in particular to limit such negative consequences, effective and proportionate safeguards additional to those listed in Article 10(4), (5)
Amendment 252 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 a (new) (d) nothing in the investigation order can be construed as requiring or encouraging the provider to weaken, break, circumvent or otherwise undermine or limit the encryption, security, or other means of protecting the confidentiality of communications, of the platform or service of the provider.
Amendment 253 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority
Amendment 254 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 2 The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the
Amendment 255 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 The period of application of
Amendment 256 #
Proposal for a regulation Article 8 – title 8 Additional rules regarding
Amendment 257 #
Proposal for a regulation Article 8 – title 8 Additional rules regarding
Amendment 258 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 259 #
Proposal for a regulation Article 8 – paragraph 1 – point a (a) information regarding the measures to be taken to execute the
Amendment 260 #
Proposal for a regulation Article 8 – paragraph 1 – point b (b) identification details of the competent judicial authority
Amendment 261 #
Proposal for a regulation Article 8 – paragraph 1 – point d (d) the specific service in respect of which the
Amendment 262 #
Proposal for a regulation Article 8 – paragraph 1 – point d a (new) (da) the type of detection order;
Amendment 263 #
Proposal for a regulation Article 8 – paragraph 1 – point e (e) whether the
Amendment 264 #
Proposal for a regulation Article 8 – paragraph 1 – point f (f) the start date and the end date of the
Amendment 265 #
Proposal for a regulation Article 8 – paragraph 1 – point g (g) a
Amendment 266 #
Proposal for a regulation Article 8 – paragraph 1 – point h (h) the factual and legal grounds justifying the issuing of the order, and a reference to this Regulation as the legal basis for the
Amendment 267 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 268 #
Proposal for a regulation Article 8 – paragraph 1 – point j (j) easily understandable information about the redress available to the addressee of the
Amendment 269 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 270 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 2 The
Amendment 271 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 3 The
Amendment 272 #
Proposal for a regulation Article 8 – paragraph 3 3. If the provider cannot execute the
Amendment 273 #
Proposal for a regulation Article 8 – paragraph 4 a (new)
Amendment 274 #
Proposal for a regulation Article 8 – paragraph 4 b (new) 4b. Notification mechanism 1. Providers of hosting services and providers of interpersonal communication services shall establish mechanisms that allow users to notify them of the presence on their service of specific items or activities that the user considers to be potential child sexual abuse material, in particular previously unknown child sexual abuse material and solicitation of children. Those mechanisms shall be easy to access, user-friendly and child-friendly, and shall allow for the submission of notices exclusively by electronic means. 2. Where the notice contains the electronic contact information of the user who submitted it, the provider shall without undue delay send a confirmation of receipt to the user. 3. Providers shall ensure that such notices are processed without undue delay.
Amendment 275 #
Proposal for a regulation Article 9 – title 9 Redress, information, reporting and modification of
Amendment 276 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services that have received a
Amendment 277 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the
Amendment 278 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 2 For the purpose of the first subparagraph, a
Amendment 279 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 1 Where the period of application of the
Amendment 280 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 2 Those reports shall include a detailed description of the measures taken to execute the
Amendment 281 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the
Amendment 282 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 283 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of interpersonal communication services that are not end-to-end encrypted, and that have received a proactive detection order shall execute it by installing and operating technologies to detect the dissemination of known
Amendment 284 #
Proposal for a regulation Article 10 – paragraph 1 a (new) Amendment 285 #
Proposal for a regulation Article 10 – paragraph 1 b (new) 1b. Technologies used in preventative detection orders to detect grooming shall only report a detection in cases where the potential victim, a trusted adult, or a moderator explicitly chooses to. Where end-to-end encryption is used, the detection should be done entirely on the user’s device.
Amendment 286 #
Proposal for a regulation Article 10 – paragraph 1 c (new) 1c. Technologies used in preventative detection orders to detect when children attempt to use their services to send intimate images shall not report these users in any way. Where end-to-end encryption is used, the detection should be done entirely on the user’s device.
Amendment 287 #
Proposal for a regulation Article 10 – paragraph 1 d (new) 1d. The Coordinating Authority shall be empowered to request that services take further preventative measures, so long as those measures do not involve reporting, and only after approval by the relevant Data Protection Authority.
Amendment 288 #
Proposal for a regulation Article 10 – paragraph 4 – point d (d) establish and operate an accessible, age-appropriate, gender-sensitive, and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
Amendment 289 #
Proposal for a regulation Article 11 – paragraph 1 The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of Articles 7 to 10, having due regard in particular to relevant technological developments, trends and evidence reported by law enforcement, hotlines, civil society organisations, EIGE and technology companies, in combating child sexual abuse online, and the manners in which the services covered by those provisions are offered and used.
Amendment 290 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of
Amendment 291 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 1 Where the provider submits a report pursuant to paragraph 1, it shall
Amendment 292 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 2 The provider shall
Amendment 293 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 3
Amendment 294 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible
Amendment 295 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible
Amendment 296 #
Proposal for a regulation Article 12 – paragraph 3 a (new) 3a. New possible child sexual abuse material reported by a user shall immediately be assessed to determine the probability that the material represents a risk or harm to a child. If the potential online child sexual abuse on the service is flagged by a user known to be a child, the provider shall, in addition to reporting the material, provide the child with essential information on online safety and specialist child support services, such as helplines and hotlines.
Amendment 297 #
Proposal for a regulation Article 13 – paragraph 1 – introductory part 1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 298 #
Proposal for a regulation Article 13 – paragraph 1 – point c (c)
Amendment 299 #
Proposal for a regulation Article 13 – paragraph 1 – point d (d) a list of all available data other than content data related to the potential online child sexual abuse preserved in line with the preservation order in article 8a;
Amendment 300 #
Proposal for a regulation Article 13 – paragraph 1 – point d a (new) (da) a list of all traffic data and metadata retained in relation to the potential online child sexual abuse, which could be made available to law enforcement authorities, together with information concerning default storage periods.
Amendment 301 #
Proposal for a regulation Article 13 – paragraph 1 – point f
Amendment 302 #
Proposal for a regulation Article 13 – paragraph 1 – point g
Amendment 303 #
Proposal for a regulation Article 13 – paragraph 1 – point i (i) where the
Amendment 304 #
Proposal for a regulation Article 13 – paragraph 1 – point j (j) an indication of whether the provider considers that the report requires urgent action;
Amendment 305 #
Proposal for a regulation Article 14 – paragraph 1 1. The Coordinating Authority of
Amendment 306 #
Proposal for a regulation Article 14 – paragraph 1 a (new) 1a. Before requesting a removal order, the Coordinating Authority of establishment and competent judicial authority shall take all reasonable steps to ensure that implementing the order will not interfere with activities for the investigation and prosecution of child sexual abuse offences.
Amendment 307 #
Proposal for a regulation Article 14 – paragraph 1 b (new) 1b. Removal orders shall be issued by judicial authorities in line with Article 9 on Orders to act against illegal content of the Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC.
Amendment 308 #
Proposal for a regulation Article 14 – paragraph 3 – introductory part 3. The competent judicial authority o
Amendment 309 #
Proposal for a regulation Article 14 – paragraph 3 – point a (a) identification details of the judicial
Amendment 310 #
Proposal for a regulation Article 14 – paragraph 3 – point c
Amendment 311 #
Proposal for a regulation Article 14 – paragraph 3 – point h (h) the date, time stamp and electronic signature of the judicial
Amendment 312 #
Proposal for a regulation Article 14 – paragraph 3 – point i (i) easily understandable and accessible information about
Amendment 313 #
Proposal for a regulation Article 14 – paragraph 4 – subparagraph 1 The judicial authority
Amendment 314 #
Proposal for a regulation Article 14 – paragraph 5 – subparagraph 1 If the provider cannot execute the removal order on grounds of force majeure or de facto impossibility not attributable to it, including for objectively justifiable technical or operational reasons, it shall, without undue delay, inform the Coordinating Authority of establishment of those grounds including evidence, using the template set out in Annex V.
Amendment 315 #
Proposal for a regulation Article 14 – paragraph 6 – subparagraph 1 If the provider cannot execute the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, request the necessary clarification from the Coordinating Authority of establishment, using the template set out in Annex V. The Coordinating Authority shall reply without undue delay, and at the latest within two days.
Amendment 316 #
Proposal for a regulation Article 15 – paragraph 1 1. Providers of hosting services or publicly available number-independent interpersonal communications services that have received a removal order issued in accordance with Article 14, as well as the users who
Amendment 317 #
Proposal for a regulation Article 15 – paragraph 2 – subparagraph 1 When the removal order becomes final, the competent judicial authority
Amendment 318 #
Proposal for a regulation Article 15 – paragraph 3 – point b (b) the reasons for the removal or disabling, providing a copy of the removal order upon
Amendment 319 #
Proposal for a regulation Article 15 – paragraph 3 – subparagraph 1 (new)
Amendment 320 #
Proposal for a regulation Article 15 – paragraph 3 – point c a (new) (ca) if the user is a child, referral to competent national support services and essential information on online safety, in a child-friendly language.
Amendment 321 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 1
Amendment 322 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 1 The Coordinating Authority of establishment may request, when requesting the judicial authority or independent administrative authority issuing the removal order, and after having consulted with relevant public authorities, that the provider is not to disclose any information regarding the removal of or disabling of access to the child sexual abuse material, where and to the extent necessary to avoid interfering with activities for the
Amendment 323 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 2
Amendment 324 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 2 – point a
Amendment 325 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 2 – point b
Amendment 326 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 2 – point c
Amendment 327 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 3
Amendment 330 #
Proposal for a regulation Article 16 – paragraph 1
Amendment 331 #
Proposal for a regulation Article 16 – paragraph 1 1.
Amendment 332 #
Proposal for a regulation Article 16 – paragraph 2 – subparagraph 1
Amendment 333 #
Proposal for a regulation Article 16 – paragraph 2 – subparagraph 2
Amendment 334 #
Proposal for a regulation Article 16 – paragraph 2 – subparagraph 2 – point a
Amendment 335 #
Proposal for a regulation Article 16 – paragraph 2 – subparagraph 2 – point b
Amendment 336 #
Proposal for a regulation Article 16 – paragraph 2 – subparagraph 2 – point c
Amendment 337 #
Proposal for a regulation Article 16 – paragraph 2 – subparagraph 2 – point d
Amendment 338 #
Proposal for a regulation Article 16 – paragraph 2 – subparagraph 2 – point d a (new)
Amendment 339 #
Proposal for a regulation Article 16 – paragraph 2 – subparagraph 2 – point d b (new) (db) the coordinating authority, EU centre and national law enforcement organisations have taken all possible measures to have the content removed, including: i. contacting the hosting service where the material is stored in order to request removal; ii. contacting law enforcement in the country where the content is hosted to request their assistance in removing the material;
Amendment 340 #
Proposal for a regulation Article 16 – paragraph 3
Amendment 341 #
Proposal for a regulation Article 16 – paragraph 4 – subparagraph 1
Amendment 342 #
Proposal for a regulation Article 16 – paragraph 4 – subparagraph 1 – point a
Amendment 343 #
Proposal for a regulation Article 16 – paragraph 4 – subparagraph 1 – point b
Amendment 344 #
Proposal for a regulation Article 16 – paragraph 4 – subparagraph 1 – point c
Amendment 345 #
Proposal for a regulation Article 16 – paragraph 4 – subparagraph 1 – point d
Amendment 346 #
Proposal for a regulation Article 16 – paragraph 4 – subparagraph 2
Amendment 347 #
Proposal for a regulation Article 16 – paragraph 5
Amendment 348 #
Proposal for a regulation Article 16 – paragraph 5 – point a
Amendment 349 #
Proposal for a regulation Article 16 – paragraph 5 – point b
Amendment 350 #
Proposal for a regulation Article 16 – paragraph 6 – subparagraph 1
Amendment 351 #
Proposal for a regulation Article 16 – paragraph 6 – subparagraph 2
Amendment 352 #
Proposal for a regulation Article 16 – paragraph 6 – subparagraph 2 The period of application of blocking orders shall not exceed
Amendment 353 #
Proposal for a regulation Article 16 – paragraph 7 – subparagraph 1
Amendment 354 #
Proposal for a regulation Article 16 – paragraph 7 – subparagraph 2
Amendment 356 #
Proposal for a regulation Article 17 – paragraph 1
Amendment 357 #
Proposal for a regulation Article 17 – paragraph 1 – point a
Amendment 358 #
Proposal for a regulation Article 17 – paragraph 1 – point b
Amendment 359 #
Proposal for a regulation Article 17 – paragraph 1 – point c
Amendment 360 #
Proposal for a regulation Article 17 – paragraph 1 – point d
Amendment 361 #
Proposal for a regulation Article 17 – paragraph 1 – point e
Amendment 362 #
Proposal for a regulation Article 17 – paragraph 1 – point f
Amendment 363 #
Proposal for a regulation Article 17 – paragraph 1 – point f
Amendment 364 #
Proposal for a regulation Article 17 – paragraph 1 – point g
Amendment 365 #
Proposal for a regulation Article 17 – paragraph 1 – point h
Amendment 366 #
Proposal for a regulation Article 17 – paragraph 1 – point i
Amendment 367 #
Proposal for a regulation Article 17 – paragraph 2
Amendment 368 #
Proposal for a regulation Article 17 – paragraph 3
Amendment 369 #
Proposal for a regulation Article 17 – paragraph 4
Amendment 370 #
Proposal for a regulation Article 17 – paragraph 5
Amendment 371 #
Amendment 373 #
Proposal for a regulation Article 18 – paragraph 1 Amendment 374 #
Proposal for a regulation Article 18 – paragraph 2 – subparagraph 1 Amendment 375 #
Proposal for a regulation Article 18 – paragraph 2 – subparagraph 2 Amendment 376 #
Proposal for a regulation Article 18 – paragraph 3 Amendment 377 #
Proposal for a regulation Article 18 – paragraph 4 Amendment 378 #
Proposal for a regulation Article 18 – paragraph 4 – point a Amendment 379 #
Proposal for a regulation Article 18 – paragraph 4 – point b Amendment 380 #
Proposal for a regulation Article 18 – paragraph 4 – point c Amendment 381 #
Proposal for a regulation Article 18 – paragraph 5 – subparagraph 1 Amendment 382 #
Proposal for a regulation Article 18 – paragraph 5 – subparagraph 2 Amendment 383 #
Proposal for a regulation Article 18 – paragraph 6 Amendment 384 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services shall not be legally liable for child sexual abuse offences
Amendment 385 #
Proposal for a regulation Article 20 – title Victims’ right to information and access to support
Amendment 386 #
Proposal for a regulation Article 20 – title 20 Victims’ right to information and support
Amendment 387 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Amendment 388 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Amendment 389 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request or on the request of their legal guardian or legal representative, from the Coordinating Authority designated by the Member State where they reside, age-appropriate information regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12. Persons with disabilities shall have the right to request and receive such information in a manner accessible to them.
Amendment 390 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 a (new) Victims of child sexual abuse or their representatives and persons living in the Union shall have the right to receive, upon their request, from the Coordinating Authority information regarding victim’s rights, support and assistance. The information shall be age-appropriate, accessible and gender-sensitive and shall include at a minimum: (a) the type of support they can obtain and from whom, including, where relevant, basic information about access to medical support, any specialist support, including psychological or social support, and alternative accommodation; (b) the procedures for making complaints with regard to a criminal offence and their role in connection with such procedures; (c) how and under what conditions they can obtain protection, including protection measures; (d) how and under what conditions they can access legal advice, legal aid and any other sort of advice; (e) how and under what conditions they can access compensation; (f) how and under what conditions they are entitled to interpretation and translation;
Amendment 391 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 a (new) Persons residing in the Union shall have the right to receive, upon their request or on the request of their legal guardian or legal representative, from the Coordinating Authority designated by the Member State where they reside, information about universal services and victim support services, taking into consideration their age and gender. Persons with disabilities shall have the right to request such information and receive it in a manner which is accessible to them.
Amendment 392 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 b (new) Where a victim or victim representative indicates a preference for a periodic request, the Coordinating Authority shall, after the first reply, proactively submit without delay the information referred to in paragraph 3 to the requester for any new instances of reports referred to in paragraph 1, on a weekly basis. Victims or victim representatives may terminate the periodic request at any time by notifying the Coordinating Authority in question.
Amendment 393 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 2 That Coordinating Authority shall transmit the request to the EU Centre through the system established in accordance with Article 39(2) and shall communicate the results received from the EU Centre to the person making the request. The transmission of the request shall be made with due regard to the protection of the identity and the privacy of the victim, together with measures for the protection of the privacy and the images of their family members, in a victim-sensitive, age-appropriate and gender-sensitive way. Such protection is particularly important for child victims and includes non-disclosure of the name of the child. A child-sensitive approach, taking due account of the child's age, maturity, views, needs and concerns, shall prevail. The child and the holder of parental responsibility or other legal representative, if any, shall be informed of their rights as victims. The Coordinating Authority shall also provide information to victims regarding access to the specialist support services available.
Amendment 394 #
Proposal for a regulation Article 20 – paragraph 2 – point b (b) where applicable, the individual or entity
Amendment 395 #
(c) sufficient elements to
Amendment 396 #
Proposal for a regulation Article 20 – paragraph 2 – subparagraph 1 (new) (d) an indication of whether the request is occasional or should cover a certain time period.
Amendment 397 #
Proposal for a regulation Article 20 – paragraph 3 – point a (a) the identification of the provider(s) that submitted the report;
Amendment 398 #
Proposal for a regulation Article 20 – paragraph 3 – point b (b) the date(s) of the report(s);
Amendment 399 #
Proposal for a regulation Article 20 – paragraph 3 – point c (c) whether the EU Centre forwarded the report(s) in accordance with Article 48(3) and, if so, to which authorities;
Amendment 400 #
Proposal for a regulation Article 20 – paragraph 3 – point d (d) whether the provider reported having removed or disabled access to the material, and if so, including all related information, in accordance with Article 13(1), point (i).
Amendment 401 #
Proposal for a regulation Article 20 – paragraph 3 – point d a (new) (da) information regarding age-appropriate and gender-sensitive victim support services to provide the child, family and survivors with adequate health, emotional and psychosocial support as well as practical and legal assistance.
Amendment 402 #
Proposal for a regulation Article 20 – paragraph 3 – subparagraph 1 (new) (e) whether there were appeals against such removal, and all related information; (f) new relevant age-appropriate, accessible and gender-sensitive information on victim support and assistance in the victim’s region.
Amendment 403 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide reasonable assistance, on request, to
Amendment 404 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide
Amendment 405 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide
Amendment 406 #
Proposal for a regulation Article 21 – paragraph 1 a (new) 1a. Professionals likely to come into contact with victims of child sexual abuse shall be adequately trained to deal with such victims, taking into account gender sensitivities.
Amendment 407 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Amendment 408 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Amendment 409 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request or on the request of their legal guardian or legal representative, from the Coordinating Authority
Amendment 410 #
Proposal for a regulation Article 21 – paragraph 3 3. The requests referred to in paragraphs 1 and 2 shall indicate the relevant item or items of child sexual abuse material and any other relevant information.
Amendment 411 #
Proposal for a regulation Article 21 – paragraph 4 – point b (b) verifying whether and when the provider removed or disabled access to that item or those items, including by conducting the searches referred to in Article 49(1);
Amendment 412 #
Proposal for a regulation Article 21 – paragraph 4 – point d (d) where necessary, informing the Coordinating Authority of establishment of the presence of that item or those items on the provider’s service, with a view to the issuance of a removal order pursuant to Article 14 and the obligations under Article 21.
Amendment 413 #
Proposal for a regulation Article 21 – paragraph 4 – subparagraph 1 (new) (e) information regarding victim’s rights, assistance and support pursuant to Article 21.
Amendment 414 #
Proposal for a regulation Article 21 – paragraph 4 a (new) 4a. The EU Centre shall provide a “Take it Down” service which: allows victims to flag abuse material depicting them and store a fingerprint of that material in a database; and allows participating interpersonal communications services and hosting services, including social networks, to voluntarily check images uploaded to their platforms against this database. Participating services shall: (a) take the following measures when a match is found: (i) inform the uploader that the image they are attempting to upload has been identified as child sexual abuse material, and prevent upload; (ii) give the uploader the option to contest the flagging, forwarding the image and fingerprint on to the EU Centre for further analysis; (iii) allow the uploader to provide further information to the EU Centre on the origin of the image; (b) state clearly that uploads are checked against a database of known abuse material; (c) provide anonymised statistics to the EU Centre on the number of times an upload of an image with a certain hash was attempted.
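The matching flow described in this amendment can be sketched as follows. This is an illustrative sketch only, not a prescribed implementation: the class and method names are hypothetical, and a production service would use a perceptual image fingerprint (robust to resizing and re-encoding, in the style of PhotoDNA) rather than the exact-match SHA-256 stand-in used here.

```python
import hashlib

class TakeItDownRegistry:
    """Hypothetical sketch of the fingerprint database the amendment describes.

    Real deployments would use perceptual hashing; SHA-256 is a stand-in
    that only matches byte-identical files.
    """

    def __init__(self):
        self._fingerprints = set()
        # Anonymised per-hash attempt counters, as required by point (c).
        self.match_attempts = {}

    @staticmethod
    def fingerprint(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def flag(self, data: bytes) -> str:
        """A victim flags material; only its fingerprint is stored."""
        fp = self.fingerprint(data)
        self._fingerprints.add(fp)
        return fp

    def check_upload(self, data: bytes) -> dict:
        """A participating service checks an upload against the database."""
        fp = self.fingerprint(data)
        if fp not in self._fingerprints:
            return {"allowed": True}
        self.match_attempts[fp] = self.match_attempts.get(fp, 0) + 1
        return {
            "allowed": False,                # (a)(i) prevent upload, inform uploader
            "can_contest": True,             # (a)(ii) forward to the EU Centre for analysis
            "can_submit_origin_info": True,  # (a)(iii) uploader may explain the image's origin
        }

registry = TakeItDownRegistry()
registry.flag(b"victim-flagged material")
print(registry.check_upload(b"victim-flagged material")["allowed"])  # False
print(registry.check_upload(b"unrelated image")["allowed"])          # True
```

The design point worth noting is that only fingerprints, never the images themselves, are stored centrally, and that checking is voluntary on the service's side, which is what distinguishes this mechanism from the detection orders elsewhere in the proposal.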
Amendment 415 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 1 – introductory part Providers of hosting services and providers of interpersonal communications services shall preserve the necessary content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated
Amendment 416 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 1 – point b (b) reporting information concerning potential online child sexual abuse to the EU Centre pursuant to Article 12;
Amendment 417 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 2 As regards the first subparagraph, point (a), the provider may also preserve the information, including data on gender and age, for the purpose of improving the effectiveness and accuracy of the technologies to detect online child sexual abuse for the execution of a detection order issued to it in accordance with Article 7. However, it shall not store any personal data for that purpose.
Amendment 418 #
Proposal for a regulation Article 25 – paragraph 1 1. Member States shall, by [Date - two months from the date of entry into force of this Regulation], designate one or more competent authorities as responsible for the application and enforcement of this Regulation, for the achievement of the objective of this Regulation and for the enforcement of Directive 2011/93/EU (‘competent authorities’).
Amendment 419 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 2 The Coordinating Authority shall be responsible for all matters related to the application and enforcement of this Regulation, to the achievement of the objective of this Regulation and to the enforcement of Directive 2011/93/EU in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities.
Amendment 420 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 3 The Coordinating Authority shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective, efficient and consistent application and enforcement of this Regulation and Directive 2011/93/EU throughout the Union.
Amendment 421 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available and communicate it to the EU Centre. They shall keep that information updated. The Coordinating Authority shall contribute relevant information and material for the promotion of targeted child-sensitive awareness-raising or education campaigns about the risks of online child sexual abuse, for children as well as for adults. Such contribution shall be based on the expertise and the feedback from the EU Centre and shall be made with a gender-sensitive perspective.
Amendment 422 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a sufficiently staffed contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation and enforcement of Directive 2011/93/EU in that Member State. Member States shall make the information on the contact point
Amendment 423 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information
Amendment 424 #
Proposal for a regulation Article 25 – paragraph 5 a (new) 5a. Each Member State shall ensure that a section or department is designated or established within the Coordinating Authority’s office, responsible for creating and disseminating information campaigns aimed at raising awareness amongst the public, especially children, with particular consideration of the gender and age of the potential recipients. The creation and dissemination process should be implemented in consultation and collaboration with the appropriate and competent national bodies.
Amendment 425 #
Proposal for a regulation Article 25 – paragraph 7 – point a (a) provide certain information or
Amendment 426 #
Proposal for a regulation Article 25 – paragraph 7 – point a a (new) (aa) providing information in terms of expertise and developed techniques for preventing online child abuse and the online dissemination of materials depicting child sexual abuse, with particular consideration of age and gender;
Amendment 427 #
Proposal for a regulation Article 25 – paragraph 7 – point a a (new) (aa) provide information and expertise on gender-sensitive and age appropriate victim support and prevention of online child sexual abuse
Amendment 428 #
Proposal for a regulation Article 25 – paragraph 7 – point a b (new) (ab) providing support in developing preventative measures, including: public awareness-raising campaigns, programmes to improve digital skills and skills in terms of using the Internet and online safety, and ensuring support and access to specialist services and support services for child sexual abuse victims and children in difficult situations.
Amendment 429 #
Proposal for a regulation Article 25 – paragraph 7 – point c (c) verify the possible need to request competent
Amendment 430 #
Proposal for a regulation Article 25 – paragraph 7 – point d (d) verify the effectiveness of a
Amendment 431 #
Proposal for a regulation Article 25 – paragraph 8 8. The EU Centre shall provide such assistance free of charge and in accordance with its tasks and obligations under this Regulation and insofar as its resources
Amendment 432 #
Proposal for a regulation Article 25 – paragraph 8 a (new) 8a. The EU Centre shall support Member States in designing age- appropriate and gender-appropriate preventive measures aimed at children, as well as their parents or legal representatives, such as organising awareness-raising campaigns to combat child sexual abuse and delivering relationships and sex education tailored to the age of children within the school framework, in collaboration with their parents or legal representatives, or with their consent.
Amendment 433 #
Proposal for a regulation Article 25 – paragraph 8 a (new) 8a. The EU Centre should support Member States in the introduction and implementation, in formal education, of sex education programmes that promote the empowerment of children to be agents of their own protection: concepts of the body and its boundaries, concepts of intimacy and privacy, and the ability to verbalise and state what happened to those to whom they are closest emotionally or to whom they feel more connected.
Amendment 434 #
Proposal for a regulation Article 25 – paragraph 9 a (new) 9a. When communicating with or making decisions affecting the victims or persons in high-risk groups, the coordinating body should fully respect human and civil rights of dignity and privacy, as well as take into consideration the gender and age of the victim or party involved.
Amendment 435 #
Proposal for a regulation Article 26 – paragraph 1 Amendment 436 #
Proposal for a regulation Article 26 – paragraph 2 – point a (a) are legally and functionally independent from
Amendment 437 #
Proposal for a regulation Article 26 – paragraph 2 – point e Amendment 438 #
Proposal for a regulation Article 26 – paragraph 3 a (new) 3a. Paragraph 2 shall not prevent the Coordinating Authority from membership in any recognised international network, provided that such membership does not prejudice its independent character;
Amendment 439 #
Proposal for a regulation Article 26 – paragraph 4 4. The Coordinating Authorities shall ensure that relevant members of staff have the required qualifications, experience and technical skills to perform their duties. They shall also ensure that members of staff coming into contact with victims are adequately and frequently trained in intersectional victim support.
Amendment 440 #
Proposal for a regulation Article 26 – paragraph 4 a (new) 4a. The Coordinating Authorities shall ensure that the appointment of management and hiring of staff is subject to an employment background check,
Amendment 441 #
Proposal for a regulation Article 26 – paragraph 5 5. The management and other staff of the Coordinating Authorities shall, in accordance with Union or national law, be subject to a duty of professional secrecy both during and after their term of office, with regard to any confidential information which has come to their knowledge in the course of the performance of their tasks. Member States shall ensure that the management and other staff are subject to rules guaranteeing that they can carry out their tasks in an objective, impartial and independent manner, in particular as regards their appointment, dismissal, remuneration and career prospects. Coordinating Authorities shall take into account the application of Directive 2021/93/EU on Pay Transparency.
Amendment 442 #
Proposal for a regulation Article 34 – paragraph 1 1. Users shall have the right to lodge a complaint
Amendment 443 #
Proposal for a regulation Article 34 – paragraph 1 a (new) 1a. The Coordinating Authority shall offer easy-to-use mechanisms to anonymously submit information about infringements of this Regulation.
Amendment 444 #
Proposal for a regulation Article 34 – paragraph 2 2. Coordinating Authorities shall provide
Amendment 445 #
Proposal for a regulation Article 34 – paragraph 3 – subparagraph 1 a (new) Users shall have the right to be informed of the outcome of the complaint.
Amendment 446 #
Proposal for a regulation Article 39 – paragraph 1 1. Coordinating Authorities shall cooperate with each other, any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre and other relevant Union agencies, including Europol, to facilitate the performance of their respective tasks under this Regulation and ensure its effective, efficient and consistent application and enforcement. Coordinating Authorities shall establish systematic practices for the exchange of information and best practices related to the prevention and combating of online child sexual abuse and solicitation of children.
Amendment 447 #
Proposal for a regulation Article 40 – paragraph 2 2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse, by gathering and sharing information, gender- and age-disaggregated statistics and expertise, and by facilitating cooperation and the sharing of best practices between relevant public and private parties in connection with the prevention and combating of child sexual abuse, in particular online.
Amendment 448 #
Proposal for a regulation Article 43 – title Tasks of the EU
Amendment 449 #
Proposal for a regulation Article 43 – paragraph -1 (new) -1 The objective of the Agency shall be to provide the relevant institutions, bodies, offices and agencies of the EU and of its Member States, as well as civil society organisations and research bodies when involved in implementing EU law, with assistance, expertise and coordination in relation to the prevention and combating of child sexual abuse, in order to support them when taking measures or formulating courses of action within their respective spheres of competence, with full respect for fundamental rights.
Amendment 450 #
Proposal for a regulation Article 43 – paragraph 1 – introductory part The EU
Amendment 451 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point a (a) supporting the Commission in the preparation of the guidelines referred to in Article 3(8), Article 4(5), Article 6(4) and Article 11, including by collecting and providing relevant gender-sensitive and age-disaggregated information, expertise and best practices, taking into account advice from the Technology Committee and the Survivor’s Advisory Board referred to in Articles 66 and 66a (new);
Amendment 452 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point a (a) supporting the Commission in the preparation of the guidelines referred to in Article 3(8), Article 4(5), Article 6(4) and Article 11, including by collecting and providing relevant sex disaggregated information, expertise and best practices, taking into account advice from the Technology Committee referred to in Article 66;
Amendment 453 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point b (b) upon request from a provider of relevant information services, providing an analysis of
Amendment 454 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 – introductory part (2) facilitate the
Amendment 455 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 – point a Amendment 456 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 – point b (b) maintaining and operating the databases of indicators
Amendment 457 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 – point c (c) giving providers of hosting services and providers of interpersonal communications services that received a
Amendment 458 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 – point d (d) making technologies available to providers for the execution of
Amendment 459 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – point b Amendment 46 #
Proposal for a regulation Citation 6 a (new) Having regard to the complementary impact assessment 37a of the European Parliament, _________________ 37a PE 740.248, https://www.europarl.europa.eu/RegData/etudes/STUD/2023/740248/EPRS_STU(2023)740248_EN.pdf
Amendment 460 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – point c Amendment 461 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – point d (d) providing information, assistance and support to victims in accordance with Articles 20 and 21;
Amendment 462 #
Proposal for a regulation Article 43 – paragraph 1 – point 5 – introductory part (5) support the Coordinating
Amendment 463 #
Proposal for a regulation Article 43 – paragraph 1 – point 5 – point f (f) providing information to Coordinating Authorities, upon their request or on its own initiative, relevant for the performance of their tasks under this Regulation, including by informing the Coordinating Authority of establishment of potential infringements identified in the performance of the EU
Amendment 464 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – introductory part (6) facilitate the generation
Amendment 465 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – introductory part (6) facilitate the generation and sharing of knowledge and best practices with other Union institutions, bodies, offices and agencies, Coordinating Authorities or other relevant authorities of the Member States to contribute to the achievement of the objective of this Regulation, by:
Amendment 466 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a (a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise, with particular consideration of age and gender, on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51;
Amendment 467 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a (a) collecting, recording, analysing and providing gender and age disaggregated data and information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse and victim support, in accordance with Article 51;
Amendment 468 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of research and expertise
Amendment 469 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of gender-responsive research and expertise on those matters and on gender and age sensitive assistance to victims, including by serving as a hub of expertise to support evidence-based policy and by linking researchers to practitioners;
Amendment 47 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Over the last 20 years there has been a dangerous rise in child sexual abuse material (CSAM), following growing technological development and connectivity. The cases of online CSAM and the frequency of grooming activities, targeting younger children and especially girls, are increasing drastically 1a. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-
Amendment 470 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of research and expertise on those matters and on assistance to victims in a gender sensitive and age specific way, including by serving as a hub of expertise to support evidence-based policy;
Amendment 471 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (ba) contributing to awareness raising campaigns that are gender-targeted within the European Union and among Member States.
Amendment 472 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new) (bb) facilitating the exchange of best practices among Coordinating Authorities. All those tasks shall be performed in the best interests of the children.
Amendment 473 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b c (new) (bc) supporting the production of age-appropriate and gender-sensitive educational material.
Amendment 474 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point c a (new) (ca) establishing mechanisms to listen to and incorporate the views of children in its work, in accordance with the UNCRC, Directive 2012/29/EU and the Charter of Fundamental Rights of the European Union.
Amendment 475 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 a (new) 6a) When communicating with or making decisions affecting the victims or persons in high-risk groups, the EU Centre should fully respect human and civil rights of dignity and privacy, as well as take into consideration the gender and age of the victim or party involved.
Amendment 476 #
Proposal for a regulation Article 43 – paragraph 1 – subparagraph 1 (new) (7) refer victims to the appropriate bodies and services for relevant victim support and assistance according to their needs; (8) set up a public anonymous reporting service for reports concerning child sexual abuse material for all persons in the Union.
Amendment 477 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 a (new) (6a) facilitate and coordinate cooperation, including information sharing, with international law enforcement organisations, law enforcement authorities in third countries, in respect of Data Protection rules;
Amendment 478 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 1 The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10
Amendment 479 #
The EU
Amendment 48 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. While the proportion of child sexual abuse material that affects boys is growing, child sexual abuse has a disproportionate impact on girls: the vast majority of child sexual abuse material depicts girls, girls are overrepresented in cases of solicitation of children, and men are overrepresented as perpetrators. According to reports, an estimated 96% of child sexual abuse material in 2021 affected girls. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-
Amendment 480 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 2 To that aim, the EU
Amendment 481 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 3 Before including specific technologies on those lists, the EU
Amendment 482 #
Proposal for a regulation Article 50 – paragraph 2 – introductory part 2. The EU
Amendment 483 #
Proposal for a regulation Article 50 – paragraph 2 – introductory part 2. The EU Centre shall collect, record, analyse and make available
Amendment 484 #
Proposal for a regulation Article 50 – paragraph 2 – introductory part 2. The EU Centre shall collect, record, analyse and make available relevant, objective, reliable and comparable information on matters related to the prevention and education or awareness raising campaigns for combating
Amendment 485 #
Proposal for a regulation Article 50 – paragraph 2 – point a (a) information obtained in the performance of its tasks under this
Amendment 486 #
Proposal for a regulation Article 50 – paragraph 2 – point c (c) information resulting from research or other activities conducted by Member States’ authorities, other Union institutions, bodies, offices and agencies, the competent authorities of third countries, international organisations, research centres and civil society organisations including hotlines.
Amendment 487 #
Proposal for a regulation Article 50 – paragraph 2 – subparagraph 1 (new) (d) information obtained in the performance of its tasks under this Regulation concerning victim assistance and support
Amendment 488 #
3. Where necessary for the performance of its tasks under this Regulation, the EU Centre shall carry out, participate in or encourage and support research, surveys and studies, taking into consideration age and gender, either on its own initiative or, where appropriate and compatible with its priorities and its annual work programme, at the request of the European Parliament, the Council or the Commission.
Amendment 489 #
Proposal for a regulation Article 50 – paragraph 3 3. Where necessary for the performance of its tasks under this Regulation, the EU
Amendment 49 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), and to protect society at large.
Amendment 490 #
Proposal for a regulation Article 50 – paragraph 4 4. The EU
Amendment 491 #
Proposal for a regulation Article 50 – paragraph 5 5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations, public authorities and providers of hosting or interpersonal communication services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. Communication campaigns shall be designed on the basis of the latest information available, formulated together with specialised experts or psychologists, adapted to children and presented in a way that is easy for children to understand. Those campaigns shall take into account the advice of the Victims' Consultative Forum.
Amendment 492 #
Proposal for a regulation Article 50 – paragraph 5 5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. All EU Centre actions should take particular account of age, should be adapted to the specifics and variable requirements of the gender of the recipients, and should fully respect personal dignity and privacy.
Amendment 493 #
Proposal for a regulation Article 50 – paragraph 5 5. The EU
Amendment 494 #
Proposal for a regulation Article 54 – paragraph 1 1. Where necessary for the performance of its tasks under this Regulation, the EU
Amendment 495 #
Proposal for a regulation Article 54 – paragraph 1 1. Where necessary for the performance of its tasks under this Regulation, the EU Centre
Amendment 496 #
Proposal for a regulation Article 54 – paragraph 2 2. The EU
Amendment 497 #
Proposal for a regulation Article 55 – paragraph 1 – introductory part The administrative and management structure of the EU
Amendment 498 #
Proposal for a regulation Article 55 – paragraph 1 – subparagraph 1 (new) (f) a Survivors’ Advisory Board which shall exercise the tasks set out in Article 66b.
Amendment 499 #
Proposal for a regulation Article 56 – paragraph 1 1. The Management Board shall be composed of one representative from each Member State
Amendment 50 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in
Amendment 500 #
Proposal for a regulation Article 56 – paragraph 1 1. The Management Board shall be composed of one representative from each Member State
Amendment 501 #
Proposal for a regulation Article 56 – paragraph 1 – subparagraph 1 (new)
Amendment 502 #
Proposal for a regulation Article 56 – paragraph 2 – subparagraph 1
Amendment 503 #
Proposal for a regulation Article 56 – paragraph 2 – subparagraph 1
Amendment 504 #
Proposal for a regulation Article 56 – paragraph 2 – subparagraph 2 Europol
Amendment 505 #
Proposal for a regulation Article 56 – paragraph 3 3. Each member of the Management Board shall have an alternate. The alternate shall represent the member in
Amendment 506 #
Proposal for a regulation Article 56 – paragraph 3 a (new) 3a. The Chair and the Deputy Chair of the Management Board shall be elected by all representatives by simple majority for a term of four years, renewable once.
Amendment 507 #
Proposal for a regulation Article 56 – paragraph 4 4. Members of the Management Board and their alternates shall be appointed in the light of their
Amendment 508 #
Proposal for a regulation Article 56 – paragraph 4 4. Members of the Management Board and their alternates shall be appointed in the light of their knowledge in the field of combating child sexual abuse, taking into account objective and neutral sexual criteria, as well as relevant managerial, administrative and budgetary skills. Member States shall appoint a representative of their Coordinating Authority, within four months of [date of entry into force of this Regulation]. All
Amendment 509 #
Proposal for a regulation Article 57 – paragraph 1 – point a (a) give the general orientations for the EU
Amendment 51 #
Proposal for a regulation Recital 1 a (new)
Amendment 510 #
Proposal for a regulation Article 57 – paragraph 1 – point c (c) adopt transparency rules for the prevention and management of conflicts of interest in respect of its members, as well as for the members of the Technological Committee and of
Amendment 511 #
Proposal for a regulation Article 57 – paragraph 1 – point f (f) appoint the members of the Technology Committee, the Victims' Consultative Forum and of any other advisory group it may establish for serving its purposes;
Amendment 512 #
Proposal for a regulation Article 57 – paragraph 1 – point f (f) appoint the members of the Technology Committee, and
Amendment 513 #
Proposal for a regulation Article 57 – paragraph 1 – point g
Amendment 514 #
Proposal for a regulation Article 57 – paragraph 1 – subparagraph 1 (new) (i) consult the Survivors’ Advisory Board as regards the obligations referred to in points (a), and (h) of this Article.
Amendment 515 #
Proposal for a regulation Article 59 – paragraph 3 3. The Management Board shall hold at least two ordinary meetings a year. In addition, it shall meet on the initiative of its Chairperson, at the request of the Commission, or at the request of at least one-third of its members. The Management Board may invite the members of the Victims' Consultative Forum at least twice a year.
Amendment 516 #
Proposal for a regulation Article 61 – paragraph 1 – subparagraph 1 The Executive Board shall be composed of the Chairperson and the Deputy Chairperson of the Management Board, two other members appointed by the Management Board from among its members with the right to vote and
Amendment 517 #
Proposal for a regulation Article 61 – paragraph 2 2. The term of office of members of the Executive Board shall be four years. In the course of the 12 months preceding the end of the four-year term of office of the Chairperson and five members of the Executive Board, the Management Board or a smaller committee selected among Management Board members including a Commission representative shall carry out an assessment of performance of the Executive Board. The assessment shall take into account an evaluation of the Executive Board members’ performance and the EU
Amendment 518 #
Proposal for a regulation Article 64 – paragraph 4 – point e a (new) (ea) implementing gender mainstreaming and gender budgeting in all areas, including drafting a gender action plan (GAP).
Amendment 519 #
Proposal for a regulation Article 64 – paragraph 4 – point f (f) preparing the Consolidated Annual Activity Report (CAAR) on the EU
Amendment 52 #
Proposal for a regulation Recital 1 a (new) (1a) In order to effectively prevent online child sexual abuse, Member States are called upon to strengthen preventive measures, including conducting awareness-raising campaigns for parents and educators, delivering relationships and sex education that is appropriate for the age of the children in collaboration with parents or with their consent, delivering training in the use of digital tools, in particular including education on the sensible use of screens, and sounding the alarm about the dangers of children being exposed to pornography. Member States should ensure they make specialised age-appropriate support services available to abuse victims. Member States are called upon to combat the culture of impunity that may result from an ineffective and lax judicial system and not to tolerate the sexualisation of children in the media or in artistic culture.
Amendment 520 #
Proposal for a regulation Article 64 – paragraph 4 – point g (g) preparing an action plan following up on the conclusions of internal or external audit reports and evaluations, as well as investigations by the European Anti-Fraud Office (OLAF) and by the European Public
Amendment 521 #
Proposal for a regulation Article 64 – paragraph 4 – point p (p) fostering recruitment of appropriately skilled and experienced EU Centre staff,
Amendment 522 #
Proposal for a regulation Article 66 – paragraph 1 1. The Technology Committee shall consist of technical and online sexual abuse material experts appointed by the Management Board in view of their excellence and their independence from corporate interests, following the publication of a call for expressions of interest in the Official Journal of the European Union. Its members shall be appointed for a term of four years, renewable once. On the expiry of their term of office, members shall remain in office until they are replaced or until their appointments are renewed. If a member resigns before the expiry of his or her term of office, he or she shall be replaced for the remainder of the term by a member appointed by the Management Board. The Committee shall be gender-balanced in its composition.
Amendment 523 #
Proposal for a regulation Article 66 – paragraph 1 1. The Technology Committee shall consist of technical experts appointed by the Management Board in view of their expertise and skills, taking into account objective and neutral sexual criteria and in view of their excellence and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union.
Amendment 524 #
Proposal for a regulation Article 66 – paragraph 1 1. The Technology Committee shall consist of technical experts appointed by the Management Board in view of their excellence and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union, as well as a representative from the European Data Protection Board.
Amendment 525 #
Proposal for a regulation Article 66 – paragraph 4 4. When a member no longer meets the criteria of independence, he or she shall inform the Management Board. Alternatively, the Management Board may declare, on a proposal of at least one third of its members or of the Commission, a lack of independence and revoke the appointment of the person concerned. The Management Board shall appoint a new member for the remaining term of office in accordance with the procedure for ordinary members.
Amendment 526 #
Proposal for a regulation Article 66 a (new)
Amendment 527 #
Proposal for a regulation Article 66 a (new)
Article 66a
Establishment and tasks of the Victims’ Consultative Forum
1. The EU Centre shall establish a Victims' Consultative Forum to assist it by providing independent advice on victim-related matters. The Consultative Forum will act upon request of the Management Board or the Executive Director of the EU Centre.
2. The Victims' Consultative Forum shall consist of a maximum of fifteen members. Members of the Victims' Consultative Forum shall be appointed by the Management Board and will be called to provide advice at least twice per year. They will include victims of child sexual abuse and exploitation, both online and offline, as well as representatives of organisations acting in the public interest against child sexual abuse and promoting victims’ rights. They shall be appointed following the publication of a call for expressions of interest in the Official Journal of the European Union.
3. The Victims' Consultative Forum shall:
a) provide the Management Board and the Executive Director with advice on matters related to victims in an age- and gender-appropriate manner;
b) contribute to the EU Centre communication strategy referred to in Article 50(5);
c) provide its opinion and expertise on the technologies used to detect online child sexual abuse regarding their relevance to the conditions in which child sexual abuse is committed;
d) maintain an open dialogue with the Management Board and the Executive Director on all matters related to victims, particularly on the protection of victims’ rights, taking into account specific factors such as the age, gender and disability of victims;
e) gather, including through consultation and participation of children, the views and perspectives of children on specific issues of relevance;
f) contribute to EU-wide awareness-raising campaigns by providing related material and information.
Amendment 528 #
Proposal for a regulation Article 67 – paragraph 1 1. Each year the Executive Director shall draw up a draft statement of estimates of the EU
Amendment 529 #
Proposal for a regulation Article 69 – paragraph 4 4. The EU Centre’s expenditure shall include staff remuneration, administrative and infrastructure expenses, and operating costs, including the operating costs of the Technology Committee, the Victims' Consultative Forum and of any other advisory group it may establish for serving its purposes.
Amendment 53 #
Proposal for a regulation Recital 1 a (new) (1a) A growing number of teenagers are sharing intimate images, despite this being prohibited in a majority of Member States. The implementation of measures to detect new abuse material would inevitably flag all such images as abuse material, resulting in a large number of false positives, but also in the investigation of those teenagers. This would significantly infringe on children’s right to privacy, as guaranteed by the Charter of Fundamental Rights of the European Union (‘Charter’) and in line with the United Nations Convention on the Rights of the Child (UNCRC), which has been ratified by all Member States. It would also result in stigma that disproportionately affects girls 37b. Therefore services should warn children about the risks of sharing images, and give them guidance on what to do if they do so and something goes wrong. _________________ 37b The outcomes of sexting for children and adolescents: A systematic review of the literature https://doi.org/10.1016/j.adolescence.2021.08.009
Amendment 530 #
4. The EU
Amendment 531 #
Proposal for a regulation Article 69 – paragraph 5 a (new) 5a. The budget shall comply with the principle of gender mainstreaming and the practice of gender budgeting shall be implemented.
Amendment 532 #
Proposal for a regulation Article 71 – paragraph 2 2. The Executive Board, in agreement with the Commission, shall adopt the necessary implementing measures, in accordance with the arrangements provided for in Article 110 of the Staff Regulations
Amendment 533 #
Proposal for a regulation Article 71 – paragraph 3 3. The EU
Amendment 534 #
Proposal for a regulation Article 72 – paragraph 1 1. The EU
Amendment 535 #
(c) the total number of items of child sexual abuse material, where possible disaggregated by age and sex, that the provider removed or to which it disabled access, broken down by whether the items were removed or access thereto was disabled pursuant to a removal order or to a notice submitted by a Competent Authority, the EU Centre or a third party or at the provider’s own initiative;
Amendment 536 #
Proposal for a regulation Article 83 – paragraph 2 – point a – indent 2 – where the report led to the launch of a criminal investigation or contributed to an ongoing investigation, the state of play or outcome of the investigation, including whether the case was closed at pre-trial stage, whether the case led to the imposition of penalties, whether victims were identified and rescued and if so their numbers differentiating by
Amendment 54 #
Proposal for a regulation Recital 1 a (new) (1a) Service providers must ensure that the online environment protects users, especially children, and must prevent publications that may automatically display content liable to foster the development of addictions, for instance to pornography, drugs or gaming, or aggressive content that manipulates or coerces users’ free will or decision-making abilities, particularly among minors, who are in the midst of the personal development process.
Amendment 55 #
Proposal for a regulation Recital 1 b (new)
Amendment 56 #
Proposal for a regulation Recital 1 b (new) (1b) Often, teenagers are manipulated into sharing images, or consensually share images which are later shared without their consent. This proposal should provide teenagers with tools to help prevent images from being shared without their consent, in particular through the possibility to submit the image to an EU "take it down" service, which prevents the image from being uploaded to social media websites.
Amendment 57 #
Proposal for a regulation Recital 1 c (new) (1c) The use of software to detect solicitation of children is insufficiently accurate, which means it could result in false positives, or could inadvertently flag child-to-child communications. This poses significant risks, in particular to LGBTQI+ children in hostile households.
Amendment 58 #
Proposal for a regulation Recital 1 d (new) (1d) The combined risks of attempting to detect unknown abuse material and solicitation pose a significant risk to children, and these technologies are also vulnerable to being bypassed by abusers, rendering them ineffective. This legislation should therefore focus on detecting known content, on flagging potential solicitation to the child user in an age-appropriate manner, and on reducing the creation and sharing of self-generated material.
Amendment 59 #
Proposal for a regulation Recital 2 (2) Given the central importance of
Amendment 60 #
Proposal for a regulation Recital 2 a (new) (2a) Abuse, exploitation and sexual violence against children are becoming more and more commonplace due to fast-evolving and advanced ICT (such as webcams, live streaming, social media platforms or computer games); there is a differentiation between genders in terms of online child sexual abuse, where girls are the main target group and are two to three times more vulnerable to sexual abuse than boys; it should be noted that statistics on the abuse of boys are often underestimated and such cases are less frequently reported;
Amendment 61 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, may have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.
Amendment 62 #
Proposal for a regulation Recital 3 a (new) (3a) In order to effectively prevent child sexual abuse, Member States should: plan intervention by public and community bodies in the prevention of child exploitation and sexual abuse; implement effective intervention measures aimed at preventing risks of acts of sexual exploitation and sexual abuse against children; organise specific education campaigns on the protection and rights of children; implement actions to disseminate administrative measures, policies and social programmes with the aim of preventing the occurrence of acts of child sexual exploitation and sexual abuse; develop public awareness programmes through the media, on the phenomenon of sexual exploitation and on child sexual abuse, having regard to age and gender; ensure the promotion of policies to prevent child sexual exploitation and sexual abuse, particularly in the areas of Justice, Education, Health and Social Action; establish and disseminate effective social programmes to support victims, their families and any person to whom they are entrusted; enhance active social responses and multidisciplinary structures aimed at providing support to victims, with the necessary protection and assistance measures.
Amendment 63 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should directly contribute to
Amendment 64 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is demonstrably and durably effective and that respects
Amendment 65 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology-neutral and future- proof manner, so as not to hamper innovation.
Amendment 66 #
Proposal for a regulation Recital 4 a (new)
Amendment 67 #
Proposal for a regulation Recital 4 a (new) (4a) In order to effectively prevent child sexual abuse, Member States should enhance all preventive measures, in particular, providing existing mechanisms, such as means for monitoring risk situations and identifying problems; Member States should also speed up the justice system assessment and response, carry out a rapid psychological assessment, protect children from contact with the abuser and support the family; it is also necessary to ensure priority mechanisms for continuous, universal and free therapeutic support to child victims, who can benefit from it throughout their lives, and to provide capacity to public health systems.
Amendment 68 #
Proposal for a regulation Recital 4 a (new) (4a) The existence of Child Sexual Abuse Material implies that child sexual abuse has already taken place. Detecting abuse material is important, but prevention is also vital. Therefore, Member States should significantly strengthen educational measures to help children, teachers and social services identify and report abuse, in particular by teaching children about consent from the earliest age possible, albeit in an age-appropriate manner.
Amendment 69 #
Proposal for a regulation Recital 4 b (new) (4b) Prevention plays a very significant role. Member States should ensure that they address the problem of online solicitation of children by providing efficient tools and materials for their digital education. Children should be given the necessary digital skills at home and in school in order to fully benefit from all the fields of the digital world, whilst ensuring their safety in the cyber sphere. The role of the parents is crucial. Parents should, firstly, be able to control and adequately supervise children's behaviour on their online devices and advise them on safe online browsing; secondly, they should be able to recognise children’s behaviours that result from online sexual abuse and, where necessary, prevent potentially dangerous developments. Additionally, targeted cooperation with online platforms on age-appropriate and gender-targeted awareness-raising campaigns, tailored to the specific needs and interests of children, can complement and support the education and guidance of parents and children.
Amendment 70 #
Proposal for a regulation Recital 4 b (new) (4b) Many of the online risks associated with child abuse continue to pose a threat to adults, and many adults have already fallen victim. This Regulation should therefore also focus on the prevention of online risks, mandating the integration into applications of features that help children learn about, identify and avoid risks, making use of a "learning through doing" approach.
Amendment 71 #
Proposal for a regulation Recital 4 c (new) (4c) Fighting these brutal crimes, both online and in the real world, is a fundamental priority. In addition, it is essential to establish a fair balance between measures to protect child victims of sexual abuse and their fundamental rights and the protection of personal data, private and family life, freedom of expression and information. No image of a child should be used in the production of illegal content and no child should be re-victimised by the sharing or repeated dissemination of child sexual abuse material, which may reach extreme levels in cases of so-called 'highly traded' material. The EU Centre and Coordinating Authorities should ensure the protection and support of the victims concerned when dealing with such requests in cases of highly traded child sexual abuse material.
Amendment 72 #
Proposal for a regulation Recital 4 c (new) (4c) The internet is an empowering and beneficial resource for children, allowing them to socialise, learn and play; however, it can also pose significant risks. Many online services have set limits on the features accessible to children in order to mitigate these risks; however, depriving children of these features often encourages them to lie about their age, or to try to evade age-verification systems. Therefore, rather than prohibiting access, services should focus on adapting their features and implementing safeguards for children.
Amendment 73 #
Proposal for a regulation Recital 4 d (new) (4d) The regulatory measures to address the dissemination of CSAM online shall be complemented by EU-wide campaigns coordinated by the EU Centre and the Coordinating Authorities of the Member States. Those campaigns shall include increasing public information and awareness of the phenomenon, promoting child-friendly and age-appropriate reporting, as well as informing about victims' rights. Children need to be educated, in a child-friendly and child-sensitive way, about the hidden risks of taking pictures or recording videos and sharing intimate images of themselves. Age-appropriate comprehensive sexual education is thus very important in order to help children develop their understanding of what constitutes a healthy relationship at an early age.
Amendment 74 #
Proposal for a regulation Recital 4 d (new) (4d) Developers should focus on responsibility by design, with the goal of preventing abuse, developing risk- mitigation and safety features for applications. To achieve this, it is important that developers understand how children use their services, and the threats they face. Therefore, children should be involved in the development process of risk-mitigation and safety features that are built for them.
Amendment 75 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services
Amendment 76 #
Proposal for a regulation Recital 6 (6) Online child sexual abuse
Amendment 77 #
Proposal for a regulation Recital 9 (9) Article 15(1) of Directive 2002/58/EC allows Member States to adopt legislative measures to restrict the scope of the rights and obligations provided for in certain specific provisions of that Directive relating to the confidentiality of communications when such restriction constitutes a necessary, appropriate and
Amendment 78 #
Proposal for a regulation Recital 9 a (new) (9a) Case law of the European Court of Justice 43a has repeatedly found that indiscriminate monitoring of communications is incompatible with the Charter of Fundamental Rights of the European Union; therefore detection orders should be targeted at individuals or groups suspected of child sexual abuse, and not at the wider population. _________________ 43a Cases C-511/18, C-512/18, C-520/18 and C-623/17, Court of Justice of the European Union
Amendment 79 #
Proposal for a regulation Recital 13 (13) The term ‘online child sexual abuse’ should cover not only the dissemination of material previously detected and confirmed as constituting child sexual abuse material (‘known’ material), but also of material not previously detected
Amendment 80 #
Proposal for a regulation Recital 14 (14) With a view to minimising the risk that their services are misused for the dissemination of
Amendment 81 #
Proposal for a regulation Recital 15 (15) Some of those providers of relevant information society services in scope of this Regulation may also be subject to an obligation to conduct a risk assessment under Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] with respect to information that they store and disseminate to the public, which should form the basis for the risk assessment under this instrument. For the purposes of the present Regulation, those providers may draw on such a risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse, as
Amendment 82 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable measures to mitigate
Amendment 83 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child,
Amendment 84 #
Proposal for a regulation Recital 16 a (new) (16a) Parental controls that allow parents to access children’s private correspondence without their consent pose a significant risk to children’s privacy, but could also put their safety at risk, in particular in the cases of children who are being abused and who are trying to seek help, and LGBTQI+ children in hostile households. Therefore, no provision in this legislation should enable or facilitate intrusions on children’s privacy.
Amendment 85 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk
Amendment 86 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect or prevent online child sexual abuse in their services and indicate as part of the risk reporting their willingness and preparedness to eventually be issued a detection order under this Regulation, if deemed necessary by the competent national authority.
Amendment 87 #
Proposal for a regulation Recital 17 a (new) (17a) Relying on providers for risk mitigation measures comes with inherent problems, as business models, technologies and crimes evolve continuously. As a result, clear targets, oversight, review and adaptation, led by national supervisory authorities, are needed to avoid measures becoming redundant, disproportionate, ineffective, counterproductive and outdated.
Amendment 88 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available number-independent interpersonal communications services should, when designing and implementing the
Amendment 89 #
Proposal for a regulation Recital 19 (19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should
Amendment 90 #
Proposal for a regulation Recital 19 (19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take certain reasonable measures
Amendment 91 #
Proposal for a regulation Recital 19 a (new) (19a) Regulation (EU) 2022/1925 (the Digital Markets Act) sets out provisions to ensure competition in mobile device ecosystems, which would allow citizens to install software on their mobile devices directly, without using software application stores, bypassing age verification at the level of software application stores. Therefore, manufacturers of operating systems deemed as gatekeepers under the Digital Markets Act should provide an application programming interface through which applications can request age verification, either through the European Digital Identity Wallet as defined in Regulation (EU) No XXX/2023 establishing a framework for a European Digital Identity, or through a third-party service. Manufacturers of operating systems deemed as gatekeepers should also provide a service to process age-verification requests in a manner that respects the privacy of the user and does not store a record of the services they accessed.
Amendment 92 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when
Amendment 93 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards, and targeted only to individuals suspected of child sexual abuse. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to
Amendment 94 #
Proposal for a regulation Recital 21 (21) Furthermore, as parts of those limits and safeguards,
Amendment 95 #
Proposal for a regulation Recital 22 (22)
Amendment 96 #
Proposal for a regulation Recital 22 (22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected
Amendment 97 #
Proposal for a regulation Recital 24 (24) The competent judicial authority or the competent independent administrative authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should be in a position to take a well-informed decision on requests for the issuance of detection orders. That is of particular importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach, especially in connection to detection orders concerning the solicitation of children. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual
Amendment 98 #
Proposal for a regulation Recital 24 (24) The competent judicial authority
Amendment 99 #
Proposal for a regulation Recital 25
source: 746.982
2023/07/28
LIBE
1633 amendments...
Amendment 1000 #
Proposal for a regulation Article 7 – paragraph 6 Amendment 1001 #
Proposal for a regulation Article 7 – paragraph 6 Amendment 1002 #
Proposal for a regulation Article 7 – paragraph 6 Amendment 1003 #
Proposal for a regulation Article 7 – paragraph 6 Amendment 1004 #
Proposal for a regulation Article 7 – paragraph 6 Amendment 1005 #
Proposal for a regulation Article 7 – paragraph 6 Amendment 1006 #
Proposal for a regulation Article 7 – paragraph 7 Amendment 1007 #
Proposal for a regulation Article 7 – paragraph 7 Amendment 1008 #
Proposal for a regulation Article 7 – paragraph 7 Amendment 1009 #
Proposal for a regulation Article 7 – paragraph 7 Amendment 1010 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 Amendment 1011 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 Amendment 1012 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 1 Amendment 1013 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 2 Amendment 1014 #
Proposal for a regulation Article 7 – paragraph 7 – subparagraph 2 The detection orders concerning the solicitation of children shall apply only to interpersonal communications where one of the users is a child user and the other one an adult.
Amendment 1015 #
Proposal for a regulation Article 7 – paragraph 8 Amendment 1016 #
Proposal for a regulation Article 7 – paragraph 8 Amendment 1017 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall, in accordance with Article 8 of Regulation (EU) 2022/2065, target and specify it in such a manner that the negative consequences referred to in paragraph
Amendment 1018 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or
Amendment 1019 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall in accordance with Article 8 of Regulation (EU) 2022/2065 target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary, justifiable and proportionate to effectively address the
Amendment 1020 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the judicial validation and issuance of detection orders, and the competent judicial
Amendment 1021 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection orders, and
Amendment 1022 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 1 The Coordinating Authority of establishment when requesting the issuance of detection
Amendment 1023 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 2 To that aim, they shall take into account all relevant parameters, including: (i) the availability of sufficiently reliable detection technologies in that they can be deployed without undermining the security of the service in question and they limit to the maximum extent possible the rate of errors regarding the detection
Amendment 1024 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 2 To that aim, they shall take into account all relevant parameters, including the technical feasibility, availability of sufficiently reliable detection technologies in that they limit to the maximum extent possible the rate of errors regarding the detection and their suitability and effectiveness for achieving the objectives of this Regulation,
Amendment 1025 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 2 To that aim, they shall take into account all relevant parameters, including: (i) the availability of sufficiently reliable detection technologies in that they can be deployed without undermining the security of the service in question and they limit to the maximum extent possible the rate of errors regarding the detection
Amendment 1026 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 2 To that aim, they shall take into account all relevant parameters, including the technical feasibility, availability of sufficiently reliable detection technologies in that they limit to the maximum extent possible the rate of errors regarding the detection and their suitability and effectiveness for achieving the objectives of this Regulation,
Amendment 1027 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 2 To that
Amendment 1028 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 Amendment 1029 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 Amendment 1030 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point a (a) where th
Amendment 1031 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point a (a) where the information gathered in the risk assessment process indicates that risk is limited to an identifiable part or component of a service, where possible without prejudice to the effectiveness of the measure, the required measures are only applied in respect of that part or component;
Amendment 1032 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point a (a) where th
Amendment 1033 #
Proposal for a regulation Article 7 – paragraph 8 – subparagraph 3 – point b (b) where necessary, in particular to limit such negative consequences, effective and proportionate safeguards additional to those listed in Article 10(4)
Amendment 1034 #
Proposal for a regulation Article 7 – paragraph 9 Amendment 1035 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority or independent administrative authority shall specify in the detection order the period during which it applies, indicating the start date and the end date
Amendment 1036 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 Amendment 1037 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority or independent administrative authority shall specify in the targeted detection order the period during which it applies, indicating the start date and the end date.
Amendment 1038 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 1 The competent judicial authority
Amendment 1039 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 2 The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the detection
Amendment 1040 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 2 The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the targeted detection order. It shall not be earlier than three months from the date at which the provider received the targeted detection order and not be later than 12 months from that date.
Amendment 1041 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 2 The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the detection
Amendment 1042 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 The period of application of detection orders
Amendment 1043 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 The period of application of detection
Amendment 1044 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 The period of application of targeted detection orders concerning the dissemination of known or new child sexual abuse material shall not exceed 24
Amendment 1045 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 The period of application of detection
Amendment 1046 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 a (new) The European Data Protection Board shall also issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the detection of child sexual abuse material in encrypted and non-encrypted environments. Supervisory authorities as referred to in that Regulation shall supervise the application of those guidelines. Prior to the use of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
Amendment 1047 #
Proposal for a regulation Article 7 – paragraph 9 – subparagraph 3 b (new) The competent supervisory authorities designated pursuant to Chapter VI, Section 1, of Regulation (EU) 2016/679 shall have the right to challenge a detection warrant within the competence pursuant to Chapter VI, Section 2 of Regulation (EU) 2016/679 before the courts of the Member State of the competent judicial authority that issued the detection warrant.
Amendment 1048 #
Proposal for a regulation Article 7 – paragraph 9 a (new) 9a. The competent supervisory authorities designated pursuant to Chapter VI, Section 1, of Regulation (EU) 2016/679 shall have the right to challenge a detection warrant within the competence pursuant to Chapter VI, Section 2 of Regulation (EU) 2016/679 before the competent judicial authority that issued the detection warrant.
Amendment 1049 #
Proposal for a regulation Article 7 a (new) Article 7a Safeguards on encrypted services For the scope of this Regulation and for the sole purpose of preventing and combating child sexual abuse, providers of interpersonal communications services shall be subject to obligations to prevent, detect, report and remove online child sexual abuse on all their services, which may include those covered by end-to-end encryption, when there is a significant risk that their specific service is misused for online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment established in Article 3 of this Regulation. The technologies deployed to execute the detection order pursuant to Article 7 of this Regulation shall never prohibit encryption or make it impossible and shall only be deployed after a prior authorization by the Coordinating Authority, in consultation with the competent data protection authority, and be subject to constant monitoring and auditing by the competent data protection authority to verify their compliance with Union law.
Amendment 1050 #
Proposal for a regulation Article 7 a (new) Article 7a Safeguards on encrypted services For the scope of this Regulation and for the sole purpose of preventing and combating child sexual abuse, providers of interpersonal communications services shall be subject to obligations to prevent, detect, report and remove online child sexual abuse on all their services, which may include those covered by end-to-end encryption, when there is a significant risk that their specific service is misused for online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment established in Article 3 of this Regulation. The technologies deployed to execute the detection order pursuant to Article 7 of this Regulation shall never prohibit or make encryption impossible and shall only be deployed after a prior authorization by the Coordinating Authority, in consultation with the competent data protection authority, and be subject to constant monitoring and auditing by the competent data protection authority to verify their compliance with Union law.
Amendment 1054 #
Proposal for a regulation Article 8 – title Additional rules regarding targeted detection orders
Amendment 1055 #
Proposal for a regulation Article 8 – title Additional rules regarding detection
Amendment 1056 #
Proposal for a regulation Article 8 – title Additional rules regarding detection
Amendment 1057 #
Proposal for a regulation Article 8 – paragraph 1 Amendment 1058 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1.
Amendment 1059 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 1060 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 1061 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 1062 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 1063 #
Proposal for a regulation Article 8 – paragraph 1 – introductory part 1. The competent judicial authority
Amendment 1064 #
Proposal for a regulation Article 8 – paragraph 1 – point a (a) information regarding the specific measures to be taken to execute the detection order, including the specific person or specific persons the detection must concern, the temporal scope, indicators to be used and the safeguards to be provided for, including the reporting requirements set pursuant to Article 9(3) and, where applicable, any additional safeguards as referred to in Article 7(8);
Amendment 1065 #
Proposal for a regulation Article 8 – paragraph 1 – point a (a) information regarding the measures to be taken to execute the detection order, including the indicators to be used and the safeguards to be provided for, including the reporting requirements set pursuant to Article 9(3)
Amendment 1066 #
Proposal for a regulation Article 8 – paragraph 1 – point a a (new) (aa) information, with respect to each device or user account, detailing the specific purpose and scope of the warrant, including the legal basis for the reasonable suspicion.
Amendment 1067 #
Proposal for a regulation Article 8 – paragraph 1 – point b (b) identification details of the competent judicial authority
Amendment 1068 #
Proposal for a regulation Article 8 – paragraph 1 – point b (b) identification details of the competent judicial authority
Amendment 1069 #
Proposal for a regulation Article 8 – paragraph 1 – point c a (new) (ca) the name of the user(s) for whom a targeted detection order has been issued, insofar as it is known, and digital aliases in use by the user(s).
Amendment 1070 #
Proposal for a regulation Article 8 – paragraph 1 – point d (d) the specific service in respect of which the targeted detection order is issued and, where applicable, the part or component of the service affected as referred to in Article 7(8);
Amendment 1071 #
Proposal for a regulation Article 8 – paragraph 1 – point e Amendment 1072 #
Proposal for a regulation Article 8 – paragraph 1 – point e Amendment 1073 #
Proposal for a regulation Article 8 – paragraph 1 – point e Amendment 1074 #
Proposal for a regulation Article 8 – paragraph 1 – point e (e) whether the targeted detection order issued concerns the dissemination of known or new child sexual abuse material
Amendment 1075 #
Proposal for a regulation Article 8 – paragraph 1 – point e a (new) (ea) the person or group of persons covered by the detection order and specifics of the suspicion of illegal activities;
Amendment 1076 #
Proposal for a regulation Article 8 – paragraph 1 – point f (f) the start date and the end date of the targeted detection order;
Amendment 1077 #
Proposal for a regulation Article 8 – paragraph 1 – point g (g) a sufficiently detailed
Amendment 1078 #
Proposal for a regulation Article 8 – paragraph 1 – point g (g) a sufficiently detailed statement of
Amendment 1079 #
Proposal for a regulation Article 8 – paragraph 1 – point g (g) a
Amendment 1080 #
Proposal for a regulation Article 8 – paragraph 1 – point h (h) the factual and legal grounds justifying the issuing of the order, and a reference to this Regulation as the legal basis for the detection order;
Amendment 1081 #
Proposal for a regulation Article 8 – paragraph 1 – point h (h) a reference to this Regulation as the legal basis for the targeted detection order;
Amendment 1082 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 1083 #
Proposal for a regulation Article 8 – paragraph 1 – point i (i) the date, time stamp and electronic signature of the judicial
Amendment 1084 #
Proposal for a regulation Article 8 – paragraph 1 – point j (j) easily understandable information about the redress available to the addressee of the targeted detection order, including information about redress to a court and about the time periods applicable to such redress.
Amendment 1085 #
Proposal for a regulation Article 8 – paragraph 2 Amendment 1086 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The co
Amendment 1087 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 1088 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 1 The competent judicial authority
Amendment 1089 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 2 The targeted detection order shall be transmitted to the provider’s point of contact referred to in Article 23(1), to the Coordinating Authority of establishment and to the EU Centre, through the system established in accordance with Article 39(2).
Amendment 1090 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 2 The detection order shall be securely transmitted to the provider’s point of contact referred to in Article 23(1), to the Coordinating Authority of establishment and to the EU Centre, through the system established in accordance with Article 39(2).
Amendment 1091 #
Proposal for a regulation Article 8 – paragraph 2 – subparagraph 3 The targeted detection order shall be drafted in the language declared by the provider pursuant to Article 23(3).
Amendment 1092 #
Proposal for a regulation Article 8 – paragraph 3 Amendment 1093 #
Proposal for a regulation Article 8 – paragraph 3 3. If the provider cannot execute the detection order because it contains
Amendment 1094 #
Proposal for a regulation Article 8 – paragraph 3 3. If the provider cannot execute the detection order because it contains manifest errors or does not contain sufficient information for its execution, the provider shall, without undue delay,
Amendment 1095 #
Proposal for a regulation Article 8 – paragraph 4 Amendment 1096 #
Proposal for a regulation Article 8 – paragraph 4 4. The Commission shall be empowered to adopt delegated acts in accordance with Article 86 in order to amend Annex
Amendment 1097 #
Proposal for a regulation Article 8 a (new) Article 8a Preservation of data in the context of detection orders
1. Detection orders may require the expedited preservation by the provider, insofar as the data is under their control, of one or more of the following data concerning the specific users against whom the detection order is directed, including new data generated after issuance of the order, as part of a planned or current criminal investigation:
(a) Traffic data:
(i) pseudonyms, screen names or other identifiers used by the subject(s) of the investigation;
(ii) network identifiers, such as IP addresses, port numbers or MAC addresses used by, or associated with, the subject(s) of the investigation;
(iii) any other traffic data, including metadata, of any activity linked to the subject(s) of the investigation;
(b) Content data:
(i) copies of any pictures or videos uploaded, downloaded or otherwise communicated by the subject(s) of the investigation.
2. Access to the data shall be made available to law enforcement authorities on the basis of the national law of the country of establishment of the provider.
3. The provider shall inform all users concerned of the order, unless the issuing authority instructs it, on the basis of a reasoned opinion, not to do so.
Amendment 1101 #
Proposal for a regulation Article 9 – title Redress, information, reporting and modification of targeted detection orders
Amendment 1102 #
Proposal for a regulation Article 9 – title Redress, information, reporting and modification of detection
Amendment 1103 #
Proposal for a regulation Article 9 – paragraph 1 Amendment 1104 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of number-independent interpersonal communications services that have received a detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the detection order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection order.
Amendment 1105 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of number-independent interpersonal communications services that have received a detection
Amendment 1106 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of interpersonal communications services that have received a targeted detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the targeted detection order before the courts of the Member State of the competent judicial authority
Amendment 1107 #
Proposal for a regulation Article 9 – paragraph 1 1. Providers of hosting services and providers of number-independent interpersonal communications services that have received a detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the detection order before the courts of the Member
Amendment 1108 #
Proposal for a regulation Article 9 – paragraph 2 Amendment 1109 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the detection order becomes final, the co
Amendment 1110 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the detection order becomes final, the competent judicial authority
Amendment 1111 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 1 When the targeted detection order becomes final, the competent judicial authority
Amendment 1112 #
Proposal for a regulation Article 9 – paragraph 2 – subparagraph 2 For the purpose of the first subparagraph, a targeted detection order shall become final upon the expiry of the time period for appeal where no appeal has been lodged in accordance with national law or upon confirmation of the targeted detection order following an appeal.
Amendment 1113 #
Proposal for a regulation Article 9 – paragraph 3 Amendment 1114 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 1 Where the period of application of the detection order exceeds 12 months,
Amendment 1115 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 1 Where the period of application of the detection order exceeds 12 months,
Amendment 1116 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 1 Where the period of application of the detection order exceeds
Amendment 1117 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 2 Those reports shall include a detailed description of the measures taken to execute the detection order, including the safeguards provided, and information on the functioning in practice of those measures,
Amendment 1118 #
Proposal for a regulation Article 9 – paragraph 3 – subparagraph 2 Those reports shall include a detailed description of the measures taken to execute the detection order, including the safeguards provided, and information on the functioning in practice of those measures, in particular on their effectiveness in detecting the dissemination of known or new child sexual abuse material
Amendment 1119 #
Proposal for a regulation Article 9 – paragraph 4 Amendment 1120 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the targeted detection orders that the competent judicial authority
Amendment 1121 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 1 In respect of the detection orders that the competent judicial authority
Amendment 1122 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 1123 #
Proposal for a regulation Article 9 – paragraph 4 – subparagraph 2 That Coordinating Authority shall request to the competent judicial authority
Amendment 1127 #
Proposal for a regulation Article 10 – paragraph 1 Amendment 1128 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of interpersonal communication services that have received a detection order concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
Amendment 1129 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of interpersonal communication services that have received a detection order or undertake voluntary detection measures in accordance with Article 4a, shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
Amendment 1130 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of number-independent interpersonal communication services that have received a detection
Amendment 1131 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of interpersonal communication services that have received a targeted detection order shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46 and with Article 6a.
Amendment 1132 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of interpersonal communication services that have received a detection order
Amendment 1133 #
Proposal for a regulation Article 10 – paragraph 1 1. Providers of hosting services and providers of number-independent interpersonal communication services that have received a detection
Amendment 1134 #
Proposal for a regulation Article 10 – paragraph 2
Amendment 1135 #
Proposal for a regulation Article 10 – paragraph 2 2. The provider shall be entitled to acquire, install and operate, free of charge, technologies specified in the orders and made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the detection order. The
Amendment 1136 #
Proposal for a regulation Article 10 – paragraph 2 2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the detection order and, where needed, of adopting the security measures imposed by Article 7(3)(a). The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.
Amendment 1137 #
Proposal for a regulation Article 10 – paragraph 2 2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU
Amendment 1138 #
Proposal for a regulation Article 10 – paragraph 2 2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the detection order or voluntary detection. The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.
Amendment 1139 #
Proposal for a regulation Article 10 – paragraph 2 2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the targeted detection order. The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.
Amendment 1140 #
Proposal for a regulation Article 10 – paragraph 2 2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the detection
Amendment 1141 #
Proposal for a regulation Article 10 – paragraph 3
Amendment 1142 #
Proposal for a regulation Article 10 – paragraph 3 – introductory part 3. The technologies specified in the detection orders shall
Amendment 1143 #
Proposal for a regulation Article 10 – paragraph 3 – point a (a) be effective in
Amendment 1144 #
Proposal for a regulation Article 10 – paragraph 3 – point a (a) effective in detecting the dissemination of known
Amendment 1145 #
Proposal for a regulation Article 10 – paragraph 3 – point a (a) effective in detecting the dissemination of known
Amendment 1146 #
Proposal for a regulation Article 10 – paragraph 3 – point b (b) not be able to extract or deduce the substance of the content of the communications, or any other information from the relevant communications, other than the information strictly necessary to detect, using the indicators referred to in paragraph 1, patterns pointing to the dissemination of
Amendment 1147 #
Proposal for a regulation Article 10 – paragraph 3 – point b (b) not be able to extract any other information from the relevant communications than the information strictly necessary to detect, using the indicators referred to in paragraph 1, patterns pointing to the dissemination of known
Amendment 1148 #
Proposal for a regulation Article 10 – paragraph 3 – point b (b) not be able to extract any other information from the relevant communications than the information strictly necessary to detect, using the indicators referred to in paragraph 1, patterns pointing to the dissemination of known
Amendment 1149 #
Proposal for a regulation Article 10 – paragraph 3 – point c (c) in accordance with the state of the art in the industry and the least intrusive in terms of the impact on the users’ rights to private and family life, including the confidentiality of communication, and to protection of personal data. It shall not weaken or undermine end-to-end encryption and shall not limit providers of information society services from providing their services applying end-to- end encryption;
Amendment 1150 #
Proposal for a regulation Article 10 – paragraph 3 – point c (c) be in accordance with the technological state of the art
Amendment 1151 #
(d) be sufficiently reliable, in that they limit to the maximum extent possible the rate of errors
Amendment 1152 #
Proposal for a regulation Article 10 – paragraph 3 – point d (d) sufficiently reliable, in that they limit to the maximum extent possible the rate of errors regarding the detection
Amendment 1153 #
Proposal for a regulation Article 10 – paragraph 3 – point d (d) sufficiently reliable, in that they limit to the maximum extent possible the rate of errors regarding the detection
Amendment 1154 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (da) for searching known child sexual abuse material, create a unique, non-reconvertible digital signature ('hash') of electronically communicated pictures or videos for the sole purpose of immediately comparing that hash with a database containing hashes of material previously reliably identified as child sexual abuse and exploitation material as provided by the EU Centre pursuant to Article 44(1);
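The hash-matching workflow this amendment describes can be sketched as follows. This is an illustrative sketch only, not the regulation's mandated method: deployed systems typically use perceptual hashes (such as PhotoDNA) rather than a cryptographic hash, and SHA-256 is used here purely to show the flow. All names and the sample database are hypothetical.

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Create a non-reconvertible digital signature ('hash') of the content.

    SHA-256 is a stand-in; real known-material matching usually relies on
    perceptual hashes that tolerate re-encoding and resizing.
    """
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes, known_hashes: set) -> bool:
    """Compare the content's hash against a database of hashes of material
    previously identified as abusive (cf. the Article 44(1) database)."""
    return content_hash(data) in known_hashes

# Hypothetical stand-in for the hash database provided by the EU Centre
known_hashes = {content_hash(b"previously identified material")}
```

Note that under this scheme only the hash is compared; the content itself is never stored in the database, which is why the amendment calls the signature "non-reconvertible".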
Amendment 1155 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (da) the technologies used to detect patterns of possible solicitation of children are limited to the use of relevant key indicators and objectively identified risk factors such as age difference and the likely involvement of a child in the scanned communication, without prejudice to the right to human review.
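The indicator-based approach this amendment describes, relying only on objective risk factors such as age difference and the likely involvement of a child, with a human reviewer as the final step, could be sketched as below. This is a hypothetical illustration under assumed inputs (reliable age information and indicator matches supplied elsewhere), not an actual detection system; the thresholds and field names are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Conversation:
    user_a_age: int          # assumed available from age assurance
    user_b_age: int
    key_indicator_hits: int  # matches against externally provided key indicators

def flag_for_human_review(c: Conversation,
                          min_age_gap: int = 5,
                          adult_age: int = 18) -> bool:
    """Flag a conversation only when a child is likely involved AND objective
    risk factors fire; a flag merely queues the case for human review."""
    child_involved = min(c.user_a_age, c.user_b_age) < adult_age
    large_age_gap = abs(c.user_a_age - c.user_b_age) >= min_age_gap
    return child_involved and large_age_gap and c.key_indicator_hits > 0
```

The point of the design is that no single signal suffices: absent a likely child participant or any indicator match, nothing is flagged, limiting intrusion into ordinary communications.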
Amendment 1156 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (da) effective in setting up a reliable age-based filter that verifies the age of users and effectively prevents the access of child users to websites subject to online child sexual abuse, and child sexual abuse offenses.
Amendment 1157 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (da) (e) focused on communications where there is an established suspicion of illegal activity and the technologies shall not lead to general monitoring of private communications;
Amendment 1158 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (da) ensure that the interference with the fundamental right to privacy and the other rights laid down in the Charter is limited to what is strictly necessary.
Amendment 1159 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (da) ensure that the interference with the fundamental right to privacy and the other rights laid down in the Charter is limited to what is strictly necessary
Amendment 1160 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (da) not be able to weaken end-to-end encryption or to lead to a general monitoring of private communications.
Amendment 1161 #
(da) not able to prohibit or make end-to-end encryption impossible.
Amendment 1162 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (da) not able to prohibit or make end-to-end encryption impossible.
Amendment 1163 #
Proposal for a regulation Article 10 – paragraph 3 – point d a (new) (da) not able to weaken end-to-end encryption.
Amendment 1164 #
Proposal for a regulation Article 10 – paragraph 3 – point d b (new) (db) ensure the processing is limited to what is strictly necessary for the purpose of detection, reporting and removal of child sexual abuse material and, unless child sexual abuse material has been detected and confirmed as such, the data is erased immediately;
Amendment 1165 #
Proposal for a regulation Article 10 – paragraph 3 – point d c (new) (dc) ensure the processing does not interfere with, weaken, or circumvent the security of encrypted communications, and only applies to unencrypted communications;
Amendment 1166 #
Proposal for a regulation Article 10 – paragraph 4
Amendment 1167 #
Proposal for a regulation Article 10 – paragraph 4
Amendment 1168 #
Proposal for a regulation Article 10 – paragraph 4 – introductory part 4. The
Amendment 1169 #
Proposal for a regulation Article 10 – paragraph 4 – point -a (new) (-a) ensure privacy by design and safety-by-design and by default and, where applicable, the protection of encryption.
Amendment 1170 #
Proposal for a regulation Article 10 – paragraph 4 – point a (a) request, in respect of any specific technology used for the purpose set out in this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679, and request a mandatory prior consultation procedure as referred to in Article 36 of that Regulation have been conducted and take all the necessary measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known
Amendment 1171 #
Proposal for a regulation Article 10 – paragraph 4 – point a (a) take all the necessary measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly necessary to use voluntary measures, when authorised, or execut
Amendment 1172 #
Proposal for a regulation Article 10 – paragraph 4 – point a (a) take all the necessary measures to ensure that the technologies specified in detection orders and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of
Amendment 1173 #
Proposal for a regulation Article 10 – paragraph 4 – point a (a) take all the necessary measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material
Amendment 1174 #
Proposal for a regulation Article 10 – paragraph 4 – point a a (new) (aa) ensure privacy by design and safety-by-design and by default and, where applicable, the protection of encryption.
Amendment 1175 #
Proposal for a regulation Article 10 – paragraph 4 – point b (b)
Amendment 1176 #
Proposal for a regulation Article 10 – paragraph 4 – point b (b) establish effective internal procedures to prevent and, where necessary, detect and remedy any misuse of the technologies, indicators and personal data and other data referred to in point (a), including unauthori
Amendment 1177 #
Proposal for a regulation Article 10 – paragraph 4 – point c (c) ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, human intervention; ensure that any malfunctions or defects in the technologies used are remedied within a matter of hours;
Amendment 1178 #
Proposal for a regulation Article 10 – paragraph 4 – point c (c) include in detection orders specific obligations on providers ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors
Amendment 1179 #
(c) ensure
Amendment 1180 #
Proposal for a regulation Article 10 – paragraph 4 – point c (c) ensure
Amendment 1181 #
Proposal for a regulation Article 10 – paragraph 4 – point d (d) establish and operate an accessible,
Amendment 1182 #
Proposal for a regulation Article 10 – paragraph 4 – point d (d) establish and operate an accessible, age-appropriate and user- and child-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
Amendment 1183 #
Proposal for a regulation Article 10 – paragraph 4 – point d (d) establish and operate an accessible, age-appropriate and user- and child-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
Amendment 1184 #
Proposal for a regulation Article 10 – paragraph 4 – point e (e) inform the Coordinating Authority and competent Data Protection Authority, at the latest one month before the start date specified in the detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3);
Amendment 1185 #
Proposal for a regulation Article 10 – paragraph 4 – point e (e) inform the Coordinating Authority and Data Protection Authorities, at the latest one month before the start date specified in the detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3);
Amendment 1186 #
Proposal for a regulation Article 10 – paragraph 4 – point e (e) inform the Coordinating Authority, at the latest one month before the start date specified in the targeted detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3);
Amendment 1187 #
Proposal for a regulation Article 10 – paragraph 4 – point e a (new) (ea) request in respect of any specific technology used for the purpose set out in this Article, a prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679, and request a prior consultation procedure as referred to in Article 36 of that Regulation have been conducted;
Amendment 1188 #
Proposal for a regulation Article 10 – paragraph 4 – point f a (new) (fa) ensure privacy without hampering the integrity of encryption and without leading to a general monitoring of private communications.
Amendment 1189 #
Proposal for a regulation Article 10 – paragraph 4 – point f a (new) (fa) ensure privacy by design and by default and, where applicable, without hampering the integrity of encryption.
Amendment 1190 #
Proposal for a regulation Article 10 – paragraph 4 a (new) 4a. in respect of any specific technology used for the purpose set out in this Article, conduct a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation;
Amendment 1191 #
Proposal for a regulation Article 10 – paragraph 5
Amendment 1192 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point a (a) the fact that it operates technologies to detect
Amendment 1193 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point a (a) the fact that it operates technologies to detect
Amendment 1194 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point a (a) the fact that it operates technologies to detect online child sexual abuse to execute the detection order, the ways in which it operates those technologies and the impact on the confidentiality of users’ communications and on personal data protection;
Amendment 1195 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point a (a) the fact that it operates technologies to detect online child sexual abuse to execute the targeted detection order, the ways in which it operates those technologies and the impact on the confidentiality of users’ communications;
Amendment 1196 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point a (a) the fact that it operates technologies to detect
Amendment 1197 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point b
Amendment 1198 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 1 – point c (c) the users’ right of judicial redress referred to in Article 9(1) and their rights to submit complaints to the provider through the mechanism referred to in paragraph 4, point (d) and to the Data Protection Authority and Coordinating Authority in accordance with Article 34.
Amendment 1199 #
Proposal for a regulation Article 10 – paragraph 5 – subparagraph 2 The provider shall not provide information to users that may reduce the effectiveness of the measures to execute the targeted detection order, notwithstanding Article 6a and general advice on confidential communication.
Amendment 1200 #
Proposal for a regulation Article 10 – paragraph 6
Amendment 1201 #
Proposal for a regulation Article 10 – paragraph 6
Amendment 1202 #
Proposal for a regulation Article 10 – paragraph 6 6. Where a provider detects potential online child sexual abuse through the
Amendment 1203 #
Proposal for a regulation Article 10 – paragraph 6 6. Where a provider detects potential
Amendment 1204 #
Proposal for a regulation Article 10 a (new) Article 10a Safeguarding end-to-end encryption The integrity of end-to-end encryption services must be safeguarded. The detection obligations set out in this section shall therefore not apply to end-to-end encryption services. This includes, inter alia, no possibility within end-to-end encryption technology to build in so-called ‘backdoors’, i.e. client-side scanning with side-channel leaks, which could weaken the end-to-end encryption and allow a third party to gain access to private data. Client-side scanning, whereby a message is scanned twice, on sending and on receiving, threatens the integrity and privacy of users. Such ‘backdoors’ shall not be built into end-to-end encryption in the pursuit of enforcing this Regulation.
Amendment 1205 #
Proposal for a regulation Article 11
Amendment 1206 #
Proposal for a regulation Article 11
Amendment 1207 #
Proposal for a regulation Article 11 – title
Amendment 1208 #
Amendment 1209 #
Proposal for a regulation Article 11 – paragraph 1 The Commission, in cooperation with the European Data Protection Board, Fundamental Rights Agency, Coordinating Authorities and the EU Centre and after having conducted a public consultation,
Amendment 1210 #
Proposal for a regulation Article 11 – paragraph 1 The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of Articles 7 to 10, having due regard in particular to relevant technological developments and trends reported by law enforcement, hotlines and civil society and the manners in which the services covered by those provisions are offered and used.
Amendment 1211 #
Proposal for a regulation Article 11 – paragraph 1 The Commission, in cooperation with the Coordinating Authorities, and the EU Centre, after having consulted the European Data Protection Board and after having conducted a public consultation, may issue
Amendment 1212 #
Proposal for a regulation Article 11 – paragraph 1 The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having consulted the European Data Protection Board and having conducted a public consultation, may issue guidelines on the application of Articles 7 to 10, having due regard in particular to relevant technological developments and the manners in which the services covered by those provisions are offered and used.
Amendment 1213 #
Proposal for a regulation Chapter II – Section 3 – title
Reporting and removal obligations
Amendment 1214 #
Reporting and removal obligations
Amendment 1215 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of number- independent interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of a
Amendment 1216 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of publicly available number-independent interpersonal
Amendment 1217 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of number- independent interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of any information indicating potential
Amendment 1218 #
Proposal for a regulation Article 12 – paragraph 1 1. Where a provider of hosting services or a provider of number-independent interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of any information indicating potential online child sexual abuse on its services, it shall promptly submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
Amendment 1219 #
Proposal for a regulation Article 12 – paragraph 1 a (new) 1a. Where a provider of hosting services has actual knowledge of online child sexual abuse material on its services and of its unlawful nature, it shall expeditiously remove or disable access to it in all Member States.
Amendment 1220 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 1 Where the provider submits a report pursuant to paragraph 1, it shall
Amendment 1221 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 1 Where the provider submits a report pursuant to paragraph 1, it shall inform the user concerned, providing information on the main content of the report, on the
Amendment 1222 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 2
Amendment 1223 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 2 The provider shall inform the user concerned without undue delay
Amendment 1224 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 2 The provider shall inform the user concerned without undue delay, either after having received a communication from the EU Centre indicating that it considers the report to be
Amendment 1225 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 3
Amendment 1226 #
Proposal for a regulation Article 12 – paragraph 2 – subparagraph 3
Amendment 1227 #
Proposal for a regulation Article 12 – paragraph 2 a (new) 2a. The EU Centre shall coordinate with the relevant competent authority the requests it receives for the exercise of individuals’ rights of access, rectification and deletion in relation to personal data processed pursuant to this Regulation.
Amendment 1228 #
Proposal for a regulation Article 12 – paragraph 3
Amendment 1229 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider
Amendment 1230 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible, age-appropriate, child-friendly and user-friendly mechanism, including self-reporting tools, that allows users to flag or notify to the provider potential online child sexual abuse on the services. Those mechanisms shall allow for anonymous reporting already available through anonymous reporting channels as defined by Directive (EU) 2019/1937.
Amendment 1231 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible,
Amendment 1232 #
Proposal for a regulation Article 12 – paragraph 3 3. The provider shall establish and operate an accessible, age-appropriate and child- and user-friendly mechanism that allows users to flag to the provider potential online child sexual abuse on the service.
Amendment 1233 #
Proposal for a regulation Article 13 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in
Amendment 1234 #
Proposal for a regulation Article 13 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 1235 #
Proposal for a regulation Article 13 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 1236 #
Proposal for a regulation Article 13 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 1237 #
Proposal for a regulation Article 13 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 1238 #
Proposal for a regulation Article 13 – paragraph 1 – point a a (new) (aa) where applicable, an exact uniform resource locator and, where necessary, additional information for the identification of the child sexual abuse material
Amendment 1239 #
Proposal for a regulation Article 13 – paragraph 1 – point a b (new) (ab) the specific technology that enabled the provider to become aware of the potential online child sexual abuse;
Amendment 1240 #
Proposal for a regulation Article 13 – paragraph 1 – point c (c)
Amendment 1241 #
Proposal for a regulation Article 13 – paragraph 1 – point c (c)
Amendment 1242 #
Proposal for a regulation Article 13 – paragraph 1 – point c (c) all content data
Amendment 1243 #
Proposal for a regulation Article 13 – paragraph 1 – point c a (new) (ca) where applicable, an exact uniform resource locator and, where necessary, additional information for the identification of the child sexual abuse material;
Amendment 1244 #
Proposal for a regulation Article 13 – paragraph 1 – point c a (new) (ca) information on the reporting mechanism or specific technology used to detect the content;
Amendment 1245 #
Proposal for a regulation Article 13 – paragraph 1 – point d
Amendment 1246 #
Proposal for a regulation Article 13 – paragraph 1 – point d
Amendment 1247 #
Proposal for a regulation Article 13 – paragraph 1 – point d (d)
Amendment 1248 #
Proposal for a regulation Article 13 – paragraph 1 – point d (d)
Amendment 1249 #
Proposal for a regulation Article 13 – paragraph 1 – point d a (new) (da) a list of all traffic data and metadata retained in relation to the potential online child sexual abuse, which could be made available to law enforcement authorities, together with information concerning default retention periods.
Amendment 1250 #
Proposal for a regulation Article 13 – paragraph 1 – point e
Amendment 1251 #
Proposal for a regulation Article 13 – paragraph 1 – point e
Amendment 1252 #
Proposal for a regulation Article 13 – paragraph 1 – point e (e) whether the potential online child sexual abuse to their knowledge concerns the dissemination of known or new child sexual abuse material or the solicitation of children;
Amendment 1253 #
Proposal for a regulation Article 13 – paragraph 1 – point f
Amendment 1254 #
Proposal for a regulation Article 13 – paragraph 1 – point f
Amendment 1255 #
Proposal for a regulation Article 13 – paragraph 1 – point f (f) information concerning the apparent geographic location related to the potential online child sexual abuse, such as the Internet Protocol address;
Amendment 1256 #
Proposal for a regulation Article 13 – paragraph 1 – point f (f) information concerning the geographic location related to the
Amendment 1257 #
Proposal for a regulation Article 13 – paragraph 1 – point g (g) a list of available information
Amendment 1258 #
Proposal for a regulation Article 13 – paragraph 1 – point g (g) information concerning the identity of any user involved in the
Amendment 1259 #
Proposal for a regulation Article 13 – paragraph 1 – point g a (new) (ga) whether the provider considers that the report involves an imminent threat to the life or safety of a child or requires urgent action;
Amendment 1260 #
Proposal for a regulation Article 13 – paragraph 1 – point h (h) whether the provider has also reported, or will also report, the
Amendment 1261 #
Proposal for a regulation Article 13 – paragraph 1 – point i (i) where the
Amendment 1262 #
Proposal for a regulation Article 13 – paragraph 1 – point i (i) where the
Amendment 1263 #
Proposal for a regulation Article 13 – paragraph 1 – point i a (new) (ia) information on the specific technology that enabled the provider to become aware of the relevant abusive content, in case the provider became aware of the potential child sexual abuse following measures taken to execute a detection order issued in accordance with Article 7 of the Proposal.
Amendment 1264 #
Proposal for a regulation Article 13 – paragraph 1 – point j (j) whether the provider considers that the report is indicative of an imminent threat to the life or safety of a child or otherwise requires urgent action;
Amendment 1265 #
Proposal for a regulation Article 13 – paragraph 1 – point j a (new) (ja) information on the tools used by the provider to become aware of the reported online child sexual abuse, including data and aggregate statistics on how technologies used by the provider work;
Amendment 1266 #
Proposal for a regulation Article 14 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 1267 #
Proposal for a regulation Article 14 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 1268 #
Proposal for a regulation Article 14 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 1269 #
Proposal for a regulation Article 14 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 1270 #
Proposal for a regulation Article 14 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove
Amendment 1271 #
Proposal for a regulation Article 14 – paragraph 2 2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium enterprises, including open source providers, the removal order shall allow additional time, proportionate to the size and the resources of the provider, but in any case no longer than 3 days.
Amendment 1272 #
Proposal for a regulation Article 14 – paragraph 2 2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium enterprises, including open source providers, the removal order shall allow additional time, proportionate to the size and the resources of the provider.
Amendment 1273 #
Proposal for a regulation Article 14 – paragraph 2 2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium enterprises, including open source providers, the removal order shall allow additional time, proportionate to the size and the resources of the provider.
Amendment 1274 #
Proposal for a regulation Article 14 – paragraph 2 2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium enterprises, including open source providers, the removal order shall allow additional time, proportionate to the size and the resources of the provider.
Amendment 1275 #
Proposal for a regulation Article 14 – paragraph 3 – introductory part 3. The competent judicial authority
Amendment 1276 #
Proposal for a regulation Article 14 – paragraph 3 – point a (a) identification details of the judicial
Amendment 1277 #
Proposal for a regulation Article 14 – paragraph 3 – point c Amendment 1278 #
Proposal for a regulation Article 14 – paragraph 3 – point g (g) a reference to Article 14 of this Regulation as the legal basis for the removal order;
Amendment 1279 #
Proposal for a regulation Article 14 – paragraph 3 – point h (h) the date, time stamp and electronic signature of the judicial
Amendment 1280 #
Proposal for a regulation Article 14 – paragraph 4 – subparagraph 1 The judicial authority
Amendment 1281 #
Proposal for a regulation Article 14 – paragraph 5 – subparagraph 1 If the provider cannot execute the removal order on grounds of force majeure or de facto impossibility
Amendment 1282 #
Proposal for a regulation Article 14 – paragraph 5 a (new) 5a. If the provider considers that the removal order has not been issued in accordance with this Article, or is manifestly abusive, it shall refuse to execute the order and provide a reasoned justification to the Coordinating Authority that issued the order.
Amendment 1283 #
Proposal for a regulation Article 14 – paragraph 7 7. The provider shall, without undue delay and using the template set out in Annex VI, inform the Coordinating Authority of establishment and the EU Centre, of the measures taken to execute the removal order, indicating, in particular, whether the provider removed the child sexual abuse material
Amendment 1284 #
Proposal for a regulation Article 14 – paragraph 8 a (new) 8a. Where Europol or a national authority becomes aware of the presence of child sexual abuse material on a hosting service, it shall notify the Coordinating Authority of its exact uniform resource locator, and the Coordinating Authority shall request a removal order where the conditions of paragraph 1 are met.
Amendment 1285 #
Proposal for a regulation Article 15 – paragraph 1 1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to an effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority
Amendment 1286 #
Proposal for a regulation Article 15 – paragraph 1 1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to an effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority
Amendment 1287 #
Proposal for a regulation Article 15 – paragraph 2 – subparagraph 1 When the removal order becomes final, the competent judicial authority
Amendment 1288 #
Proposal for a regulation Article 15 – paragraph 3 – point a (a) the fact that it removed the material
Amendment 1289 #
Proposal for a regulation Article 15 – paragraph 3 – point b (b) the reasons for the removal
Amendment 1290 #
Proposal for a regulation Article 15 – paragraph 3 – point b (b) the reasons for the removal or disabling, providing a copy of the removal order
Amendment 1291 #
Amendment 1292 #
Proposal for a regulation Article 15 – paragraph 4 – subparagraph 1 The Coordinating Authority of establishment may request, when requesting the judicial authority or independent administrative authority to issue the removal order, and after having consulted with relevant public authorities, that the provider is not to disclose any information regarding the removal of
Amendment 1296 #
Amendment 1301 #
Proposal for a regulation Article 16 – paragraph 1 Amendment 1302 #
Proposal for a regulation Article 16 – paragraph 2 Amendment 1303 #
Proposal for a regulation Article 16 – paragraph 3 Amendment 1304 #
Proposal for a regulation Article 16 – paragraph 4 Amendment 1305 #
Proposal for a regulation Article 16 – paragraph 5 Amendment 1306 #
Proposal for a regulation Article 16 – paragraph 6 Amendment 1307 #
Proposal for a regulation Article 16 – paragraph 7 Amendment 1313 #
Proposal for a regulation Article 17 – paragraph 1 Amendment 1314 #
Proposal for a regulation Article 17 – paragraph 1 – point d (d) the specific service in respect of which the targeted detection order is issued;
Amendment 1315 #
Proposal for a regulation Article 17 – paragraph 2 Amendment 1316 #
Proposal for a regulation Article 17 – paragraph 3 Amendment 1317 #
Proposal for a regulation Article 17 – paragraph 4 Amendment 1318 #
Proposal for a regulation Article 17 – paragraph 5 Amendment 1319 #
Proposal for a regulation Article 17 – paragraph 6 Amendment 1325 #
Proposal for a regulation Article 19 Amendment 1327 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation,
Amendment 1328 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services and hotlines shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with those requirements.
Amendment 1329 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing
Amendment 1330 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing
Amendment 1331 #
Proposal for a regulation Article 19 – paragraph 1 Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing,
Amendment 1332 #
Proposal for a regulation Article 19 a (new) Article 19a Respect for Privacy Nothing in this Regulation shall be interpreted as a requirement to 1. break cryptography; 2. scan content on users’ devices; 3. restrict anonymous access to online services and software applications.
Amendment 1333 #
Proposal for a regulation Article 20 – title 20
Amendment 1334 #
Proposal for a regulation Article 20 – title Victims’ right to information and support
Amendment 1336 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Amendment 1337 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Amendment 1338 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where they reside, information regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12. Persons with disabilities shall have the right to ask and receive such information in a manner accessible to them. The information shall be provided to the persons requesting it in a confidential, easily understandable and accessible manner.
Amendment 1339 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Amendment 1340 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Amendment 1341 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 Amendment 1342 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 1 a (new) The Coordinating Authority shall ensure that survivors, including child survivors and parents of child survivors, are informed about survivor support services where the survivors can receive age-appropriate and gender-sensitive information and support.
Amendment 1343 #
Proposal for a regulation Article 20 – paragraph 1 – subparagraph 2 That Coordinating Authority shall transmit the request to the EU Centre through the
Amendment 1344 #
Proposal for a regulation Article 20 – paragraph 1 a (new) 1a. Victims of child sexual abuse or their representatives and persons living in the Union shall have the right to receive, upon their request, from the Coordinating Authority information regarding victim’s rights, support and assistance. The information shall be age-appropriate, accessible and gender-sensitive and shall include at a minimum: (a) the type of support they can obtain and from whom, including, where relevant, basic information about access to medical support, any specialist support, including psychological or social support, and alternative accommodation; (b) the procedures for making complaints with regard to a criminal offence and their role in connection with such procedures; (c) how and under what conditions they can obtain protection, including protection measures; (d) how and under what conditions they can access legal advice, legal aid and any other sort of advice; (e) how and under what conditions they can access compensation; (f) how and under what conditions they are entitled to interpretation and translation.
Amendment 1345 #
Proposal for a regulation Article 20 – paragraph 1 b (new) 1b. In case a victim or victim representative indicates the preference for a periodic request, the Coordinating Authority shall submit, without delay, the information referred to in paragraph 3 proactively to the requester after the first submitted reply, in any new instances of reports referred to in paragraph 1, on a weekly basis. Victims or victim representatives may terminate the periodic request at any time by notifying the Coordinating Authority in question.
Amendment 1346 #
Proposal for a regulation Article 20 – paragraph 2 – point b (b) where applicable, the individual or entity
Amendment 1347 #
Proposal for a regulation Article 20 – paragraph 2 – point c (c) sufficient elements to
Amendment 1348 #
Proposal for a regulation Article 20 – paragraph 2 – point c a (new) (ca) an indication if the request is occasional or covers a certain time period.
Amendment 1349 #
Proposal for a regulation Article 20 – paragraph 3 – point d (d) whether the provider reported having removed or disabled access to the material, in accordance with Article 13(1),
Amendment 1350 #
Proposal for a regulation Article 20 – paragraph 3 – point d (d) whether the provider reported having removed
Amendment 1351 #
Proposal for a regulation Article 20 – paragraph 3 – point d a (new) (da) information regarding age-appropriate and gender-sensitive survivor support services to provide the child, family and survivors with adequate emotional and psychosocial support as well as practical and legal assistance.
Amendment 1352 #
Proposal for a regulation Article 20 – paragraph 3 – point d a (new) (da) if there were appeals to such removal, and in that case, all related information
Amendment 1353 #
Proposal for a regulation Article 20 – paragraph 3 – point d b (new) (db) relevant age-appropriate, accessible and gender-sensitive information on victim support and assistance in the victim’s region.
Amendment 1354 #
Proposal for a regulation Article 21 – title Amendment 1355 #
Proposal for a regulation Article 21 – title Amendment 1356 #
Proposal for a regulation Article 21 – paragraph 1 Amendment 1357 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known or new child sexual abuse material depicting them removed or to have access
Amendment 1358 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide
Amendment 1359 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of
Amendment 1360 #
Proposal for a regulation Article 21 – paragraph 1 1. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed
Amendment 1361 #
Proposal for a regulation Article 21 – paragraph 1 a (new) 1a. Each Member State shall ensure the functioning of hotlines, including through funding and capacity building, in order for victims and their families to receive support from the competent authority in a timely manner.
Amendment 1362 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Amendment 1363 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Amendment 1364 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services
Amendment 1365 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Amendment 1366 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services remove
Amendment 1367 #
Proposal for a regulation Article 21 – paragraph 2 – subparagraph 1 Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services remove
Amendment 1368 #
Proposal for a regulation Article 21 – paragraph 2 – point 1 (new) (1) The Member States shall provide for a support fund for victims of abuse. The fund shall provide legal assistance and shall be activated only once the EU Centre has proved an effective violation within the meaning of Article 1 of this Regulation.
Amendment 1369 #
Proposal for a regulation Article 21 – paragraph 3 3. The requests referred to in paragraphs 1 and 2 shall indicate the relevant item or items of child sexual abuse material and any other relevant information.
Amendment 1370 #
Proposal for a regulation Article 21 – paragraph 4 – point b (b) verifying whether the provider removed
Amendment 1371 #
Proposal for a regulation Article 21 – paragraph 4 – point b (b) verifying whether the provider removed
Amendment 1372 #
Proposal for a regulation Article 21 – paragraph 4 – point d (d) where necessary, informing the Coordinating Authority of establishment of the presence of that item or those items on the provider's service, with a view to the issuance of a removal order pursuant to Article 14
Amendment 1373 #
Proposal for a regulation Article 21 – paragraph 4 – point d a (new) (da) information regarding victim’s rights, assistance and support pursuant to Article 21.
Amendment 1374 #
Proposal for a regulation Article 21 a (new) Article 21a Right to lodge a complaint with a supervisory authority 1. Without prejudice to any other administrative or judicial remedy, every user shall have the right to lodge a complaint with a supervisory authority, in particular in the Member State of his or her habitual residence, place of work or place of the alleged infringement if the user considers that the processing of personal data relating to him or her infringes this Regulation or Regulation (EU) 2016/679. 2. The supervisory authority with which the complaint has been lodged shall inform the complainant on the progress and the outcome of the complaint including the possibility of a judicial remedy pursuant to Article 21b.
Amendment 1375 #
Proposal for a regulation Article 21 b (new) Article 21b Right to an effective judicial remedy against a provider of hosting services or a provider of a number-independent interpersonal communications service 1. Without prejudice to any available administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority pursuant to Article 21a, each user shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation or Regulation (EU) 2016/679. 2. Proceedings against a provider of a hosting service or a provider of a number-independent interpersonal communications service shall be brought before the courts of the Member State where the provider has an establishment. Alternatively, such proceedings may be brought before the courts of the Member State where the user has his or her habitual residence.
Amendment 1376 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 1 – introductory part Providers of hosting services and providers of number-independent interpersonal communications services shall preserve the content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
Amendment 1377 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 1 – introductory part Providers of hosting services and providers
Amendment 1378 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 1 – introductory part Providers of hosting services and providers of number-independent interpersonal communications services shall preserve the content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
Amendment 1379 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 1 – introductory part Providers of hosting services and providers of number-independent interpersonal communications services
Amendment 1380 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 1 – point a (a) executing a
Amendment 1381 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 1 – point c Amendment 1382 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 1 – point e (e) responding to requests issued by competent law enforcement authorities and judicial authorities in accordance with the applicable law, with a view to providing them with the necessary information for the prevention, detection, investigation or prosecution of child sexual abuse offences, insofar as the content data and other data relate to a report that the provider has submitted to the EU Centre pursuant to Article 12. All such requests shall be logged.
Amendment 1383 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 2 Amendment 1384 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 2 Amendment 1385 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 2 Amendment 1386 #
Proposal for a regulation Article 22 – paragraph 1 – subparagraph 2 As regards the first subparagraph, point (a), the provider
Amendment 1387 #
Proposal for a regulation Article 22 – paragraph 2 – subparagraph 1 Providers shall securely preserve the information referred to in paragraph 1 for no longer than necessary for the applicable purpose and, in any event, no longer than 12 months from the date of the reporting or of the removal or disabling of access, whichever occurs first.
Amendment 1388 #
Proposal for a regulation Article 22 – paragraph 2 – subparagraph 1 Providers shall preserve the information referred to in paragraph 1 for no longer than necessary for the applicable purpose and, in any event, no longer than 12 months from the date of the reporting or of the removal
Amendment 1389 #
Proposal for a regulation Article 22 – paragraph 2 – subparagraph 3 Providers shall ensure that the information referred to in paragraph 1 is preserved in a
Amendment 1390 #
Proposal for a regulation Article 22 – paragraph 2 – subparagraph 3 Providers shall ensure that the information referred to in paragraph 1 is preserved in a secure manner and that the preservation is subject to appropriate technical and organisational safeguards. Those safeguards shall ensure, in particular, that the information can be accessed and processed only for the purpose for which it is preserved, that a high level of security is achieved, all access to the data is logged, and that the information is deleted upon the expiry of the applicable time periods for preservation. Providers shall regularly review those safeguards and adjust them where necessary.
Amendment 1391 #
Proposal for a regulation Article 23 – paragraph 1 1.
Amendment 1392 #
Proposal for a regulation Article 24 – paragraph 3 3. The provider shall mandate its legal representatives to be addressed in addition to or instead of the provider by the Coordinating Authorities, other competent authorities of the Member States and the Commission on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation
Amendment 1393 #
Proposal for a regulation Article 24 a (new) Article 24a Anonymous public reporting of online child sexual abuse 1. Member States shall ensure that the public has the possibility to anonymously report child sexual abuse material and child sexual exploitation activities to recognised non-governmental organisations specialised in combatting online child sexual abuse material. 2. Member States shall ensure that hotlines operating in their territory are authorised to view, assess and process anonymous reports of child sexual abuse material. 3. Member States shall grant hotlines the authority to issue content removal notices for confirmed instances of child sexual abuse material. 4. Member States shall authorise hotlines to voluntarily conduct pro-active searching for child sexual abuse material online.
Amendment 1394 #
Proposal for a regulation Article 25 – paragraph 1 1. Member States shall, by [Date - two months from the date of entry into force of this Regulation], designate one or more competent authorities as responsible for the application and enforcement of this Regulation and to the achievement of the objective of this Regulation and enforcement of Directive 2011/93/EU (‘competent authorities’).
Amendment 1395 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 1 Where Member States
Amendment 1396 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 2 The Coordinating Authority shall be responsible for all matters related to application and enforcement of this Regulation in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities. The Coordinating Authority shall also be responsible for the coordination and adaptation of prevention techniques, elaborated by the EU Centre. The Coordinating Authority shall issue recommendations and good practices on improving digital skills and competences, including media literacy, amongst the population through the realization of awareness campaigns on a national level, targeting in particular parents and children on the detection and prevention of child sexual abuse online.
Amendment 1397 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 2 The Coordinating Authority shall be responsible for all matters related to application and enforcement of this Regulation, and to the achievement of the objective of this Regulation and enforcement of Directive 2011/93/EU in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities.
Amendment 1398 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 3 The Coordinating Authority shall in any event be responsible for ensuring coordination and overseeing the implementation at national level in respect of those matters, including issues related to prevention, education and awareness raising and the organisation of regular training activities for officials, including in law enforcement authorities who deal with cases which involve children, and for contributing to the effective, efficient and consistent application and enforcement of this Regulation throughout the Union.
Amendment 1399 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 3 The Coordinating Authority shall in any event be responsible for ensuring coordination at national level in respect of those matters including issues related to prevention, education and awareness raising and the organisation of regular training activities for officials, including in law enforcement authorities who deal with cases which involve children and for contributing to the effective, efficient and consistent application and enforcement of this Regulation throughout the Union.
Amendment 1400 #
Proposal for a regulation Article 25 – paragraph 2 – subparagraph 3 The Coordinating Authority shall in any event be responsible for ensuring coordination at national level in respect of those matters, including matters related to prevention, and for contributing to the effective, efficient and consistent application and enforcement of this Regulation and Directive 2011/93/EU throughout the Union.
Amendment 1401 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a sufficiently staffed contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters
Amendment 1402 #
Proposal for a regulation Article 25 – paragraph 5 5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement
Amendment 1403 #
Proposal for a regulation Article 25 – paragraph 6 6. Within two weeks after the designation of the Coordinating Authorities pursuant to paragraph 2, the EU Centre shall set up an online public register listing the Coordinating Authorities and their contact points. The EU Centre shall regularly publish any modification thereto.
Amendment 1404 #
Proposal for a regulation Article 25 – paragraph 7 – point a Amendment 1405 #
Proposal for a regulation Article 25 – paragraph 7 – point a (a) provide certain information o
Amendment 1406 #
Proposal for a regulation Article 25 – paragraph 7 – point a a (new) (aa) provide information and expertise on gender-sensitive and age-appropriate victim support and prevention of online child sexual abuse.
Amendment 1407 #
Proposal for a regulation Article 25 – paragraph 7 – point b Amendment 1408 #
Proposal for a regulation Article 25 – paragraph 7 – point b (b) assist in assessing, in accordance with Article 5(2), the risk assessment conducted or updated or the mitigation measures taken by a provider of hosting or number-independent interpersonal communication services under the jurisdiction of the Member State that designated the requesting Coordinating Authority;
Amendment 1409 #
Proposal for a regulation Article 25 – paragraph 7 – point c Amendment 1410 #
Proposal for a regulation Article 25 – paragraph 7 – point c (c) verify the possible need to request competent national authorities to issue a
Amendment 1411 #
Proposal for a regulation Article 25 – paragraph 7 – point c (c) verify the possible need to request competent national authorities to issue a detection
Amendment 1412 #
Proposal for a regulation Article 25 – paragraph 7 – point c (c) verify the possible need to request competent national authorities to issue a detection order, a removal order
Amendment 1413 #
Proposal for a regulation Article 25 – paragraph 7 – point d Amendment 1414 #
Proposal for a regulation Article 25 – paragraph 7 – point d (d) verify the effectiveness of a
Amendment 1415 #
Proposal for a regulation Article 25 – paragraph 7 – point d a (new) (da) provide knowledge and expertise on appropriate prevention techniques tailored by age and gender against online solicitation of children and the dissemination of child sexual abuse material online.
Amendment 1416 #
Proposal for a regulation Article 25 – paragraph 8 8. The EU Centre shall provide such assistance free of charge and in accordance with its tasks and obligations under this Regulation
Amendment 1417 #
Proposal for a regulation Article 25 – paragraph 8 a (new) 8a. The EU Centre shall support Member States in designing preventive and gender-sensitive measures, such as awareness-raising campaigns to combat child sexual abuse, guaranteeing comprehensive sexuality and relationships education in all schools, introducing digital skills, literacy and online safety programmes in formal education, ensuring the full availability of specialized support services tailored by gender and age for child survivors of sexual abuse and children in vulnerable situations.
Amendment 1418 #
Proposal for a regulation Article 25 – paragraph 9 a (new) 9a. In its contact with survivors or in any decision affecting survivors, the Coordinating Authority shall operate in an age-appropriate and gender-sensitive way that minimises risks to survivors, especially children, addresses harm of survivors and meets their needs. It shall operate in a victim and gender sensitive manner which prioritises recognising and listening to the survivor, avoids secondary victimisation and retraumatisation, and systematically focuses on their safety, rights, well-being, expressed needs and choices, and ensures they are treated in an empathetic, sensitive and non- judgmental way.
Amendment 1419 #
Proposal for a regulation Article 25 – paragraph 9 a (new) 9a. In its engagement with victims and survivors or in any decision affecting victims and survivors, the Coordinating Authority shall operate in a way that minimises risks to victims and survivors, especially children.
Amendment 1420 #
Proposal for a regulation Article 25 a (new) Article 25a Cooperation with third parties Where necessary for the performance of its tasks under this Regulation, including the achievement of the objective of this Regulation, and in order to promote the generation and sharing of knowledge in line with Article 43(6), the Coordinating Authority shall cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations and semi-public organisations and practitioners.
Amendment 1421 #
Proposal for a regulation Article 26 – paragraph 1 1. Member States shall ensure that the Coordinating Authorities that they designated perform their tasks under this Regulation in an objective, impartial, transparent and timely manner, while fully respecting
Amendment 1422 #
Proposal for a regulation Article 26 – paragraph 2 – point c (c) are free from any undue external influence, whether direct or indirect, it being understood that (a) the receipt of any type of financial aid by the Coordinating Authority and (b) the membership of the Coordinating Authority in a recognised international network shall not prejudice its independent character;
Amendment 1423 #
Proposal for a regulation Article 26 – paragraph 2 – point c (c) are free from any undue external influence, whether direct or indirect, in line with their national legislation;
Amendment 1424 #
Proposal for a regulation Article 26 – paragraph 2 – point e Amendment 1425 #
Proposal for a regulation Article 26 – paragraph 2 – point e (e)
Amendment 1426 #
Proposal for a regulation Article 26 – paragraph 4 4. The Coordinating Authorities shall ensure that relevant members of staff have the required qualifications, experience and technical skills to perform their duties
Amendment 1427 #
Proposal for a regulation Article 26 – paragraph 5 5.
Amendment 1428 #
Proposal for a regulation Article 27 – paragraph 1 – introductory part 1.
Amendment 1429 #
Proposal for a regulation Article 27 – paragraph 1 – point a Amendment 1430 #
Proposal for a regulation Article 27 – paragraph 1 – point a (a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may
Amendment 1431 #
Proposal for a regulation Article 27 – paragraph 1 – point b Amendment 1432 #
Proposal for a regulation Article 27 – paragraph 1 – point b (b) the power to carry out, or to request an independent judicial authority in their Member State to order remote or on-site inspections of any premises that those providers or the other persons referred to in point (a) use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain
Amendment 1433 #
Proposal for a regulation Article 27 – paragraph 1 – point c Amendment 1434 #
Proposal for a regulation Article 27 – paragraph 1 – point c (c) in accordance with national legislation, the power to ask any member of staff or representative of those providers or the other persons referred to in point (a) to give explanations in respect of any information relating to a suspected infringement of this Regulation and to record the answers;
Amendment 1435 #
Proposal for a regulation Article 27 – paragraph 1 – point d (d) the power to request information,
Amendment 1436 #
Proposal for a regulation Article 27 – paragraph 1 – point d (d) the power to request information from the service provider, including to assess whether the measures taken to execute a
Amendment 1437 #
Proposal for a regulation Article 27 – paragraph 2 Amendment 1438 #
Proposal for a regulation Article 28 – paragraph 1 – introductory part 1.
Amendment 1439 #
Proposal for a regulation Article 28 – paragraph 1 – point a Amendment 1440 #
Proposal for a regulation Article 28 – paragraph 1 – point b (b) the power to order the cessation of infringements of this Regulation
Amendment 1441 #
Proposal for a regulation Article 28 – paragraph 1 – point b (b) the power to order specific measures to bring about the cessation of infringements of this Regulation and
Amendment 1442 #
Proposal for a regulation Article 28 – paragraph 1 – point c (c) the power to impose fines, or request a judicial authority in their Member State to do so, in accordance with Article 35 for infringements of this Regulation
Amendment 1443 #
Proposal for a regulation Article 28 – paragraph 1 – point c (c) the power to impose fines
Amendment 1444 #
Proposal for a regulation Article 28 – paragraph 1 – point e (e) the power to adopt appropriate, reasonable, and proportionate interim measures to
Amendment 1445 #
Proposal for a regulation Article 28 – paragraph 2 Amendment 1446 #
Proposal for a regulation Article 28 – paragraph 3 Amendment 1447 #
Proposal for a regulation Article 28 – paragraph 4 Amendment 1449 #
Proposal for a regulation Article 29 – paragraph 1 – introductory part 1.
Amendment 1450 #
Proposal for a regulation Article 29 – paragraph 1 – point b (b) the infringement persists
Amendment 1451 #
Proposal for a regulation Article 29 – paragraph 2 – point a – point i (i) adopt and submit an action plan setting out the necessary measures to terminate the infringement, subject to the approval of the Coordinating Authority;
Amendment 1452 #
Proposal for a regulation Article 29 – paragraph 2 – point b – introductory part (b) request the competent judicial authority
Amendment 1453 #
Proposal for a regulation Article 29 – paragraph 2 – point b – point ii (ii) the infringement persists and causes serious harm that is greater than the likely harm to users relying on the service for legal purposes and;
Amendment 1454 #
Proposal for a regulation Article 29 – paragraph 4 – subparagraph 2 The temporary restriction shall apply for a period of four weeks, subject to the possibility for the competent judicial authority,
Amendment 1455 #
Proposal for a regulation Article 29 – paragraph 4 – subparagraph 3 – point a (a) the provider has failed to take
Amendment 1456 #
Proposal for a regulation Article 30 – paragraph 1 1. The measures taken by the Coordinating Authorities in the exercise of their investigatory and enforcement powers referred to in Articles 27
Amendment 1457 #
Proposal for a regulation Article 30 – paragraph 2 2. Member States shall ensure that any exercise of the investigatory and enforcement powers referred to in Articles 27, 28 and 29 is subject to adequate safeguards laid down in the applicable national law to respect the fundamental rights of all parties affected. In particular, those measures shall
Amendment 1458 #
Proposal for a regulation Article 30 – paragraph 2 2. Member States shall ensure that any exercise of the investigatory and enforcement powers referred to in Articles 27
Amendment 1459 #
Proposal for a regulation Article 31 – paragraph 1 Coordinating Authorities shall have the power to carry out searches on publicly accessible material on hosting services to detect the dissemination of known
Amendment 1460 #
Proposal for a regulation Article 31 – paragraph 1 Coordinating Authorities shall have the power to carry out searches on publicly accessible material on hosting services to detect the dissemination of known
Amendment 1461 #
Proposal for a regulation Article 32 Amendment 1462 #
Proposal for a regulation Article 32 – paragraph 1 Coordinating Authorities shall have the power to notify providers of hosting services under the jurisdiction of the Member State that designated them
Amendment 1463 #
Proposal for a regulation Article 33 – paragraph 2 – subparagraph 2 Where a provider which does not have its main establishment in the Union has failed to appoint a legal representative in accordance with Article 24, all Member States shall have jurisdiction. Where a Member State decides to exercise jurisdiction under this subparagraph, it shall inform all other Member States and ensure that the principle of ne bis in idem is respected.
Amendment 1464 #
Proposal for a regulation Article 34 – paragraph 1 1. Users and any body, organisation or association mandated to exercise the rights conferred by this Regulation on their behalf shall have the right to lodge a complaint alleging an infringement of this Regulation affecting them against providers of relevant information society
Amendment 1465 #
Proposal for a regulation Article 34 – paragraph 1 1. Users shall have the right to lodge a complaint alleging an infringement of this Regulation affecting them against providers of relevant information society services with the Coordinating Authority designated by the Member State
Amendment 1466 #
Proposal for a regulation Article 34 – paragraph 1 a (new) 1a. During these proceedings, both parties shall have the right to be heard and receive appropriate information about the status of the complaint, in accordance with national law
Amendment 1467 #
Proposal for a regulation Article 34 – paragraph 1 b (new) 1b. The Coordinating Authority shall offer easy-to-use mechanisms to anonymously submit information about infringements of this Regulation.
Amendment 1468 #
Proposal for a regulation Article 34 – paragraph 2 2. Coordinating Authorities shall provide
Amendment 1469 #
Proposal for a regulation Article 34 – paragraph 3 a (new) 3a. Users shall have the possibility to lodge a complaint alleging an infringement of this Regulation against providers of information society services with recognised non-governmental organisations specialised in combating online child sexual abuse material, including the hotlines.
Amendment 1470 #
Proposal for a regulation Article 34 a (new) Article 34a Representative actions The following is added to Annex I of Directive (EU) 2020/1828 on Representative actions for the protection of the collective interests of consumers: “Regulation xxxx/xxxx of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse”
Amendment 1471 #
Proposal for a regulation Article 34 a (new) Article 34a Reporting of breaches and protection of reporting persons Directive (EU) 2019/1937 of the European Parliament and of the Council shall apply to the reporting of breaches of this Regulation and the protection of persons reporting such breaches.
Amendment 1472 #
Proposal for a regulation Article 35 – paragraph 2 2. Member States shall ensure that the maximum amount of penalties imposed for an infringement of this Regulation shall not exceed 6 % of the annual
Amendment 1473 #
Proposal for a regulation Article 35 – paragraph 3 3. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information or to submit to an on-site inspection shall not exceed
Amendment 1474 #
Proposal for a regulation Article 35 – paragraph 3 3. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information or to submit to an on-site inspection shall not exceed 1% of the annual
Amendment 1475 #
Proposal for a regulation Article 35 – paragraph 4 4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily global turnover of the provider
Amendment 1476 #
Proposal for a regulation Article 35 – paragraph 4 4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily
Amendment 1477 #
Proposal for a regulation Article 35 – paragraph 4 a (new) 4a. Member States shall ensure that penalties imposed for the infringement of this Regulation do not encourage the over reporting or the removal of material which does not constitute child sexual abuse material.
Amendment 1478 #
Proposal for a regulation Article 35 – paragraph 4 a (new) 4a. Member States shall ensure that penalties imposed for the infringement of this Regulation do not encourage the over reporting or the removal of material which does not constitute child sexual abuse material.
Amendment 1479 #
Proposal for a regulation Article 35 a (new) Article 35a Compensation Users and any body, organisation or association mandated to exercise the rights conferred by this Regulation on their behalf shall have the right to seek, in accordance with Union and national law, compensation from providers of relevant information society services, for any damage or loss suffered due to an infringement by those providers of their obligations under this Regulation.
Amendment 1480 #
Proposal for a regulation Article 35 a (new) Article 35a Compensation Users and any body, organisation or association mandated to exercise the rights conferred by this Regulation on their behalf shall have the right to seek, in accordance with Union and national law, compensation from providers of relevant information society services, for any damage or loss suffered due to an infringement by those providers of their obligations under this Regulation.
Amendment 1481 #
Proposal for a regulation Article 36 – title Identification and submission of
Amendment 1482 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point a (a) anonymised specific items of material
Amendment 1483 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point a (a) specific items of material
Amendment 1484 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point a (a) specific items of material
Amendment 1485 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point b Amendment 1486 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point b (b)
Amendment 1487 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 1 – point b (b) exact uniform resource locators indicating specific items of material that Coordinating Authorities or that competent judicial authorities or other independent administrative authorities of a Member State have identified, after a diligent assessment, as constituting child sexual abuse material, hosted by providers of hosting services not offering services in the Union, that cannot be removed due to those providers’ refusal to remove
Amendment 1488 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 2 Member States shall take the necessary measures to ensure that the Coordinating Authorities that they designated receive, without undue delay,
Amendment 1489 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 2 Member States shall take the necessary measures to ensure that the Coordinating Authorities that they designated receive, without undue delay, the material identified as child sexual abuse material,
Amendment 1490 #
Proposal for a regulation Article 36 – paragraph 1 – subparagraph 2 Member States shall take the necessary measures to ensure that the Coordinating Authorities that they designated receive, without undue delay, the material identified as child sexual abuse material,
Amendment 1491 #
Proposal for a regulation Article 36 – paragraph 2 Amendment 1492 #
Proposal for a regulation Article 36 – paragraph 3 3. Member States shall ensure that, where their law enforcement authorities receive a report
Amendment 1493 #
Proposal for a regulation Article 36 – paragraph 3 3. Member States shall ensure that, where their law enforcement authorities receive a report of the dissemination of new child sexual abuse material
Amendment 1494 #
Proposal for a regulation Article 36 – paragraph 4 4. They shall also ensure that, where the diligent assessment indicates that the material does not constitute child sexual abuse material
Amendment 1495 #
Proposal for a regulation Article 36 – paragraph 4 4. They shall also ensure that, where the diligent assessment indicates that the material does not constitute child sexual abuse material
Amendment 1496 #
Proposal for a regulation Article 36 – paragraph 4 4. They shall also ensure that, where the diligent assessment indicates that the material does not constitute child sexual abuse material
Amendment 1497 #
Proposal for a regulation Article 37 – paragraph 1 – subparagraph 2 Amendment 1498 #
Proposal for a regulation Article 37 – paragraph 1 – subparagraph 2 Where
Amendment 1499 #
Proposal for a regulation Article 37 – paragraph 1 – subparagraph 2 a (new) No action shall be taken without a decision from a court of law in the Member State where the provider of the relevant information is located;
Amendment 1500 #
Proposal for a regulation Article 37 – paragraph 2 – introductory part 2. The request
Amendment 1501 #
Proposal for a regulation Article 37 – paragraph 2 – point b (b) a description of the relevant facts, the provisions of this Regulation concerned and the reasons why the Coordinating Authority that sent the request
Amendment 1502 #
Proposal for a regulation Article 37 – paragraph 2 – point c (c) any other information that the Coordinating Authority that sent the request
Amendment 1503 #
Proposal for a regulation Article 37 – paragraph 2 – point c (c) any other information that the Coordinating Authority that sent the request, or the Commission, considers relevant, including, where appropriate, information gathered on its own initiative
Amendment 1504 #
Proposal for a regulation Article 37 – paragraph 3 – subparagraph 1 The Coordinating Authority of establishment shall assess the suspected infringement, taking into utmost account the request
Amendment 1505 #
Proposal for a regulation Article 37 – paragraph 3 – subparagraph 1 The Coordinating Authority of establishment shall assess the suspected infringement, taking into utmost account the request
Amendment 1506 #
Proposal for a regulation Article 37 – paragraph 3 – subparagraph 2 Where it considers that it has insufficient information to assess the suspected infringement or to act upon the request
Amendment 1507 #
Proposal for a regulation Article 37 – paragraph 3 – subparagraph 2 Where it considers that it has insufficient information to assess the suspected infringement or to act upon the request
Amendment 1508 #
Proposal for a regulation Article 37 – paragraph 4 4. The Coordinating Authority of establishment shall, without undue delay and in any event not later than two months following receipt of the request
Amendment 1509 #
Proposal for a regulation Article 37 – paragraph 4 4. The Coordinating Authority of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation referred to in paragraph 1, communicate to the Coordinating Authority that sent the request, or the Commission, the outcome of its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and, where applicable,
Amendment 1510 #
Proposal for a regulation Article 38 – paragraph 1 – subparagraph 1 Coordinating Authorities shall share best practice standards and guidance on the detection and removal of child sexual abuse material and may participate in joint investigations, which may be coordinated with the support of the EU Centre, of matters covered by this Regulation, concerning providers of relevant information society services that offer their services in several Member States. Those joint investigations shall also take place on the dark web.
Amendment 1511 #
Proposal for a regulation Article 38 – paragraph 1 – subparagraph 1 Coordinating Authorities shall share best practice standards and guidance on the detection and removal of child sexual abuse material and may participate in joint investigations, which may be coordinated with the support of the EU Centre, of matters covered by this Regulation, concerning providers of relevant information society services that offer their services in several Member States.
Amendment 1512 #
Proposal for a regulation Article 38 – paragraph 1 a (new) 1a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to remove content and coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
Amendment 1513 #
Proposal for a regulation Article 38 – paragraph 2 2. The participating Coordinating Authorities shall make the results of the joint investigations available to other Coordinating Authorities
Amendment 1514 #
Proposal for a regulation Article 38 – paragraph 2 a (new) 2a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of online child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to remove content and coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
Amendment 1515 #
Proposal for a regulation Article 39 – paragraph 1 1. Coordinating Authorities shall cooperate with each other, any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre and other relevant Union agencies
Amendment 1516 #
Proposal for a regulation Article 39 – paragraph 1 1. Coordinating Authorities shall cooperate with each other, any other competent authorities of the Member State that designated the Coordinating Authority
Amendment 1517 #
Proposal for a regulation Article 39 – paragraph 1 1. Coordinating Authorities shall cooperate with each other, any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre and other relevant Union agencies,
Amendment 1518 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall
Amendment 1519 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall establish and maintain one or more reliable and secure information sharing systems with highest cybersecurity standards supporting communications between Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services.
Amendment 1520 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall establish and maintain one
Amendment 1521 #
Proposal for a regulation Article 39 – paragraph 2 2. The EU Centre shall establish and maintain one or more reliable and secure information sharing systems supporting communications between Coordinating Authorities, the Commission, the EU Centre, hotlines, other relevant Union agencies and providers of relevant information society services.
Amendment 1522 #
Proposal for a regulation Article 39 – paragraph 3 3. The Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation. Regulation (EU) [Joint Investigation Teams online collaboration platform] shall apply mutatis mutandis.
Amendment 1523 #
Proposal for a regulation Article 39 – paragraph 3 3. The Coordinating Authorities, the Commission, the EU Centre, hotlines, other relevant Union agencies and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation.
Amendment 1524 #
Proposal for a regulation Article 39 – paragraph 3 3. The Coordinating Authorities,
Amendment 1525 #
Proposal for a regulation Article 39 – paragraph 3 a (new) Amendment 1526 #
Proposal for a regulation Article 39 a (new) Article 39a Independence The Commission shall ensure in the draft general budget of the Union that the European Data Protection Board and the European Data Protection Supervisor are provided with sufficient human, technical and financial resources, premises and infrastructure necessary for the effective performance of their tasks and exercise of their powers pursuant to this Regulation.
Amendment 1527 #
Proposal for a regulation Chapter IV – title IV
Amendment 1528 #
Proposal for a regulation Chapter IV – title IV EU CENTRE TO PR
Amendment 1529 #
Proposal for a regulation Article 40 – title Establishment and scope of action of the
Amendment 1530 #
Proposal for a regulation Article 40 – paragraph 1 1. A
Amendment 1531 #
Proposal for a regulation Article 40 – paragraph 1 1. A European Union Agency to pr
Amendment 1532 #
Proposal for a regulation Article 40 – paragraph 1 a (new) 1a. The EU Centre must be completely independent from Europol.
Amendment 1533 #
Proposal for a regulation Article 40 – paragraph 2 2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse and gather and share information and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online. Its remit and powers shall not be expanded without prior evaluation and unanimous decision by Member States.
Amendment 1534 #
Proposal for a regulation Article 40 – paragraph 2 2. The
Amendment 1535 #
Proposal for a regulation Article 40 – paragraph 2 2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting
Amendment 1536 #
Proposal for a regulation Article 40 – paragraph 2 2. The EU Centre shall contribute to the achievement of the objectives of this Regulation by supporting and facilitating the implementation of its provisions concerning the
Amendment 1537 #
Proposal for a regulation Article 41 – paragraph 1 1. The
Amendment 1538 #
Proposal for a regulation Article 41 – paragraph 2 2. In each of the Member States the EU Centre shall
Amendment 1539 #
Proposal for a regulation Article 42 – paragraph 1 The
Amendment 1540 #
Proposal for a regulation Article 42 – paragraph 1 The seat of the EU Centre shall be
Amendment 1541 #
Proposal for a regulation Article 42 – paragraph 1 The choice of the location of the seat of the EU Centre shall be
Amendment 1542 #
Proposal for a regulation Article 42 – paragraph 1 The choice of the location of the seat of the EU Centre shall be
Amendment 1543 #
Proposal for a regulation Article 42 – paragraph 1 The seat of the EU Centre shall be
Amendment 1544 #
Proposal for a regulation Article 42 – paragraph 1 The
Amendment 1545 #
Proposal for a regulation Article 43 – title 43 Tasks of the EU Centre on Child Protection
Amendment 1546 #
Proposal for a regulation Article 43 – paragraph -1 (new) Amendment 1547 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point a (a) supporting the Commission in the preparation of the guidelines referred to in Article 3(
Amendment 1548 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point a (a) supporting the Commission and European Data Protection Board in the preparation of the guidelines referred to in Article 3(8), Article 4(5),
Amendment 1549 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point a (a)
Amendment 1550 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point b Amendment 1551 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point b Amendment 1552 #
Proposal for a regulation Article 43 – paragraph 1 – point 1 – point b a (new) (ba) operating accounts, including child accounts, on publicly available number-independent interpersonal communications services and reporting relevant findings concerning the risk of solicitation of children to the Coordinating Authority of establishment; where the Centre becomes aware of potential online child sexual abuse, Article 48(3) of this Regulation shall apply mutatis mutandis;
Amendment 1553 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 Amendment 1554 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 Amendment 1555 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 – point b (b) maintaining and operating the databases of indicators
Amendment 1556 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 – point c (c) giving providers of hosting services and providers of number-independent interpersonal communications services that received a detection order access to the relevant databases of indicators in accordance with Article 46;
Amendment 1557 #
Proposal for a regulation Article 43 – paragraph 1 – point 2 – point c (c) giving providers of hosting services and providers of number independent interpersonal communications services that received a detection order access to the relevant databases of indicators in accordance with Article 46;
Amendment 1558 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – introductory part (4) facilitate the removal process referred to in Section 4 of Chapter II
Amendment 1559 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – introductory part (4) facilitate the removal process referred to in Section 4 of Chapter II and the other processes referred to in Section
Amendment 1560 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – point b Amendment 1561 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – point b Amendment 1562 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – point b Amendment 1563 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – point c Amendment 1564 #
Amendment 1565 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – point c Amendment 1566 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 – point d (d) providing information and support to
Amendment 1567 #
Proposal for a regulation Article 43 – paragraph 1 – point 4 a (new) (4a) conduct proactive searches of publicly accessible content on hosting services for known child sexual abuse material in accordance with Article 49;
Amendment 1568 #
Proposal for a regulation Article 43 – paragraph 1 – point 5 – introductory part (5) support the Coordinating Authorities
Amendment 1569 #
Proposal for a regulation Article 43 – paragraph 1 – point 5 – point c Amendment 1570 #
Proposal for a regulation Article 43 – paragraph 1 – point 5 – point e (e) assisting the Commission in the preparation of the delegated and implementing acts
Amendment 1571 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a (a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51, including education, awareness raising and intervention programmes, and facilitating the drafting of recommendations and guidelines on prevention and mitigation of child sexual abuse, in particular in the digital space and taking into account technological developments;
Amendment 1572 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a (a) collecting, recording, analysing and providing gender and age specific information, providing analysis based on anonymised and non-personal data gathering, including gender and age disaggregated data, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51;
Amendment 1573 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point a a (new) (aa) supporting awareness-raising and prevention campaigns in the Union carried out by public and private bodies, stakeholders and education institutions, and elaborating best practices in this regard;
Amendment 1574 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of research and expertise on those matters and on assistance to victims, including by serving as a hub of expertise to support evidence-based policy and by linking researchers to practitioners;
Amendment 1575 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of research and expertise on those matters and on assistance to
Amendment 1576 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b (b) supporting the development and dissemination of research and expertise on those matters and on assistance to
Amendment 1577 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (ba) providing technical expertise and promoting the exchange of best practices among Member States on raising awareness for the prevention of child sexual abuse online in formal and non-formal education. Such efforts shall be age-appropriate and gender-sensitive;
Amendment 1578 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (ba) contribute to the implementation of awareness campaigns on the potential risks posed by the online environment to children, in order to equip them with adequate skills for detecting potential grooming and deceit and to ensure safe use of the internet by children;
Amendment 1579 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (ba) Supporting national authorities to develop age-appropriate awareness material for minors, including specific campaigns on how to avoid risks while navigating the internet.
Amendment 1580 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b a (new) (ba) Referring victims to the appropriate national child protection services;
Amendment 1581 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new) (bb) assisting with expertise and knowledge in the development and implementation of teacher training across the Union, in order to vest teachers with the necessary skills for guiding children on safely using information society services and detecting potentially malicious behaviour online;
Amendment 1582 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new) (bb) exchanging best practices among Coordinating Authorities regarding the available tools to reduce the risk of children becoming victims of sexual abuse and to provide specialized assistance to survivors, in an age-appropriate and gender-sensitive way.
Amendment 1583 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b b (new) (bb) Support national authorities to develop awareness-raising material targeted at adults, including parents and educators.
Amendment 1584 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b c (new) (bc) supporting the collaboration of victim support services and elaborating best practices;
Amendment 1585 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b c (new) (bc) referring survivors to appropriate child protection services;
Amendment 1586 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point b d (new) (bd) supporting the exchange between law enforcement agencies and providers and elaborating best practices;
Amendment 1587 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point c a (new) Amendment 1588 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point c b (new) (cb) create and oversee an "EU hashing list of known child sexual abuse material" and modify the content of that list, independently and autonomously and free of political, government or industry influence or interference;
Amendment 1589 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point c c (new) (cc) develop, in accordance with the implementing act as referred to in Article 43a, the European Centralised Helpline for Abuse of Teenagers (eCHAT), interconnecting via effective interoperability the national hotlines' helplines, allowing children to reach out 24/7 via a recognisable central helpline in an anonymous way, in their own language and free of charge;
Amendment 1590 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point c d (new) (cd) have at its disposal the resources needed to develop hashing technology tools, open source where possible, for small and medium-sized relevant information society services to prevent the dissemination of known child sexual abuse material in publicly accessible content.
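As a purely illustrative aside (not part of the amendment text), the kind of hash-list matching such a tool could perform can be sketched as follows. All names and hash entries below are hypothetical placeholders; a real deployment would use a vetted list such as the proposed "EU hashing list" and, in practice, perceptual rather than cryptographic hashes.

```python
import hashlib

# Hypothetical stand-in for a vetted hash list: the entries are digests of
# harmless sample bytes, used here only to make the sketch self-contained.
KNOWN_HASHES = {
    hashlib.sha256(b"sample-known-item-1").hexdigest(),
    hashlib.sha256(b"sample-known-item-2").hexdigest(),
}

def matches_known_list(content: bytes) -> bool:
    """Return True if the content's SHA-256 digest appears on the hash list."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES

print(matches_known_list(b"sample-known-item-1"))  # True
print(matches_known_list(b"unrelated upload"))     # False
```

The design point for small and medium-sized services is that only the hash list, never the underlying material, needs to be distributed, and matching is a constant-time set lookup.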
Amendment 1591 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point c e (new) (ce) coordinate the sharing and filtering of Suspicious Activity Reports on alleged "known child sexual abuse material", operating independently, autonomously, free of political, government or industry influence or interference and in full respect of fundamental rights, including privacy and data protection. [By 1 year after entry into force] the Commission shall adopt a delegated act laying down requirements for a Suspicious Activity Reports format, as referred to in this paragraph, and the differentiation between actionable and non-actionable Suspicious Activity Reports. This delegated act shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
Amendment 1592 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 – point c f (new) (cf) scan public servers and public communications channels for known child sexual abuse material, with proven technology, solely for the purposes of amending the EU Hashing List and flagging the content for removal to the service provider of the specific public server or public communications channel, without prejudice to Art. -3. The European Data Protection Board shall issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the purpose of scanning.
Amendment 1593 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 a (new) (6a) support Member States in designing preventive measures, such as awareness-raising campaigns to combat child sexual abuse, with a specific focus on girls and other prevalent demographics, including by: (a) Acting on behalf of victims in liaising with other relevant authorities of the Member States for reparations and all other victim support programmes; (b) Referring victims to the appropriate child protection services, and to pro bono legal support services; (c) Facilitating access to qualified health support services, including mental health and psychological support;
Amendment 1594 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 a (new) (6a) support Member States in designing preventive measures, such as awareness-raising campaigns to combat child sexual abuse, with a specific focus on girls and other prevalent demographics, including by: a) acting on behalf of victims in liaising with other relevant authorities of the Member States for reparations and all other victim support programmes; b) referring victims to the appropriate child protection services, and to pro bono legal support services.
Amendment 1595 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 a (new) (6a) Establish mechanisms to listen to and incorporate the views of children in its work, in accordance with the Directive 2012/29/EU and the Charter of Fundamental Rights of the European Union.
Amendment 1596 #
Proposal for a regulation Article 43 – paragraph 1 – point 6 b (new) (6b) shall operate in a way that minimises risks to victims, especially children, when engaging with victims or in any decision affecting victims;
Amendment 1597 #
Proposal for a regulation Article 43 a (new) Article 43a Implementing act for the interconnection of helplines 1. The national helpline referred to in Article 43 shall be interconnected via the European Centralised Helpline for Abuse of Teenagers (eCHAT) to be developed and operated by the EU Centre by ... [two years after the date of entry into force of this Regulation] 2. The Commission shall be empowered to adopt, by means of implementing acts, technical specifications and procedures necessary to provide for the interconnection of national hotlines' online chat systems via eCHAT in accordance with Article 43 with regard to: (a) the technical data necessary for the eCHAT system to perform its functions and the method of storage, use and protection of that technical data; (b) the common criteria according to which national helplines shall be available through the system of interconnection of helplines; (c) the technical details on how helplines shall be made available; (d) the technical conditions of availability of services provided by the system of interconnection of helplines. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 5 of Regulation (EU) No 182/2011. 3. When adopting the implementing acts referred to in paragraph 2, the Commission shall take into account proven technology and existing practices.
Amendment 1598 #
Proposal for a regulation Article 44 Amendment 1599 #
Proposal for a regulation Article 44 – paragraph 1 – introductory part 1. The EU Centre shall create, maintain and operate databases of the following
Amendment 1600 #
Proposal for a regulation Article 44 – paragraph 1 – introductory part 1. The EU Centre shall create, maintain and operate databases of the following three types of indicators of
Amendment 1601 #
Proposal for a regulation Article 44 – paragraph 1 – point a (a) indicators to detect
Amendment 1602 #
Proposal for a regulation Article 44 – paragraph 1 – point b Amendment 1603 #
Proposal for a regulation Article 44 – paragraph 1 – point b Amendment 1604 #
Amendment 1605 #
Proposal for a regulation Article 44 – paragraph 1 – point c Amendment 1606 #
Proposal for a regulation Article 44 – paragraph 1 – point c Amendment 1607 #
Proposal for a regulation Article 44 – paragraph 1 – point c Amendment 1608 #
Proposal for a regulation Article 44 – paragraph 2 – point a (a) relevant indicators, consisting of digital identifiers to be used to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, on hosting services and number-independent interpersonal communications services, generated by the EU Centre in accordance with paragraph 3;
Amendment 1609 #
Proposal for a regulation Article 44 – paragraph 2 – point a (a) relevant indicators, consisting of digital identifiers to be used to detect
Amendment 1610 #
Proposal for a regulation Article 44 – paragraph 2 – point a (a) relevant indicators, consisting of digital identifiers to be used to detect the dissemination of known
Amendment 1611 #
Proposal for a regulation Article 44 – paragraph 2 – point b Amendment 1612 #
Proposal for a regulation Article 44 – paragraph 2 – point c (c) the necessary additional information to facilitate the use of the indicators in accordance with this Regulation, including identifiers allowing for a distinction between images
Amendment 1613 #
Proposal for a regulation Article 44 – paragraph 2 – point c (c) the necessary additional information to facilitate the use of the indicators in accordance with this Regulation, including identifiers allowing for a distinction between images, videos and, where relevant, other types of material for the detection of
Amendment 1614 #
Proposal for a regulation Article 44 – paragraph 3 – subparagraph 1 The EU Centre shall generate the indicators referred to in paragraph 2, point (a), solely
Amendment 1615 #
Proposal for a regulation Article 44 – paragraph 3 – subparagraph 1 The EU Centre shall generate the indicators referred to in paragraph 2, point (a), solely on the basis of the child sexual abuse material
Amendment 1616 #
Proposal for a regulation Article 44 – paragraph 3 – subparagraph 2 Amendment 1617 #
Proposal for a regulation Article 44 – paragraph 4 4. The EU Centre shall keep records of the submissions and of the process applied to generate the indicators and compile the list referred to in the first and second subparagraphs. It shall keep those records for as long as the indicators
Amendment 1618 #
Proposal for a regulation Article 44 – paragraph 4 a (new) 4a. The EU Centre shall ensure through all technical means available that the database of indicators is secure and cannot be altered by providers, users and any other actor at the moment of its deployment for the purpose of detection.
Amendment 1619 #
Proposal for a regulation Article 45 – paragraph 1 1. The EU Centre shall create, maintain and operate a database for the reports submitted to it by providers of hosting services and providers of number independent interpersonal communications services in accordance with Article 12(1) and assessed and processed in accordance with Article 48.
Amendment 1620 #
Proposal for a regulation Article 45 – paragraph 1 1. The EU Centre shall create, maintain and operate a database for the reports submitted to it by providers of hosting services and providers of number- independent interpersonal communications services in accordance with Article 12(1) and assessed and processed in accordance with Article 48.
Amendment 1621 #
Proposal for a regulation Article 45 – paragraph 1 1. The EU Centre shall create, maintain and operate a database for the reports submitted to it by providers of hosting services and providers of number- independent interpersonal communications services in accordance with Article 12(1) and assessed and processed in accordance with Article 48.
Amendment 1622 #
Proposal for a regulation Article 45 – paragraph 2 – point b (b) where the EU Centre considered the report unfounded or manifestly unfounded, the reasons and the date and time of informing the provider in accordance with Article 48(2);
Amendment 1623 #
Proposal for a regulation Article 45 – paragraph 2 – point b (b) where the EU Centre considered the report
Amendment 1624 #
Proposal for a regulation Article 45 – paragraph 2 – point c Amendment 1625 #
Proposal for a regulation Article 45 – paragraph 2 – point c (c) where the EU Centre forwarded the report in accordance with Article 48(3), the date and time of such forwarding and the name of the competent law enforcement authority or authorities to which it forwarded the report
Amendment 1626 #
Proposal for a regulation Article 45 – paragraph 2 – point c (c) where the EU Centre forwarded the report in accordance with Article 48(3), the date and time of such forwarding and the name of the competent law enforcement authority or authorities to which it forwarded the report or
Amendment 1627 #
Proposal for a regulation Article 45 – paragraph 2 – point e (e) where available, information indicating that the provider that submitted a report concerning the dissemination of known
Amendment 1628 #
Proposal for a regulation Article 45 – paragraph 2 – point e (e) where available, information indicating that the provider that submitted a report concerning the dissemination of known
Amendment 1629 #
Proposal for a regulation Article 45 – paragraph 2 – point e (e) where available, information indicating that the provider that submitted a report concerning the dissemination of
Amendment 1630 #
Proposal for a regulation Article 45 – paragraph 2 – point g Amendment 1631 #
Proposal for a regulation Article 46 – paragraph 1 1. Subject to paragraphs 2 and 3, solely EU Centre staff and auditors duly authorised by the Executive Director and Data Protection Officer shall have access to and be entitled to process
Amendment 1632 #
Proposal for a regulation Article 46 – paragraph 1 1. Subject to paragraphs 2 and 3, solely EU Centre staff and auditors duly authorised by the Executive Director shall have access to and be entitled to process the data contained in the databases referred to in Article
Amendment 1633 #
Proposal for a regulation Article 46 – paragraph 2 Amendment 1634 #
Proposal for a regulation Article 46 – paragraph 2 2. The EU Centre shall give providers of hosting services, providers of interpersonal communications services and providers of internet access services access to the databases of indicators referred to in Article 44, where and to the extent necessary for them to put in place voluntary measures, when authorised, and execute the detection or blocking orders that they received in accordance with Articles 7 or 16. It shall take measures to ensure that such access remains limited to what is strictly necessary for the period of application of the detection or blocking orders concerned as well as for the execution of the voluntary measures, when authorised, and that such access does not in any way endanger the proper operation of those databases and the accuracy and security of the data contained therein.
Amendment 1635 #
Proposal for a regulation Article 46 – paragraph 2 2. The EU Centre shall give providers of hosting services, providers of number-independent interpersonal communications services and providers of internet access services access to the databases of indicators referred to in Article 44, where and to the extent necessary for them to execute the detection or blocking orders that they received in accordance with Articles 7 or 16. It shall take measures to ensure that such access remains limited to what is strictly necessary for the period of application of the detection or blocking orders concerned and that such access does not in any way endanger the proper operation of those databases and the accuracy and security of the data contained therein.
Amendment 1636 #
Proposal for a regulation Article 46 – paragraph 2 2. The EU Centre shall give providers of hosting services, providers of number- independent interpersonal communications services
Amendment 1637 #
Proposal for a regulation Article 46 – paragraph 2 2. The EU Centre shall give providers of hosting services, providers of number- independent interpersonal communications services and providers of internet access services access to the databases of indicators referred to in Article 44, where and to the extent necessary for them to execute the detection
Amendment 1638 #
Proposal for a regulation Article 46 – paragraph 3 Amendment 1639 #
Proposal for a regulation Article 46 – paragraph 4 Amendment 1640 #
Proposal for a regulation Article 46 – paragraph 4 4. The EU Centre shall give
Amendment 1641 #
Proposal for a regulation Article 46 – paragraph 4 a (new) 4a. The EU Centre shall give Europol access to the databases of indicators referred to in Article 44, limited to specific data only, such as under a hit/no-hit procedure, and solely where necessary for the performance of its tasks of investigating cross-border cases of suspected child sexual abuse offences.
Amendment 1642 #
Proposal for a regulation Article 46 – paragraph 5 Amendment 1643 #
Proposal for a regulation Article 46 – paragraph 5 Amendment 1644 #
Proposal for a regulation Article 46 – paragraph 5 5. The EU Centre shall give Europol access to the databases of indicators and reports referred to in Article 4
Amendment 1645 #
Proposal for a regulation Article 46 – paragraph 6 – subparagraph 1 The EU Centre shall provide the access referred to in paragraph
Amendment 1646 #
Proposal for a regulation Article 46 – paragraph 6 – subparagraph 1 The EU Centre shall provide the access referred to in paragraphs 2, 3
Amendment 1647 #
Proposal for a regulation Article 46 – paragraph 6 – subparagraph 1 The EU Centre shall provide the access referred to in paragraphs 2, 3, 4
Amendment 1648 #
Proposal for a regulation Article 46 – paragraph 6 – subparagraph 2 The EU Centre shall duly and diligently assess those requests on a case-by-case basis, and only grant access where it considers that the requested access is necessary for and proportionate to the specified purpose. Where it considers that an access request by Europol is necessary and proportionate, it shall provide the relevant data to Europol via the Secure Information Exchange Network Application (SIENA).
Amendment 1649 #
Proposal for a regulation Article 46 – paragraph 6 – subparagraph 2 The EU Centre shall diligently assess those requests and only grant access where it considers that the requested access is necessary for and proportionate to the specified purpose, and in accordance with Union law.
Amendment 1650 #
Proposal for a regulation Article 46 – paragraph 7 7. The EU Centre shall regularly verify that the data contained in the databases referred to in Articles 44 and 45 is, in all respects, complete, accurate and up-to-date and continues to be necessary for the purposes of reporting, detection
Amendment 1651 #
Proposal for a regulation Article 46 – paragraph 7 7. The EU Centre shall regularly verify that the data contained in the databases referred to in Article
Amendment 1652 #
Proposal for a regulation Article 46 – paragraph 7 7. The EU Centre shall regularly verify that the data contained in the databases referred to in Articles 44 and 45 is, in all respects, complete, accurate and up-to-date and continues to be necessary for the purposes of reporting, detection
Amendment 1653 #
Proposal for a regulation Article 46 – paragraph 8 8. The EU Centre shall ensure that the data contained in the databases referred to in Articles 44 and 45 is stored in a
Amendment 1654 #
Proposal for a regulation Article 46 – paragraph 8 8. The EU Centre shall ensure that the data contained in the databases referred to in Article
Amendment 1655 #
Proposal for a regulation Article 46 a (new) Article 46a Logging 1. The EU Centre, the Coordinating Authorities and competent authorities shall provide for logs to be kept for at least the following processing operations, in relation to tasks performed on the basis of this Regulation: collection, alteration, consultation, disclosure including transfers, combination and erasure. 2. The logs of consultation and disclosure shall make it possible to establish the justification, date and time of such operations and, as far as possible, the identification of the person who consulted or disclosed the data, and the identity of the recipients of such data. 3. The logs shall be used solely for verification of the lawfulness of processing, self-monitoring, ensuring the integrity and security of the personal data, and for criminal proceedings. 4. The EU Centre, the Coordinating Authorities and competent authorities shall make the logs available to the relevant data protection supervisory authority on request.
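As a purely illustrative aside (not part of the amendment text), a log record satisfying the proposed Article 46a can be sketched as follows. The class, field names and sample values are hypothetical; the sketch only shows that each consultation or disclosure entry carries a justification, a timestamp, the acting person and, for disclosures, the recipients.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# The processing operations enumerated in the proposed Article 46a(1).
OPERATIONS = {"collection", "alteration", "consultation",
              "disclosure", "combination", "erasure"}

@dataclass(frozen=True)
class ProcessingLogEntry:
    """Hypothetical immutable log record for one processing operation."""
    operation: str
    justification: str
    actor_id: str
    recipients: tuple[str, ...] = ()
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

    def __post_init__(self):
        # Reject operations outside the enumerated list.
        if self.operation not in OPERATIONS:
            raise ValueError(f"unknown operation: {self.operation}")

# Example: logging a disclosure (e.g. forwarding a report under Article 48(3)).
entry = ProcessingLogEntry(
    operation="disclosure",
    justification="forwarding under Article 48(3)",
    actor_id="analyst-42",
    recipients=("competent-law-enforcement-authority",),
)
print(entry.operation, entry.recipients)
```

Making the record immutable (`frozen=True`) mirrors the requirement that logs serve to verify lawfulness after the fact: entries can be appended and read but not rewritten through the object itself.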
Amendment 1656 #
Proposal for a regulation Article 47 – paragraph 1 – point a Amendment 1657 #
Proposal for a regulation Article 47 – paragraph 1 – point b Amendment 1658 #
Proposal for a regulation Article 47 – paragraph 1 – point b (b) the processing of the submissions by Coordinating Authorities, the generation of the indicators
Amendment 1659 #
Proposal for a regulation Article 47 – paragraph 1 – point d Amendment 1660 #
Proposal for a regulation Article 47 – paragraph 1 – point d Amendment 1661 #
Proposal for a regulation Article 47 – paragraph 1 – point d (d) access to the databases referred to in Article
Amendment 1662 #
Proposal for a regulation Article 47 – paragraph 1 – point e (e) the regular verifications and audits to ensure that the data contained in th
Amendment 1663 #
Proposal for a regulation Article 48 – paragraph 1 1. The EU Centre shall expeditiously assess and process reports submitted by providers of hosting services and providers of number-independent interpersonal communications services in accordance with Article 12 to determine whether the reports are manifestly unfounded or are to be forwarded.
Amendment 1664 #
Proposal for a regulation Article 48 – paragraph 1 1. The EU Centre shall expeditiously assess and process reports submitted by providers of hosting services and providers of number-independent interpersonal communications services in accordance with Article 12 to determine whether the reports are
Amendment 1665 #
Proposal for a regulation Article 48 – paragraph 1 1. The EU Centre shall expeditiously assess and process reports submitted by providers of hosting services and providers of interpersonal communications services in accordance with Article 12 to determine whether the reports are
Amendment 1666 #
Proposal for a regulation Article 48 – paragraph 1 – subparagraph 1 (new) The EU Centre shall make a free telephone number available to users that shall provide them with assistance in the event of a suspected violation of the provisions of this Regulation.
Amendment 1667 #
Proposal for a regulation Article 48 – paragraph 1 a (new) 1a. Where the EU Centre receives a report from a Hotline, or from a provider who indicated that the report is based on information received from a Hotline, the EU Centre shall monitor the removal of child sexual abuse material or cooperate with the Hotline to track its status, to avoid duplicate reporting of the same material that has already been reported to the national law enforcement authorities.
Amendment 1668 #
Proposal for a regulation Article 48 – paragraph 2 2. Where the EU Centre considers that the report is
Amendment 1669 #
Proposal for a regulation Article 48 – paragraph 2 2. Where the EU Centre considers that the report is
Amendment 1670 #
Proposal for a regulation Article 48 – paragraph 3 – subparagraph 1 Where the EU Centre considers that a report is not
Amendment 1671 #
Proposal for a regulation Article 48 – paragraph 3 – subparagraph 1 Where, after a thorough legal and factual assessment, the EU Centre considers that a report is not
Amendment 1672 #
Proposal for a regulation Article 48 – paragraph 3 – subparagraph 1 Where the EU Centre considers that a report is not manifestly unfounded, it
Amendment 1673 #
Proposal for a regulation Article 48 – paragraph 3 – subparagraph 2 Amendment 1674 #
Proposal for a regulation Article 48 – paragraph 3 – subparagraph 2 Where that competent law enforcement authority or those competent law enforcement authorities cannot be determined with sufficient certainty, the EU Centre
Amendment 1675 #
Proposal for a regulation Article 48 – paragraph 3 – subparagraph 2 Amendment 1676 #
Proposal for a regulation Article 48 – paragraph 6 – introductory part 6. Where
Amendment 1677 #
Proposal for a regulation Article 48 – paragraph 6 – point b (b) where the provider that submitted the report is a provider of hosting services and the report concerns the potential dissemination of child sexual abuse material, communicate to the provider that it is not to remove
Amendment 1678 #
Proposal for a regulation Article 48 – paragraph 7 7. The time periods referred to in the first subparagraph of paragraph 6, points (a) and (b), shall be those specified in the competent law enforcement authority’s request to the EU Centre
Amendment 1679 #
Proposal for a regulation Article 48 – paragraph 8 8. The EU Centre shall verify whether a provider of hosting services that submitted a report concerning the potential dissemination of child sexual abuse material removed
Amendment 1680 #
Proposal for a regulation Article 48 – paragraph 8 8. The EU Centre shall verify whether a provider of hosting services that submitted a report concerning the potential dissemination of child sexual abuse material removed
Amendment 1681 #
Proposal for a regulation Article 48 – paragraph 8 a (new) Amendment 1682 #
Proposal for a regulation Article 48 – paragraph 8 a (new) 8a. The EU Centre shall not retain the personal data contained in the reports it receives for a period longer than two working days. This period may be extended by up to one week where duly justified and documented.
Amendment 1683 #
Proposal for a regulation Article 48 – paragraph 8 b (new) 8b. The EU Centre shall keep logs for any of the following processing operations in automated processing systems: the entry, alteration, access, consultation, disclosure, combination and erasure of personal data. The logs of consultation and disclosure shall make it possible to establish the justification for, and the date and time of, such operations, the identification of the person who consulted or disclosed operational personal data, and, as far as possible, the identity of the recipients. These logs shall be used for verification of the lawfulness of processing, self-monitoring, and for ensuring its integrity and security. These logs shall be made available to the EU Centre’s data protection officer and to the EDPS on request. Such logs shall be deleted after three years, unless they are required for ongoing control.
Amendment 1684 #
Proposal for a regulation Article 49 Amendment 1685 #
Proposal for a regulation Article 49 – paragraph 1 – introductory part 1. The EU Centre shall have the power to conduct searches on hosting services for the dissemination of publicly accessible child sexual abuse material,
Amendment 1686 #
Proposal for a regulation Article 49 – paragraph 1 – introductory part 1. The EU Centre shall have the power to conduct searches on hosting services for the dissemination of publicly accessible child sexual abuse material, using the relevant indicators from the database of indicators referred to in Article 44(1), point
Amendment 1687 #
Proposal for a regulation Article 49 – paragraph 1 – introductory part 1. The EU Centre shall have the power to conduct searches of publicly accessible content on hosting
Amendment 1688 #
Proposal for a regulation Article 49 – paragraph 1 – point a (a) where so requested to support a
Amendment 1689 #
Proposal for a regulation Article 49 – paragraph 1 – point b (b) where so requested to assist a Coordinating Authority by verifying the possible need for
Amendment 1690 #
Proposal for a regulation Article 49 – paragraph 1 – point b a (new) (ba) proactively of its own initiative by systematically and automatically analysing and following publicly accessible uniform resource locators (web crawling).
Amendment 1691 #
Proposal for a regulation Article 49 – paragraph 2 – subparagraph 1 The EU Centre shall have the power to notify, after having conducted the searches referred to in paragraph 1, the Coordinating Authority to request a removal order pursuant to Article 14 and the providers of hosting services of the presence of one or more specific items of known child sexual abuse material on their services and request them to remove
Amendment 1692 #
Proposal for a regulation Article 49 – paragraph 2 – subparagraph 1 The EU Centre shall have the power to notify, after having conducted the searches referred to in paragraph 1, providers of hosting services of the presence of one or more specific items of known child sexual abuse material on their services and request them to remove
Amendment 1693 #
Proposal for a regulation Article 49 – paragraph 3 3. Where
Amendment 1694 #
Proposal for a regulation Article 50 – title
Amendment 1695 #
Proposal for a regulation Article 50 – paragraph 1
Amendment 1696 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 1 The EU Centre shall make available technologies that providers of hosting services and providers of number-independent interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1). The EU Centre shall provide recommended mitigating measures and relevant best practices that are particularly effective in identifying child sexual abuse material that result from the operation of providers’ mitigating measures, in accordance with Article 4 of the Regulation.
Amendment 1697 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 1 The EU Centre shall make available: (i) technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1). (ii) technologies that providers of end-to-end encrypted electronic communication services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to adopt the security measures imposed on them by Article 7(3)(a).
Amendment 1698 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 1 The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal
Amendment 1699 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 1 The EU Centre shall make available technologies that providers of hosting services and providers of number-independent interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1).
Amendment 1700 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 1 a (new) The EU Centre shall provide recommended mitigating measures and relevant best practices that are particularly effective in identifying child sexual abuse material that result from the operation of providers’ mitigating measures, in accordance with Article 4 of the Regulation.
Amendment 1701 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 2 To that aim, the EU Centre shall compile lists of such technologies, having regard to the requirements of this Regulation and in particular those of Article 10(2) and Article 19a (new).
Amendment 1702 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 3 Before including specific technologies on those lists, the EU Centre shall request the authoritative opinion of its Technology Committee and of the European Data Protection Board, which it shall fully take into account. The Technology Committee and the European Data Protection Board shall deliver their respective opinions within eight weeks. That period may be extended by a further six weeks where necessary, taking into account the complexity of the subject matter. The Technology Committee and the European Data Protection Board shall inform the EU Centre of any such extension within one month of receipt of the request for consultation, together with the reasons for the delay. The EU Centre shall inform the European Data Protection Board of the action it has taken following its opinion; the Board shall have the right to object to the inclusion of the specific technology in the lists if it deems that its opinion has not been duly taken into consideration. This opinion shall be without prejudice to the case-by-case assessment of the intended processing by the relevant controller under Articles 35 and 36 of Regulation 2016/679.
Amendment 1703 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 3 Before including specific technologies on those lists, the EU Centre shall request the opinions of its Technology Committee and Victims’ Consultative Forum, and, upon request of the European Commission, the opinion of the European Data Protection Board. The Technology Committee and the European Data Protection Board shall deliver their respective opinions within eight weeks. That period may be extended by a further six weeks where necessary, taking into account the complexity of the subject matter. The Technology Committee and the European Data Protection Board shall inform the EU Centre of any such extension within one month of receipt of the request for consultation, together with the reasons for the delay. Where the EU Centre substantially deviates from those opinions, it shall inform the Technology Committee, the Victims' Consultative Forum, or the European Data Protection Board and the Commission thereof, specifying the points where it deviated and the main reasons for that deviation.
Amendment 1704 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 3 Before including specific technologies on those lists, the EU Centre shall request the opinion of its Technology Committee, the Experts Consultative Forum, and of the European Data Protection Board. The Technology Committee and the European Data Protection Board shall deliver their respective opinions within eight weeks. That period may be extended by a further six weeks where necessary, taking into account the complexity of the subject matter. The Technology Committee and the European Data Protection Board shall inform the EU Centre of any such extension within one month of receipt of the request for consultation, together with the reasons for the delay. Where the EU Centre substantially deviates from those opinions, it shall inform the Technology Committee or the European Data Protection Board and the Commission thereof, specifying the points at which it deviated and the main reasons for the deviation.
Amendment 1705 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 3
Before including specific technologies on those lists, the EU Centre shall request the opinion of its Technology Committee and of the European Data Protection Board. The Technology Committee and the European Data Protection Board shall deliver their respective opinions within
Amendment 1706 #
Proposal for a regulation Article 50 – paragraph 1 – subparagraph 3 a (new) The EU Centre shall respect the positions and findings in the opinion provided by the European Data Protection Board before making specific technologies available.
Amendment 1707 #
Proposal for a regulation Article 50 – paragraph 2 – introductory part 2. The EU Centre shall collect, record, a
Amendment 1708 #
Proposal for a regulation Article 50 – paragraph 2 – point a (a) information obtained in the performance of its tasks under this Regulation concerning
Amendment 1709 #
Proposal for a regulation Article 50 – paragraph 2 – point a (a) information obtained in the performance of its tasks under this Regulation concerning detection, reporting, removal
Amendment 1710 #
Proposal for a regulation Article 50 – paragraph 2 – point a (a) information obtained in the performance of its tasks under this Regulation concerning
Amendment 1711 #
Proposal for a regulation Article 50 – paragraph 2 – point c (c) information resulting from research or other activities conducted by Member States’ authorities, other Union institutions, bodies, offices and agencies, the competent authorities of third countries, international organisations, research centres, hotlines and civil society organisations.
Amendment 1712 #
Proposal for a regulation Article 50 – paragraph 2 – point c a (new) (ca) information obtained in the performance of its tasks under this Regulation concerning victim assistance and support.
Amendment 1713 #
Proposal for a regulation Article 50 – paragraph 3 3. Where necessary for the performance of its tasks under this Regulation, the EU Centre shall carry out, participate in or encourage research, surveys and studies, either on its own initiative or, where appropriate and compatible with its priorities and its annual work programme, at the request of the European Parliament, the Council or the Commission. The EU Centre shall support Member States and the Coordinating Authorities in conducting research, taking into account national specificities. The collected knowledge shall serve as a tool to elaborate prevention methods adapted and implemented by Coordinating Authorities in each Member State.
Amendment 1714 #
Proposal for a regulation Article 50 – paragraph 3 a (new) 3a. The outcome of research, surveys or studies carried out or led by the EU Centre shall be made publicly available.
Amendment 1715 #
Proposal for a regulation Article 50 – paragraph 4
Amendment 1716 #
Proposal for a regulation Article 50 – paragraph 5 5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. Communication campaigns shall be easily understandable and accessible to all children, their families and educators in formal and non-formal education in the Union, aiming to improve digital literacy and ensure a safe digital environment for children. Communication campaigns shall take into account the gender dimension of the crime.
Amendment 1717 #
Proposal for a regulation Article 50 – paragraph 5 5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. Communication campaigns shall be easily understandable and accessible to all children, their families and educators in formal and non-formal education in the Union, aiming to improve digital literacy and foster a safe digital environment for children.
Amendment 1718 #
Proposal for a regulation Article 50 – paragraph 5 5. The EU Centre shall
Amendment 1719 #
Proposal for a regulation Article 51 – title Processing activities and
Amendment 1720 #
Proposal for a regulation Article 51 – paragraph 2 – point a
Amendment 1721 #
Proposal for a regulation Article 51 – paragraph 2 – point b
Amendment 1722 #
Proposal for a regulation Article 51 – paragraph 2 – point b
Amendment 1723 #
Proposal for a regulation Article 51 – paragraph 2 – point c
Amendment 1724 #
Proposal for a regulation Article 51 – paragraph 2 – point c
Amendment 1725 #
Proposal for a regulation Article 51 – paragraph 2 – point d (d) cooperating with Coordinating Authorities in accordance with Articles 20 and 21 on tasks related to
Amendment 1726 #
Proposal for a regulation Article 51 – paragraph 2 – point h
Amendment 1727 #
Proposal for a regulation Article 51 – paragraph 2 – point i
Amendment 1728 #
Proposal for a regulation Article 51 – paragraph 2 – point k (k) providing and monitoring access to the database
Amendment 1729 #
Proposal for a regulation Article 51 – paragraph 2 – point m (m) assessing and processing reports of potential
Amendment 1730 #
Proposal for a regulation Article 51 – paragraph 2 – point n
Amendment 1731 #
Proposal for a regulation Article 51 – paragraph 2 – point n (n) cooperating with Europol and partner organisations in accordance with Articles 53 and 54, including on tasks related to the identification of
Amendment 1732 #
Proposal for a regulation Article 51 – paragraph 3
Amendment 1733 #
Proposal for a regulation Article 51 – paragraph 3 a (new) 3a. Personal data referred to in paragraph 2 shall be processed under the following principles. They shall be (a) processed lawfully and fairly (‘lawfulness and fairness’); (b) collected for specified, explicit and legitimate purposes and not processed in a manner that is incompatible with those purposes (‘purpose limitation’); (c) adequate, relevant, and not excessive in relation to the purposes for which they are processed (‘data minimisation’); (d) accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’); (e) kept in a form which permits identification of data subjects for no longer than is strictly necessary for the purposes for which the personal data are processed (‘storage limitation’).
Amendment 1734 #
Proposal for a regulation Article 51 – paragraph 4
Amendment 1735 #
Proposal for a regulation Article 51 – paragraph 4 4. It shall ensure that the personal data is stored in a
Amendment 1736 #
Proposal for a regulation Article 51 a (new)
Amendment 1737 #
Proposal for a regulation Article 52 – paragraph 2 2. Contact officers shall assist in the exchange of information between the EU Centre and the Coordinating Authorities that designated them.
Amendment 1738 #
Proposal for a regulation Article 53 – paragraph 1 1. Where necessary for the performance of its tasks under this Regulation, within their respective mandates, the EU Centre shall
Amendment 1739 #
Proposal for a regulation Article 53 – paragraph 1 a (new) 1a. Europol and the EU Centre shall cooperate with NCMEC in the fight against child sexual abuse material. This cooperation may consist of sharing their databases of known child sexual abuse material.
Amendment 1740 #
Proposal for a regulation Article 53 – paragraph 2
Amendment 1741 #
Proposal for a regulation Article 53 – paragraph 2
Amendment 1742 #
Proposal for a regulation Article 53 – paragraph 2 – subparagraph 1
Amendment 1743 #
Proposal for a regulation Article 53 – paragraph 2 – subparagraph 1 Europol and the EU Centre shall provide each other with the fullest possible access to relevant information and information systems, where necessary for the performance of their respective tasks and in accordance with the acts of Union law regulating such access. Any access to personal data processed in Europol's information systems, where deemed strictly necessary for the performance of the EU Centre's tasks, shall be granted only on a case-by-case basis, upon submission of an explicit request which indicates the specific purpose and justification. Europol shall be required to diligently assess those requests and only transmit personal data to the EU Centre where strictly necessary and proportionate to the required purpose.
Amendment 1744 #
Proposal for a regulation Article 53 – paragraph 2 – subparagraph 1 Europol and the EU Centre shall provide each other with
Amendment 1745 #
Proposal for a regulation Article 53 – paragraph 2 – subparagraph 2
Amendment 1746 #
Proposal for a regulation Article 53 – paragraph 2 – subparagraph 2
Amendment 1747 #
Proposal for a regulation Article 53 – paragraph 2 – subparagraph 2
Amendment 1748 #
Proposal for a regulation Article 53 – paragraph 2 – subparagraph 2 a (new) The EU Centre shall operate independently of Europol and other law enforcement bodies.
Amendment 1749 #
Proposal for a regulation Article 53 – paragraph 2 a (new) 2a. Any transfer of personal data to Europol is governed by Regulation 2018/1725.
Amendment 1750 #
Proposal for a regulation Article 53 – paragraph 3
Amendment 1751 #
Proposal for a regulation Article 53 – paragraph 3 3. The terms of cooperation
Amendment 1752 #
Proposal for a regulation Article 53 – paragraph 3 3. The terms of cooperation and working arrangements shall be laid down in a publicly accessible memorandum of understanding.
Amendment 1753 #
Proposal for a regulation Article 53 – paragraph 3 3. The terms of cooperation and working arrangements shall be laid down in a publicly accessible memorandum of understanding.
Amendment 1754 #
Proposal for a regulation Article 54 – title Cooperation with
Amendment 1755 #
Proposal for a regulation Article 54 – paragraph 1 1. Where necessary for the performance of its tasks under this Regulation, the EU Centre may cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations acting in the public interest, hotlines and semi-public organisations.
Amendment 1756 #
Proposal for a regulation Article 54 – paragraph 1 1. Where necessary for the performance of its tasks under this Regulation, the EU Centre
Amendment 1757 #
Proposal for a regulation Article 54 – paragraph 1 1.
Amendment 1758 #
Proposal for a regulation Article 54 – paragraph 1 a (new) 1a. In particular, the cooperation with the EU Centre referred to in paragraph 1 may include the following: (a) supporting the Commission in the preparation of the guidelines referred to in Article 3(8), Article 4(5), Article 6(4) and Article 11; (b) updating the databases of indicators referred to in Article 44; (c) innovating new and existing detection technologies; (d) making technologies available to providers for the execution of detection orders issued to them, in accordance with Article 50(1).
Amendment 1759 #
Proposal for a regulation Article 54 – paragraph 2
Amendment 1760 #
Proposal for a regulation Article 54 – paragraph 2 2. The EU Centre
Amendment 1761 #
Proposal for a regulation Article 54 – paragraph 2 a (new) 2a. The EU Centre shall cooperate with other organisations and bodies carrying out similar functions in other jurisdictions, such as the National Center for Missing and Exploited Children (‘NCMEC’) and the Canadian Centre for Child Protection, among others, which serve the same purpose as this Regulation, as well as to avoid potential duplication of reporting obligations for providers.
Amendment 1762 #
Proposal for a regulation Article 55 – paragraph 1 – introductory part The administrative and management structure of the EU Centre shall be gender-balanced and comprise:
Amendment 1763 #
Proposal for a regulation Article 55 – paragraph 1 – point d a (new) (da) a Survivors Advisory Board, which shall exercise the tasks set out in Article 66a.
Amendment 1764 #
Proposal for a regulation Article 55 – paragraph 1 – point d a (new) (da) a Fundamental Rights Officer, which shall exercise the tasks set out in Article 66b;
Amendment 1765 #
Proposal for a regulation Article 55 – paragraph 1 – point d b (new) (db) an Expert's Consultative Forum, which shall exercise the tasks set out in Article 66a;
Amendment 1766 #
Proposal for a regulation Article 56 – paragraph 1 1. The Management Board shall be composed of one representative from each Member State and
Amendment 1767 #
Proposal for a regulation Article 56 – paragraph 1 1. The Management Board shall be gender-balanced and composed of one representative from each Member State and two representatives of the Commission, all as members with voting rights.
Amendment 1768 #
Proposal for a regulation Article 56 – paragraph 1 1. The Management Board shall be composed of one representative from each
Amendment 1769 #
Proposal for a regulation Article 56 – paragraph 1 – subparagraph 1 (new) One member of the Technology Committee and one member of the Survivors Advisory Board as established in Articles 66 and 66a may attend the meetings of the Management Board as observers.
Amendment 1770 #
Proposal for a regulation Article 56 – paragraph 2 – subparagraph 1
Amendment 1771 #
Proposal for a regulation Article 56 – paragraph 2 – subparagraph 2
Amendment 1772 #
Proposal for a regulation Article 56 – paragraph 2 – subparagraph 2
Amendment 1773 #
Proposal for a regulation Article 56 – paragraph 3 3. Each member of the Management Board shall have an alternate. The alternate shall represent the member in
Amendment 1774 #
Proposal for a regulation Article 56 – paragraph 3 3. Each member of the Management Board shall have an alternate. The alternate shall represent the member in
Amendment 1775 #
Proposal for a regulation Article 56 – paragraph 4 4. Members of the Management Board and their alternates shall be appointed in the light of their knowledge in the field of combating child sexual abuse, taking into account relevant managerial, administrative and budgetary skills. Member States shall appoint a representative of their Coordinating Authority, within four months of [date of entry into force of this Regulation]. All parties represented in the Management Board shall make efforts to limit turnover of their representatives, in order to ensure continuity of its work. All parties shall
Amendment 1776 #
Proposal for a regulation Article 56 – paragraph 4 4. Members of the Management Board and their alternates shall be appointed in the light of their
Amendment 1777 #
Proposal for a regulation Article 57 – paragraph 1 – point c (c) adopt rules for the prevention and
Amendment 1778 #
Proposal for a regulation Article 57 – paragraph 1 – point f (f) appoint the members of the Technology Committee, of the Children's Rights and Survivors Advisory Board and of any other advisory group it may establish;
Amendment 1779 #
Proposal for a regulation Article 57 – paragraph 1 – point f (f) appoint the members of the Technology Committee, the Expert's Consultative Forum and of any other advisory group it may establish;
Amendment 1780 #
Proposal for a regulation Article 57 – paragraph 1 – point f a (new) (fa) appoint a Data Protection Officer;
Amendment 1781 #
Proposal for a regulation Article 57 – paragraph 1 – point f b (new) (fb) appoint a Fundamental Rights Officer;
Amendment 1782 #
Proposal for a regulation Article 57 – paragraph 1 – point g
Amendment 1783 #
Proposal for a regulation Article 57 – paragraph 1 – point h a (new) (ha) consult the Survivors’ Advisory Board as regards the obligations referred to in points (a) and (h) of this Article.
Amendment 1784 #
Proposal for a regulation Article 58 – paragraph 1 – subparagraph 2 The Deputy Chairperson shall automatically replace the Chairperson
Amendment 1785 #
Proposal for a regulation Article 60 – paragraph 2 2. Each member shall have one vote. In the absence of a member,
Amendment 1786 #
Proposal for a regulation Article 61 – paragraph 1 – subparagraph 1 The Executive Board shall be gender-balanced and composed of the Chairperson and the Deputy Chairperson of the Management Board, two other members appointed by the Management Board from among its members with the right to vote and two representatives of the Commission to the Management Board. The Chairperson of the Management Board shall also be the Chairperson of the Executive Board. The composition of the Executive Board shall take into consideration gender balance, with at least 40 % of each sex.
Amendment 1787 #
Proposal for a regulation Article 61 – paragraph 1 – subparagraph 1 The Executive Board shall be composed of the Chairperson and the Deputy Chairperson of the Management Board, two other members appointed by the Management Board from among its members with the right to vote and
Amendment 1788 #
Proposal for a regulation Article 61 – paragraph 1 – subparagraph 1 The Executive Board shall be composed of the Chairperson and the Deputy Chairperson of the Management Board, two other members appointed by the Management Board from among its members with the right to vote and
Amendment 1789 #
Proposal for a regulation Article 62 – paragraph 2 – point j (j) appoint an Accounting Officer, who may be the Commission's Accounting Officer, subject to the Staff Regulations and the Conditions of Employment of other servants, who shall be totally independent in the performance of
Amendment 1790 #
Proposal for a regulation Article 62 – paragraph 2 – point p (p) authorise the conclusion of memoranda of understanding referred to in Article
Amendment 1791 #
Proposal for a regulation Article 64 – paragraph 2 2. The Executive Director shall report to the European Parliament on the performance of
Amendment 1792 #
Proposal for a regulation Article 64 – paragraph 4 – point e a (new) (ea) implementing gender mainstreaming and gender budgeting in all areas, including drafting a gender action plan (GAP);
Amendment 1793 #
Proposal for a regulation Article 64 – paragraph 4 – point f (f) preparing the Consolidated Annual Activity Report (CAAR) on the EU Centre’s activities, including the activities of the Technology Committee and the Survivors’ Advisory Board, and presenting it to the Executive Board for assessment and adoption;
Amendment 1794 #
Proposal for a regulation Article 64 – paragraph 4 – point g (g) preparing an action plan following up on the conclusions of internal or external audit reports and evaluations, as well as investigations by the European Anti-Fraud Office (OLAF) and by the European Public Prosecutor’s Office (EPPO), and reporting on progress twice a year to the Commission and the European Parliament and regularly to the Management Board and the Executive Board;
Amendment 1795 #
Proposal for a regulation Article 66 – paragraph 1 1. The Technology Committee shall consist of technical and data protection experts appointed by the Management Board in view of their excellence and their independence from corporate interests, following the publication of a call for expressions of interest in the Official Journal of the European Union. Its members shall be appointed for a term of four years, renewable once. On the expiry of their term of office, members shall remain in office until they are replaced or until their appointments are renewed. If a member resigns before the expiry of his or her term of office, he or she shall be replaced for the remainder of the term by a member appointed by the Management Board.
Amendment 1796 #
Proposal for a regulation Article 66 – paragraph 1 1. The Technology Committee shall consist of technical experts appointed by the Management Board in view of their excellence, particular expertise in upholding privacy and data protection and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union.
Amendment 1797 #
Proposal for a regulation Article 66 – paragraph 1 1. The Technology Committee shall consist of technical experts, in particular privacy and data protection experts appointed by the Management Board in view of their excellence and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union.
Amendment 1798 #
Proposal for a regulation Article 66 – paragraph 1 1. The Technology Committee shall consist of technical, privacy and data protection experts appointed by the Management Board in view of their excellence and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union.
Amendment 1799 #
Proposal for a regulation Article 66 – paragraph 1 a (new) 1a. The Technology Committee shall have equal representation in terms of gender.
Amendment 1800 #
Proposal for a regulation Article 66 – paragraph 4 4. When a member no longer meets the criteria of independence, he or she shall inform the Management Board. Alternatively, the Management Board may declare, on a proposal of at least one third of its members or of the Commission, a lack of independence and revoke the appointment of the person concerned. The Management Board shall appoint a new member for the remaining term of office in accordance with the procedure for ordinary members.
Amendment 1801 #
Proposal for a regulation Article 66 – paragraph 6 – point a
Amendment 1802 #
Proposal for a regulation Article 66 – paragraph 6 – point b (b) contribute to the EU Centre’s assistance to the Coordinating Authorities, the Management Board, the Executive Board and the Executive Director, in respect of matters related to the use of technology and data protection;
Amendment 1803 #
Proposal for a regulation Article 66 – paragraph 6 – point c (c) provide internally, upon request, expertise on matters related to the use of technology and data protection for the purposes of prevention and detection of child sexual abuse online.
Amendment 1804 #
Proposal for a regulation Article 66 – paragraph 6 a (new) 6a. (d) evaluate the effectiveness of new and existing detection technology through unknown datasets of verified indicators. (e) establish best practices on safety by design and the voluntary use of technologies, including prevention and detection technologies, as part of providers’ mitigation measures. (f) introduce a regular reviewing and reporting process to assess and share expertise on the most recent technological innovations and developments related to detection technology.
Amendment 1805 #
Proposal for a regulation Article 66 a (new)
Amendment 1806 #
Proposal for a regulation Article 66 a (new)
Amendment 1807 #
Proposal for a regulation Chapter IV – Section 5 – Part 3 a (new)
Amendment 1808 #
Proposal for a regulation Article 83 – paragraph 1 – introductory part 1. Providers of hosting services, providers of number-independent interpersonal communications services and providers of internet access services shall collect data on the following topics and make that information available to the EU Centre upon request:
Amendment 1809 #
Proposal for a regulation Article 83 – paragraph 1 – introductory part 1. Providers of hosting services, providers of number-independent interpersonal communications services and providers of internet access services shall collect data on the following topics and make that information available to the EU Centre upon request:
Amendment 1810 #
Proposal for a regulation Article 83 – paragraph 1 – introductory part 1. Providers of hosting services
Amendment 1811 #
Proposal for a regulation Article 83 – paragraph 1 – introductory part 1. Providers of hosting services, providers of interpersonal communications services
Amendment 1812 #
Proposal for a regulation Article 83 – paragraph 1 – point a
Amendment 1813 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 1 – the measures taken to comply with the order
Amendment 1814 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 2 – the
Amendment 1815 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 2 – the error rates of the technologies deployed to detect
Amendment 1816 #
Proposal for a regulation Article 83 – paragraph 1 – point a – indent 2 a (new) - including the rates of false positives and negatives, and confirmed positives and negatives
Amendment 1817 #
Proposal for a regulation Article 83 – paragraph 1 – point b (b) the number of removal orders issued to the provider in accordance with Article 14 and the average time
Amendment 1818 #
Proposal for a regulation Article 83 – paragraph 1 – point b (b) the number of removal orders issued to the provider in accordance with Article 14 and the average time needed for removing
Amendment 1819 #
Proposal for a regulation Article 83 – paragraph 1 – point b a (new) (ba) the number and duration of delays to removals as a result of requests from competent authorities or law enforcement authorities;
Amendment 1820 #
Proposal for a regulation Article 83 – paragraph 1 – point c (c) the total number of items of child sexual abuse material that the provider removed or to which it disabled access,
Amendment 1821 #
Proposal for a regulation Article 83 – paragraph 1 – point c (c) the total number of items of child sexual abuse material that the provider removed
Amendment 1822 #
Proposal for a regulation Article 83 – paragraph 1 – point c a (new) (ca) the number of instances that the provider was asked to provide additional support to law enforcement authorities in relation to content that was removed;
Amendment 1823 #
Proposal for a regulation Article 83 – paragraph 1 – point d
Amendment 1824 #
Proposal for a regulation Article 83 – paragraph 1 – point d
Amendment 1825 #
Proposal for a regulation Article 83 – paragraph 1 – point d
Amendment 1826 #
Proposal for a regulation Article 83 – paragraph 1 – point e (e) the number of instances in which the provider invoked Article
Amendment 1827 #
Proposal for a regulation Article 83 – paragraph 1 – point e (e) the number of instances in which the provider invoked Article 8(3), Article 14(5) or (6)
Amendment 1828 #
Proposal for a regulation Article 83 – paragraph 1 – point e (e) the number of instances in which the provider invoked Article 8(3), Article 14(5) or (6)
Amendment 1829 #
Proposal for a regulation Article 83 – paragraph 1 – point e a (new) (ea) Educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, where possible, including the impact, outreach and effectiveness of the activities carried out on the targeted audience, disaggregated into different categories based on demographics
Amendment 1830 #
Proposal for a regulation Article 83 – paragraph 1 – point e b (new)
Amendment 1831 #
Proposal for a regulation Article 83 – paragraph 2 – introductory part 2. The Coordinating Authorities shall collect data on the following topics and make that information
Amendment 1832 #
Proposal for a regulation Article 83 – paragraph 2 – point a – indent -1 (new) -1 the nature of the report and its key characteristics, such as whether the security of the hosting service was allegedly breached;
Amendment 1833 #
Proposal for a regulation Article 83 – paragraph 2 – point a – indent 2 – where the report led to the launch of a criminal investigation or contributed to an ongoing investigation, the state of play or outcome of the investigation, including whether the case was closed at pre-trial stage, whether the case led to the imposition of penalties, whether
Amendment 1834 #
Proposal for a regulation Article 83 – paragraph 2 – point b
Amendment 1835 #
Proposal for a regulation Article 83 – paragraph 2 – point b (b) the most important and recurrent
Amendment 1836 #
Proposal for a regulation Article 83 – paragraph 2 – point b (b) the most important and recurrent risks of online child sexual abuse, as reported by providers of hosting services and providers of number independent interpersonal communications services in accordance with Article 3 or identified through other information available to the Coordinating Authority;
Amendment 1837 #
Proposal for a regulation Article 83 – paragraph 2 – point b (b) the most important and recurrent risks of online child sexual abuse, as reported by providers of hosting services and providers of number-independent interpersonal communications services in accordance with Article 3 or identified through other information available to the Coordinating Authority;
Amendment 1838 #
Proposal for a regulation Article 83 – paragraph 2 – point c
Amendment 1839 #
Proposal for a regulation Article 83 – paragraph 2 – point c (c) a list of the providers of hosting services and providers of number-independent interpersonal communications services to which the Coordinating Authority addressed a
Amendment 1840 #
Proposal for a regulation Article 83 – paragraph 2 – point c (c) a list of the providers of hosting services and providers of number-independent interpersonal communications services to which the Coordinating Authority addressed a detection order in accordance with Article 7;
Amendment 1841 #
Proposal for a regulation Article 83 – paragraph 2 – point c (c) a list of the providers of hosting services and providers of number independent interpersonal communications services to which the Coordinating Authority addressed a detection order in accordance with Article 7;
Amendment 1842 #
Proposal for a regulation Article 83 – paragraph 2 – point d
Amendment 1843 #
Proposal for a regulation Article 83 – paragraph 2 – point f (f) the number of removal orders issued in accordance with Article 14, broken down by provider, the time needed to remove or disable access to the item or items of child sexual abuse material concerned, including the time it took the Coordinating Authority to process the order, and the number of instances in which the provider invoked Article 14(5) and (6);
Amendment 1844 #
Proposal for a regulation Article 83 – paragraph 2 – point f (f) the number of removal orders
Amendment 1845 #
Proposal for a regulation Article 83 – paragraph 2 – point f (f) the number of removal orders issued in accordance with Article 14, broken down by provider, the time needed to remove
Amendment 1846 #
Proposal for a regulation Article 83 – paragraph 2 – point g
Amendment 1847 #
Proposal for a regulation Article 83 – paragraph 2 – point g
Amendment 1848 #
Proposal for a regulation Article 83 – paragraph 2 – point g
Amendment 1849 #
Proposal for a regulation Article 83 – paragraph 2 – point i
Amendment 1850 #
Proposal for a regulation Article 83 – paragraph 2 – point i a (new) (ia) the measures taken regarding prevention and victim assistance programmes, including the number of children in primary education who are taking part in awareness-raising campaigns and education programmes about the risks of all forms of sexual exploitation of children, including in the online environment.
Amendment 1851 #
Proposal for a regulation Article 83 – paragraph 3 – introductory part 3. The EU Centre shall collect data and generate statistics on the
Amendment 1852 #
Proposal for a regulation Article 83 – paragraph 3 – introductory part 3. The EU Centre shall collect data and generate statistics on the detection, reporting, removal of or disabling of access to online child sexual abuse under this Regulation. The data shall
Amendment 1853 #
Proposal for a regulation Article 83 – paragraph 3 – introductory part 3. The EU Centre shall collect data and generate statistics on the detection, reporting, removal of
Amendment 1854 #
Proposal for a regulation Article 83 – paragraph 3 – point a
Amendment 1855 #
Proposal for a regulation Article 83 – paragraph 3 – point a (a) the number of indicators in the databases of indicators referred to in Article 44 and the
Amendment 1856 #
Proposal for a regulation Article 83 – paragraph 3 – point b (b) the number of submissions of child sexual abuse material
Amendment 1857 #
Proposal for a regulation Article 83 – paragraph 3 – point b (b) the number of submissions of child sexual abuse material
Amendment 1858 #
Proposal for a regulation Article 83 – paragraph 3 – point c (c) the total number of reports submitted to the EU Centre in accordance with Article 12, broken down by provider of hosting services and provider of number-independent interpersonal communications services that submitted the report and by Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3);
Amendment 1859 #
Proposal for a regulation Article 83 – paragraph 3 – point c (c) the total number of reports submitted to the EU Centre in accordance with Article 12, broken down by provider of hosting services and provider of number-independent interpersonal communications services that submitted the report and by Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3);
Amendment 1860 #
Proposal for a regulation Article 83 – paragraph 3 – point c (c) the total number of reports submitted to the EU Centre in accordance
Amendment 1861 #
Proposal for a regulation Article 83 – paragraph 3 – point c a (new) (ca) the total number of reports forwarded to Europol in accordance with Article 48, as well as the number of access requests received from Europol under Article 46(4) and 46(5), including the number of those requests granted and refused by the EU Centre.
Amendment 1862 #
Proposal for a regulation Article 83 – paragraph 3 – point d (d) the online child sexual abuse to which the reports relate, including the number of items of potential
Amendment 1863 #
Proposal for a regulation Article 83 – paragraph 3 – point d (d) the online child sexual abuse to which the reports relate, including the number of items of potential known
Amendment 1864 #
Proposal for a regulation Article 83 – paragraph 3 – point e (e) the number of reports that the EU Centre considered unfounded or manifestly unfounded, as referred to in Article 48(2);
Amendment 1865 #
Proposal for a regulation Article 83 – paragraph 3 – point f
Amendment 1866 #
Proposal for a regulation Article 83 – paragraph 3 – point f (f) the number of reports relating to potential
Amendment 1867 #
Proposal for a regulation Article 83 – paragraph 3 – point g
Amendment 1868 #
Proposal for a regulation Article 83 – paragraph 3 – point h (h) where materially the same item of potential child sexual abuse material was reported more than once to the EU Centre in accordance with Article 12 or detected more than once through the searches in accordance with Article 49(1), the number of times that that item was reported or detected in that manner.
Amendment 1869 #
Proposal for a regulation Article 83 – paragraph 3 – point j (j) the number of victims of online child sexual abuse assisted by the EU Centre pursuant to Article 21(2), and the number of these victims that requested to receive such assistance in a manner accessible to them due to disabilities.
Amendment 1870 #
Proposal for a regulation Article 83 – paragraph 3 – point j (j) number of
Amendment 1871 #
Proposal for a regulation Article 83 – paragraph 3 – point j a (new) (ja) the measures taken by Member States regarding prevention, awareness raising, and victim assistance programmes, including the impact, outreach and effectiveness of the activities carried out on the targeted audience, where possible, disaggregated into different categories based on demographics and including best practices and lessons learned of prevention programmes.
Amendment 1872 #
Proposal for a regulation Article 83 – paragraph 4 4. The providers of hosting services
Amendment 1873 #
Proposal for a regulation Article 83 – paragraph 4 4. The providers of hosting services, providers of number-independent interpersonal communications services and providers of internet access services, the Coordinating Authorities and the EU Centre shall ensure that the data referred to in paragraphs 1, 2 and 3, respectively, is stored no longer than is necessary for the transparency reporting referred to in Article 84. The data stored shall not contain any personal data.
Amendment 1874 #
Proposal for a regulation Article 83 – paragraph 4 4. The providers of hosting services, providers of number independent interpersonal communications services and providers of internet access services, the Coordinating Authorities and the EU Centre shall ensure that the data referred to in paragraphs 1, 2 and 3, respectively, is stored no longer than is necessary for the transparency reporting referred to in Article 84. The data stored shall not contain any personal data.
Amendment 1875 #
Proposal for a regulation Article 83 – paragraph 4 4. The providers of hosting services
Amendment 1876 #
Proposal for a regulation Article 83 – paragraph 5 5. They shall ensure that the data is stored in a secure manner and that the storage is subject to appropriate technical and organisational safeguards. Those safeguards shall ensure, in particular, that the data can be accessed and processed only for the purpose for which it is stored, that a high level of security is achieved and that the information is deleted when no longer necessary for that purpose. All access to this data shall be logged and the logs securely stored for five years. They shall regularly review those safeguards and adjust them where necessary.
Amendment 1877 #
Proposal for a regulation Article 84 – paragraph 1 1. Each provider of relevant information society services shall draw up an annual report on its activities under this Regulation. That report shall compile the information referred to in Article 83(1). The providers shall, by 31 January of every year subsequent to the year to which the report relates, make the report available to the public in a machine-readable format and communicate it to the Coordinating Authority of establishment, the Commission and the EU Centre.
Amendment 1878 #
Proposal for a regulation Article 84 – paragraph 1 1. Each provider of relevant information society services shall draw up an annual report on its activities under this Regulation. That report shall compile the information referred to in Article 83(1). The providers shall, by 31
Amendment 1879 #
Proposal for a regulation Article 84 – paragraph 1 a (new) 1a. The annual report shall also include the following information: (a) the number and subject matter of detection orders and removal orders to act against alleged online child sexual abuse and the number of notifications received in accordance with Article 32 and the effects given to those orders; (b) the number of notifications and requests received pursuant to Articles 8a and 35a and an overview of their follow-up; (c) information on the effectiveness of the different technologies used and on the false positive and false negative rates of those technologies, as well as statistics on appeals and the effect they have on the users of its services and information on the effectiveness of the measures and obligations under Articles 3, 4, 5 and 7; (d) information on the tools used by the provider to become aware of the reported online child sexual abuse, including data and aggregate statistics on how technologies used by the provider work.
Amendment 1880 #
Proposal for a regulation Article 84 – paragraph 5 5. The annual transparency reports referred to in paragraphs 1, 2 and 3 shall not include any information that may prejudice ongoing activities for the assistance to
Amendment 1881 #
Proposal for a regulation Article 85 – paragraph 1 1. By [
Amendment 1882 #
Proposal for a regulation Article 85 – paragraph 2 2. By [
Amendment 1883 #
Proposal for a regulation Article 86 – paragraph 2 2. The power to adopt delegated acts referred to in Articles 3, 8, 13, 14, 17, 47 and 84 shall be conferred on the Commission for a
Amendment 1884 #
Proposal for a regulation Article 89 – paragraph 2 It shall apply from
Amendment 1885 #
Proposal for a regulation Article 89 – paragraph 3 This Regulation shall be binding in its entirety and directly applicable in all Member States. As from August 2024, if there is no entry into force of the proposed regulation, the regime in place shall be the one of the interim derogation, until such adoption is envisaged but no later than January 2025.
Amendment 1887 #
Proposal for a regulation Annex I – title DETECTION
Amendment 1888 #
Proposal for a regulation Annex I – title DETECTION
Amendment 1889 #
Proposal for a regulation Annex I – Section 1 – paragraph 2 – introductory part Name of the competent judicial authority
Amendment 1890 #
Proposal for a regulation Annex I – Section 4 – paragraph 2 – point 2
Amendment 1891 #
Proposal for a regulation Annex I – Section 4 – paragraph 2 – point 2
Amendment 1892 #
Proposal for a regulation Annex I – Section 4 – paragraph 2 – point 3
Amendment 1893 #
Proposal for a regulation Annex I – Section 4 – paragraph 2 – point 3
Amendment 1894 #
Proposal for a regulation Annex I – Section 4 – paragraph 3 Where the detection order concerns the solicitation of children, in accordance with Article 7(7), last subparagraph, of the Regulation, the detection order applies only to publicly available number independent interpersonal communications where one of the users is a child user, as defined in Article 2, point (i), of the Regulation.
Amendment 1895 #
Proposal for a regulation Annex II – title TEMPLATE FOR INFORMATION ABOUT THE IMPOSSIBILITY TO EXECUTE THE DETECTION
Amendment 1896 #
Proposal for a regulation Annex II – title TEMPLATE FOR INFORMATION ABOUT THE IMPOSSIBILITY TO EXECUTE THE DETECTION
Amendment 1897 #
Proposal for a regulation Annex III – Section 2 – point 2 – point 2
Amendment 1898 #
Proposal for a regulation Annex III – Section 2 – point 2 – point 2
Amendment 1899 #
Amendment 1900 #
Proposal for a regulation Annex III – Section 2 – point 2 – point 3
Amendment 1901 #
Proposal for a regulation Annex III – Section 2 – point 3 – introductory part 3)
Amendment 1902 #
Proposal for a regulation Annex III – Section 2 – point 3 – introductory part 3) Content data related to the reported potential online child sexual abuse, including images
Amendment 1903 #
Proposal for a regulation Annex III – Section 2 – point 4
Amendment 1904 #
Amendment 1905 #
Proposal for a regulation Annex III – Section 2 – point 4
Amendment 1906 #
Proposal for a regulation Annex VII
Amendment 1907 #
Proposal for a regulation Annex VII
Amendment 1908 #
Proposal for a regulation Annex VIII
Amendment 1909 #
Proposal for a regulation Annex VIII
Amendment 277 #
Proposal for a regulation – The European Parliament rejects the Commission proposal (COM(2022)0209).
Amendment 278 #
Proposal for a regulation – The European Parliament rejects the Commission proposal (COM(2022)0209).
Amendment 279 #
Proposal for a regulation – The European Parliament rejects the Commission proposal (COM(2022)0209).
Amendment 280 #
Proposal for a regulation – The European Parliament rejects the Commission proposal (COM(2022)0209).
Amendment 281 #
Proposal for a regulation Title 1 Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to
Amendment 282 #
Proposal for a regulation Citation 1 Having regard to the Treaty on the Functioning of the European Union, and in particular Article 16 and Article 114 thereof,
Amendment 283 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that often cause long-lasting negative consequences on victims that need to be prevented and combated effectively in order to protect children’s rights and well-
Amendment 284 #
Proposal for a regulation Recital 1 (1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well- being, as is required under the Charter of Fundamental Rights of the European Union
Amendment 285 #
Proposal for a regulation Recital 1 a (new) (1a) When using artificial intelligence algorithms on images, it is well documented that bias and discrimination can occur due to the lack of representativeness of certain population groups in the data used to train the algorithm. These biases should be identified, measured and eradicated in order for the detection systems to be truly beneficial to society as a whole.
Amendment 286 #
Proposal for a regulation Recital 1 a (new) (1a) It should be specified that over 60 % of the images of child sexual abuse circulating in the world are hosted in the EU, and that 1 out of 5 children in Europe are victims of sexual violence and abuse, reflecting the urgent need for rules to be laid down in the EU.
Amendment 287 #
Proposal for a regulation Recital 1 b (new) (1b) The use of end-to-end encryption should be promoted and, where necessary, be mandatory in accordance with the principles of security and privacy by design. Member States should not impose any obligation on encryption providers, on providers of relevant information society services or on any other organisations with regard to any level of the supply chain that would result in the weakening of the security of their networks and services, such as bypassing authentication and accessing encrypted data or creating deliberate weaknesses by providers to allow for access to encrypted data.
Amendment 288 #
Proposal for a regulation Recital 1 c (new) (1c) End-to-end encryption is an important tool to guarantee the security and confidentiality of communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or weakening end-to-end encryption.
Amendment 289 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services. Considering the importance of the right to privacy, including the protection of personal data, as guaranteed by the Charter of Fundamental Rights, nothing in this Regulation should be interpreted in a way that would enable future broad-based mass surveillance.
Amendment 290 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being
Amendment 291 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures t
Amendment 292 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual
Amendment 293 #
Proposal for a regulation Recital 2 (2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable, reliable and tangible measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent
Amendment 294 #
Proposal for a regulation Recital 2 a (new) (2a) A long-term solution should be envisaged, within a proportionate legal framework in which automated technology should be used to detect, in a secure manner, online sexual exploitation and abuse.
Amendment 295 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In
Amendment 296 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse and more generally to safeguard minors online , in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which sometimes diverge, may have a
Amendment 297 #
Proposal for a regulation Recital 3 (3) Member States and regional authorities are increasingly introducing, or are considering introducing, national and regional laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, may have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.
Amendment 298 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws,
Amendment 299 #
Proposal for a regulation Recital 3 (3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, may have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.
Amendment 300 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned.
Amendment 301 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is effective, proportionate and coherent with national legislation and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology-neutral and future-
Amendment 302 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is effective, well targeted and proportionate and that respects the fundamental rights and privacy of all parties concerned. In view of the fast- changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology-neutral and future-
Amendment 303 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is demonstrably and durably effective and that respects the fundamental rights of all parties concerned. In view of the fast- changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in technology-neutral and future-
Amendment 304 #
Proposal for a regulation Recital 4 (4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent
Amendment 305 #
Proposal for a regulation Recital 4 a (new)
Amendment 306 #
Proposal for a regulation Recital 4 a (new) (4a) Protecting children online should not preclude respect for user privacy. The proposal should not impose a general control requirement but one of control in specific cases, based on the risk involved.
Amendment 307 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers
Amendment 308 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services
Amendment 309 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting are equally at risk of misuse, they should also be covered by this Regulation. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate manner without lowering child protection standards.
Amendment 310 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services that
Amendment 311 #
Proposal for a regulation Recital 5 (5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available number independent interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting are equally at risk of misuse, they should also be covered by this Regulation. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate manner.
Amendment 312 #
Proposal for a regulation Recital 6 (6) Online child sexual abuse frequently involves the misuse of information society services offered in the Union by providers established in third
Amendment 313 #
Proposal for a regulation Recital 6 (6) Online child sexual abuse
Amendment 314 #
Proposal for a regulation Recital 7 (7) This Regulation should be without
Amendment 315 #
Proposal for a regulation Recital 9
Amendment 316 #
Proposal for a regulation Recital 9 (9) Article 15(1) of Directive 2002/58/EC allows Member States to adopt legislative measures to restrict the scope of the rights and obligations provided for in certain specific provisions of that Directive relating to the confidentiality of communications when such restriction constitutes a necessary, appropriate and proportionate measure within a democratic society, inter alia, to prevent, investigate, detect and prosecute criminal offences, provided certain conditions are met, including compliance with the Charter, which, inter alia, requires the specific measures to be provided for by law and genuinely achieve objectives of general interest. Applying the requirements of that provision by analogy, this Regulation should limit the exercise of the rights and obligations provided for in Articles 5(1), (3) and 6(1) of Directive 2002/58/EC, insofar as strictly necessary in line with Article 52 of the Charter, to execute detection orders issued in accordance with this Regulation with a view to prevent and combat online child sexual abuse.
Amendment 317 #
Proposal for a regulation Recital 9 a (new) (9a) Encryption, and especially end-to-end encryption, is an increasingly important tool to guarantee the security and confidentiality of the communications of all users, including children. Any restrictions or undermining of any kind of encryption, de jure or de facto, can be used and abused by malicious third parties. Nothing in this Regulation should be interpreted as prohibiting providers of information society services from using any kind of encryption on any part of their services, or as restricting, undermining or bypassing such encryption in the sense of being detrimental to users’ expectations of confidential and secure communication services. Providers of information society services should under no circumstances be prevented from providing their services using the highest standards of encryption, considering that such encryption is essential for trust in and security of the digital services.
Amendment 318 #
Proposal for a regulation Recital 11 (11) A substantial connection to the Union should be considered to exist where the relevant information society service has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more
Amendment 319 #
Proposal for a regulation Recital 11 (11) A substantial connection to the Union should be considered to exist where the relevant information society service has an establishment in the Union or, in its absence, on the basis of the existence of a significant number, in relation to population size, of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of a software application in the relevant national software application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1), point (c), of Regulation (EU) No 1215/2012 of the European Parliament and of the Council. Mere technical accessibility of a website from the Union
Amendment 320 #
Proposal for a regulation Recital 12 (12) For reasons of consistency and technological neutrality, the term ‘child sexual abuse material’ should for the purpose of this Regulation be defined as referring to any
Amendment 321 #
Proposal for a regulation Recital 13 (13) The term ‘online child sexual abuse’ should cover
Amendment 322 #
Proposal for a regulation Recital 13 (13) The term ‘online child sexual abuse’ should cover not only the dissemination of material previously detected and confirmed as constituting
Amendment 323 #
Proposal for a regulation Recital 13 a (new) (13a) In order to protect children, this Regulation should take into account the concerning hypersexualized use of children's images in advertising campaigns and the increasing spread of cultural pseudo-pedophilia, also fuelled by fundraising campaigns.
Amendment 324 #
Proposal for a regulation Recital 14 (14) With a view to minimising the risk that their services are misused for the dissemination of known or new child sexual abuse material or the solicitation of children, providers of hosting services and providers of publicly available number-independent interpersonal communications services should assess
Amendment 325 #
Proposal for a regulation Recital 14 (14) With a view to minimising the risk that their services are misused for the dissemination of known or new child sexual abuse material or the solicitation of children, providers of hosting services and providers of publicly available interpersonal communications services should assess such risk for each of the services that they offer in the Union. To guide their risk assessment, a non-
Amendment 326 #
Proposal for a regulation Recital 14 (14) With a view to minimising the risk that their services are misused for the dissemination of known or new child sexual abuse material or the solicitation of children, providers of hosting services and providers of publicly available number independent interpersonal communications services should assess such risk for each of the services that they offer in the Union. To guide their risk assessment, a non- exhaustive list of elements to be taken into account should be provided. To allow for a full consideration of the specific characteristics of the services they offer, providers should be allowed to take account of additional elements where relevant. As risks evolve over time, in function of developments such as those related to technology and the manners in which the services in question are offered and used, it is appropriate to ensure that the risk assessment is updated regularly and
Amendment 327 #
Proposal for a regulation Recital 14 (14) With a view to minimising the risk that their services are misused for the dissemination of known
Amendment 328 #
Proposal for a regulation Recital 14 a (new) (14a) Given the severity of these crimes and the long-lasting negative consequences on the victims and the risk of revictimization as a result of the dissemination of known material, new material, as well as activities constituting the solicitation of children, it is essential that this Regulation provides specific obligations for providers of hosting service and providers of interpersonal communication services to prevent, detect, report and remove child sexual abuse material in all their services, including interpersonal communications services, which may also be covered by end-to-end encryption, in light of the prevalence of dissemination of child sexual abuse material, including the solicitation of children, in interpersonal communication services.
Amendment 329 #
Proposal for a regulation Recital 14 a (new) (14a) Given the severity of these crimes and the long-lasting negative consequences on the victims and the risk of revictimization as a result of the dissemination of known material, new material, as well as activities constituting the solicitation of children, it is essential that this Regulation provides specific obligations for providers of hosting services and providers of interpersonal communication services to prevent, detect, report, remove child sexual abuse material in all their services, including interpersonal communication services, which may also be covered by end-to-end encryption, in light of the prevalence of dissemination of child sexual abuse material, including the solicitation of children, in interpersonal communication services.
Amendment 330 #
Proposal for a regulation Recital 15 (15) Some of those providers of relevant information society services in scope of this Regulation may also be subject to an obligation to conduct a risk assessment under Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] with respect to information that they store and disseminate to the public
Amendment 331 #
Proposal for a regulation Recital 15 (15) Some of those providers of relevant information society services in scope of this Regulation may also be subject to an obligation to conduct a risk assessment under Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] with respect to information that they store and disseminate to the public. For the purposes of the present Regulation and in order to avoid unnecessary burdens and duplications, especially for SMEs, those providers may draw on such a risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse, as required by this Regulation.
Amendment 332 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU)
Amendment 333 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take effective and reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU)
Amendment 334 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable specific measures to mitigate
Amendment 335 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification
Amendment 336 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively,
Amendment 337 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive
Amendment 338 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number independent interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
Amendment 339 #
Proposal for a regulation Recital 16 (16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number independent interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child,
Amendment 340 #
Proposal for a regulation Recital 16 a (new) (16a) To further prevent online child sexual abuse effectively, an emphasis should be placed on public awareness raising, including through easily understandable campaigns and in education, with a focus on empowering young people to use the internet safely and on addressing societal factors that enable child sexual abuse, including harmful gender norms and broader issues of societal inequality. In addition, awareness raising should focus on hotlines where young people can report what has happened to them, as well as on improving access to institutional reporting by police, social services and other authorities.
Amendment 341 #
Proposal for a regulation Recital 16 a (new) (16a) Risk prevention should include the possibility to use approved technology that detects sexual content while respecting fundamental rights safeguards, including in end-to-end encrypted environments.
Amendment 342 #
Proposal for a regulation Recital 17 (17)
Amendment 343 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory
Amendment 344 #
Proposal for a regulation Recital 17 (17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect and prevent online child sexual abuse in their services and to indicate, as part of the risk reporting, their willingness and preparedness to eventually be issued a detection order under this Regulation, if deemed necessary by the competent national authority.
Amendment 345 #
Proposal for a regulation Recital 17 a (new) (17a) While age verification tools may be one possible method of mitigating risk, many currently known age verification methods create a risk of systemic violations of privacy and data protection. This includes, inter alia, the mass profiling of users, the biometric analysis of the user’s face and/or voice, or the deployment of a digital identification/certification system, none of which currently respects individuals’ fundamental rights sufficiently to justify its large-scale or mandatory deployment. Implementation of any of these measures by the providers of communication services would necessarily add another layer of interference with the rights and freedoms of the users, or unduly restrict access to services for people who appear younger or older than their actual age or people who do not have the necessary identification documents. As such, methods to verify or assess the age of users should not be mandatory and should, if used, be approached with caution and allow for alternatives, to ensure the protection of the rights to privacy and data protection of all internet users in line with the GDPR, and to ensure that it remains possible for law-abiding internet users to remain anonymous.
Amendment 346 #
Proposal for a regulation Recital 17 b (new) (17b) Relying on providers for risk mitigation measures comes with inherent risks, as business models, technologies and crimes evolve continuously. As a result, clear targets, oversight, review and adaptation, led by national supervisory authorities are needed, to avoid measures becoming redundant, disproportionate, ineffective, counterproductive and outdated.
Amendment 347 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on
Amendment 348 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available number independent interpersonal communications services should, when designing and implementing the mitigation measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the financial and technological capabilities and the size of the provider concerned. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such
Amendment 349 #
Proposal for a regulation Recital 18 (18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available number independent interpersonal communications services should, when designing and implementing the mitigation measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the financial and technological capabilities and the size of the provider concerned. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such as those based on industry best practices,
Amendment 350 #
Proposal for a regulation Recital 19
Amendment 351 #
Amendment 352 #
Proposal for a regulation Recital 20
Amendment 353 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when
Amendment 354 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, and where the provider refuses to cooperate with Coordinating Authorities and the Centre, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders as a last resort. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards.
Amendment 355 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse,
Amendment 356 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders. Such orders should not apply to end-to-end encryption services. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services.
Amendment 357 #
Proposal for a regulation Recital 20 (20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection
Amendment 358 #
Proposal for a regulation Recital 20 a (new) (20a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore prohibit or weaken end-to-end encryption, or be interpreted in that way.
Amendment 359 #
Proposal for a regulation Recital 20 b (new) (20b) The use of end-to-end encryption should be promoted and, where necessary, be mandatory in accordance with the principles of security and privacy by design. Member States should not impose any obligation on encryption providers, on providers of electronic communications services or on any other organisations, at any level of the supply chain, that would result in the weakening of the security of their networks and services, such as the creation or facilitation of backdoors or any other functionality allowing disclosure of communications content to third parties.
Amendment 360 #
Proposal for a regulation Recital 20 c (new) (20c) The act of breaking encryption refers to the act of defeating or bypassing the encryption protocol used to secure a communication. Any access by any third party that was not meant to access, read or edit the content of a communication that was supposed to be private and secure should be considered as undermining encryption.
Amendment 361 #
Proposal for a regulation Recital 20 d (new) (20d) The technologies used for the purpose of executing detection warrants should be in accordance with the state of the art in the industry and should be the least privacy-intrusive, including with regard to the principle of data protection by design and by default pursuant to Regulation (EU) 2016/679.
Amendment 362 #
Proposal for a regulation Recital 21
Amendment 363 #
Proposal for a regulation Recital 21 (21) Furthermore, as parts of those limits and safeguards, detection orders should
Amendment 364 #
Proposal for a regulation Recital 21 (21) Furthermore, as parts of those limits and safeguards, detection
Amendment 365 #
Proposal for a regulation Recital 21 (21) Furthermore, as parts of those limits and safeguards, detection orders should only be issued after a diligent and objective assessment leading to the finding of a significant risk of the specific service concerned being misused for a given type of online child sexual abuse covered by this Regulation. Such detection orders should as far as possible be restricted and specified, not calling for mass detection. One of the elements to be taken into account in this regard is the likelihood that the service is used to an appreciable extent, that is, beyond isolated and relatively rare instances, for such abuse. The criteria should vary so as to take account of the different characteristics of the various types of online child sexual abuse at stake and of the different characteristics of the services used to engage in such abuse, as well as the related different degree of intrusiveness of the measures to be taken to execute the detection order.
Amendment 366 #
Proposal for a regulation Recital 21 (21) Furthermore, as parts of those
Amendment 367 #
Proposal for a regulation Recital 22
Amendment 368 #
Proposal for a regulation Recital 22 (22) However, the
Amendment 369 #
Proposal for a regulation Recital 22 (22) However, the finding of such
Amendment 370 #
Proposal for a regulation Recital 22 (22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority
Amendment 371 #
Proposal for a regulation Recital 23
Amendment 372 #
Proposal for a regulation Recital 23 (23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted, has quantifiable targets, is limited in time and is specified so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively a
Amendment 373 #
Proposal for a regulation Recital 23 (23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is
Amendment 374 #
Proposal for a regulation Recital 23 (23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted and specified so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available number independent interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of
Amendment 375 #
Proposal for a regulation Recital 24
Amendment 376 #
Proposal for a regulation Recital 24 (24) The competent judicial authority
Amendment 377 #
Proposal for a regulation Recital 25
Amendment 378 #
Proposal for a regulation Recital 25
Amendment 379 #
Proposal for a regulation Recital 26
Amendment 380 #
Proposal for a regulation Recital 26 (26) Th
Amendment 381 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation.
Amendment 382 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available number independent interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral,
Amendment 383 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of
Amendment 384 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Nothing in this Regulation should therefore be interpreted as prohibiting end-to-end encryption or making it impossible. When
Amendment 385 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to
Amendment 386 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those
Amendment 387 #
Proposal for a regulation Recital 26 (26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders
Amendment 388 #
Proposal for a regulation Recital 26 a (new) (26a) Encryption is important to ensure the enjoyment of all human rights offline and online. Moreover, encryption technologies contribute in a fundamental way both to the respect for private life and confidentiality of communications, as well as to innovation and the growth of the digital economy, which relies on the high level of trust and confidence that such technologies provide. In the context of interpersonal communications, end-to-end encryption (‘E2EE’) is a crucial tool for ensuring the confidentiality of electronic communications, as it provides strong technical safeguards against access to the content of the communications by anyone other than the sender and the recipient(s), including by the provider. It should be noted that while E2EE is one of the most commonly used security measures in the context of electronic communications, other technical solutions (e.g., the use of other cryptographic schemes) might be or become equally important to secure and protect the confidentiality of digital communications. Thus, their use should not be prevented, circumvented or weakened either.
Amendment 389 #
Proposal for a regulation Recital 26 a (new) (26a) End-to-end encryption is an essential tool to guarantee the security, privacy and confidentiality of the communications between users, including those of children. Any weakening of the effect of end-to-end encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. The processing of any data that would compromise or put at risk the integrity and confidentiality of such end-to-end encrypted content should be understood as compromising that integrity and confidentiality. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third-party actors with access to the end-to-end encrypted content and communications.
Amendment 390 #
Proposal for a regulation Recital 26 a (new) (26a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of end-to-end encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or weakening end-to-end encryption. However, to the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, providers should be authorised by the competent judicial authority or another independent administrative authority to process metadata that can detect suspicious patterns of behaviour without having access to the content of the encrypted communication.
Amendment 391 #
Proposal for a regulation Recital 26 a (new) (26a) End-to-end encryption is vital for the security and privacy of the communications of users. The detection obligations set out in this Regulation should therefore not apply to end-to-end encryption services, since they would risk jeopardising the integrity of such services. Consequently, the encryption should remain confidential, without any side-channel leak mechanism built in by the service providers, which would endanger the privacy of users.
Amendment 392 #
Proposal for a regulation Recital 26 a (new) (26a) The act of ‘breaking’ encryption refers to the act of defeating or bypassing the encryption protocol used to secure a communication. Any access by any third party that was not meant to access, read or edit the content of a communication that was supposed to be private and secure should be considered as bypassing encryption.
Amendment 393 #
Proposal for a regulation Recital 26 b (new) (26b) The principle of data protection by design and by default laid down in Article 25 of Regulation (EU) 2016/679 applies to the technologies regulated by the Proposal by virtue of law.
Amendment 394 #
Proposal for a regulation Recital 27
Amendment 395 #
Proposal for a regulation Recital 27 (27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to providers
Amendment 396 #
Proposal for a regulation Recital 27 (27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to providers detection technologies that they may choose to use, on a free-of-charge basis, for the sole purpose of executing the detection orders addressed to them. The European Data Protection Board should be consulted on the use of those technologies and the ways in which they should be best deployed to ensure compliance with applicable rules of Union law on the protection of personal data. The advice of the European Data Protection Board should be taken into account by the EU Centre when compiling the lists of available technologies and also by the Commission when preparing guidelines regarding the
Amendment 397 #
Proposal for a regulation Recital 27 (27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to providers detection technologies that they may choose to use, on a free-of-charge basis, for the sole purpose of executing the detection orders addressed to them. The European Data Protection Board
Amendment 398 #
Proposal for a regulation Recital 27 a (new) (27a) Since the consultation of the EDPB by the EU Centre is a new task not foreseen under Regulation (EU) 2016/679, Regulation (EU) 2018/1725 or Directive (EU) 2016/680, the EDPB budget and staffing should be adapted accordingly. The situation of national authorities, which will also be regularly consulted by service providers, should likewise reflect their increased responsibilities.
Amendment 399 #
Proposal for a regulation Recital 27 a (new) (27a) The Commission shall ensure in the draft general budget of the Union that the European Data Protection Board and European Data Protection Supervisor are provided with sufficient human, technical and financial resources, premises and infrastructure necessary for the effective performance of their tasks and the exercise of their powers pursuant to this Regulation.
Amendment 400 #
Proposal for a regulation Recital 27 a (new) (27a) Due to the nature of child sexual abuse material, the sharing of such content does not stop at borders. The competent authorities and the EU Centre should therefore have a cooperation procedure with the American NCMEC (the National Center for Missing and Exploited Children) to detect and remove such content more effectively.
Amendment 401 #
Proposal for a regulation Recital 27 a (new) (27a) To the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, it should be possible for the Coordinating Authority of establishment to authorise providers to process metadata.
Amendment 402 #
Proposal for a regulation Recital 28
Amendment 403 #
Proposal for a regulation Recital 28 (28) With a view to constantly assess the performance of the detection technologies and ensure that they are sufficiently
Amendment 404 #
Proposal for a regulation Recital 28 (28) With a view to constantly assess the performance of the detection technologies and ensure that they are sufficiently reliable,
Amendment 405 #
Proposal for a regulation Recital 28 (28) With a view to constantly assess the performance of the detection technologies and ensure that they are sufficiently reliable, as well as to identify false positives and avoid to the extent possible erroneous reporting to the EU Centre, providers should ensure human oversight and, where necessary, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular assessment of the rates of false negatives and positives generated by the technologies, based on an analysis of anonymised representative data samples. In particular where the detection of the solicitation of children in publicly available interpersonal communications is concerned, service providers should ensure regular, specific and detailed human oversight and human verification of conversations identified by the technologies as involving potential
Amendment 406 #
Proposal for a regulation Recital 28 (28) With a view to constantly assess the performance of the detection technologies and ensure that they are sufficiently reliable, as well as to identify false positives and avoid to the extent possible erroneous reporting to the EU Centre, providers should ensure human oversight and, where necessary, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular assessment of the rates of false negatives and positives generated by the technologies, based on an analysis of anonymised representative data samples. In particular where the detection of the solicitation of children in publicly available number-independent interpersonal communications is concerned, service providers should ensure regular, specific and detailed human oversight and human verification of conversations identified by the technologies as involving potential solicitation of children.
Amendment 407 #
Proposal for a regulation Recital 29 (29) Providers of hosting services and providers of publicly available interpersonal communications services are uniquely positioned to detect potential online child sexual abuse involving their services. The information that they may obtain when offering their services is often indispensable to effectively investigate and prosecute child sexual abuse offences. Therefore, they should be required to report on potential online child sexual abuse on their services, whenever they become aware of it, that is, when there are reasonable grounds to believe that a particular activity may constitute online child sexual abuse. In such a case, hosting providers and providers of publicly available interpersonal communication services should be required to secure the disclosed child sexual abuse material and any metadata they hold about that material, including metadata which may indicate the author of the file, the time and circumstances of its creation and the modifications made. Where such reasonable grounds exist, doubts about the potential victim’s age should not prevent those providers from submitting reports. In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. Such awareness could, for example, be obtained through the execution of detection orders, information flagged by users or organisations acting in the public interest against child sexual abuse, or activities conducted on the providers’ own initiative. Those providers should report a minimum of information, as specified in this Regulation, for competent law enforcement authorities to be able to assess whether to initiate an investigation, where relevant, and should ensure that the reports are as complete as possible before submitting them. In the event of an investigation, providers should provide any electronic evidence in their possession, as indicated above, upon request by law enforcement authorities.
Amendment 408 #
Proposal for a regulation Recital 29 (29) Providers of hosting services and providers of publicly available number-independent interpersonal communications services
Amendment 409 #
Proposal for a regulation Recital 29 (29) Providers of hosting services and providers of publicly available
Amendment 410 #
Proposal for a regulation Recital 29 (29)
Amendment 411 #
Proposal for a regulation Recital 29 (29) Providers of hosting services and providers of publicly available interpersonal communications services are
Amendment 412 #
Proposal for a regulation Recital 29 a (new) (29a) It is also crucial that hosting providers and providers of publicly available interpersonal communication services cooperate with law enforcement in relation to the detection of potential online child abuse and the possession of key electronic evidence necessary for the proper prosecution of child sexual abuse cases. Therefore, in order to ensure the effective use of secured child sexual abuse material, it is necessary to legally ensure that providers secure not only the media files and instant messaging content themselves, but also their metadata. Metadata is information about documents or files relating to their content and their technical and physical parameters. It also includes information such as the time and place of their creation, information about the devices used in their creation, and about the modifications made to the files. It is reasonable to expect service providers, in the event of the disclosure of child sexual abuse content, to secure it and then hand over, at the request of law enforcement authorities, any data indicated above that constitute electronic evidence in the case. It should be stressed that metadata can constitute important evidence for law enforcement in the course of an investigation, and its ephemeral and easily modifiable nature requires it to be secured immediately, as it can contribute to the identification not only of the perpetrator and other persons linked to the uploaded content, but also of the victims.
Amendment 413 #
Proposal for a regulation Recital 30 (30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection
Amendment 414 #
Proposal for a regulation Recital 30 (30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities
Amendment 415 #
Proposal for a regulation Recital 30 (30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection and in order to stop or limit its dissemination, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
Amendment 416 #
Proposal for a regulation Recital 30 (30) To ensure that online child sexual abuse material is removed as swiftly as possible
Amendment 417 #
Proposal for a regulation Recital 30 (30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities
Amendment 418 #
Proposal for a regulation Recital 32
Amendment 419 #
Proposal for a regulation Recital 32 (32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited.
Amendment 420 #
Proposal for a regulation Recital 33
Amendment 421 #
Proposal for a regulation Recital 33
Amendment 422 #
Proposal for a regulation Recital 35 (35)
Amendment 423 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the
Amendment 424 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that
Amendment 425 #
Proposal for a regulation Recital 35 (35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available number-independent interpersonal communications services in accordance with this Regulation.
Amendment 426 #
Proposal for a regulation Recital 35 a (new)
Amendment 427 #
Proposal for a regulation Recital 36 (36) In order to prevent children falling victim to online abuse, providers for which there is evidence that their service is routinely or systematically used for the purpose of online child sexual abuse should provide reasonable assistance, by putting in place alert and alarm mechanisms in a prominent way on their platforms. The alert mechanism could consist of, for example, linking potential victims to the local services such as helplines, victims' rights and support organisations or hotlines. They should ensure adequate follow-up, when a report or alert is made, in the language chosen by the user for using their service. Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to
Amendment 428 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities. Providers should create and run an accessible, age-appropriate and user-friendly mechanism allowing users to flag any instances of potential online child sexual abuse on their platform. The providers should also offer reasonable assistance to the users who report these cases, such as implementing visible alert and alarm systems on their platforms, as well as providing links to local organisations such as hotlines, helplines, or victims' rights organisations, to assist potential victims.
Amendment 429 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims or their legal representatives who request the removal or disabling of access of the material in question. That assistance should be user-friendly and remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by
Amendment 430 #
Proposal for a regulation Recital 36 (36) Given the impact on the rights of
Amendment 431 #
Proposal for a regulation Recital 37 (37) To ensure the efficient management of such
Amendment 432 #
Proposal for a regulation Recital 38 (38) For the purpose of facilitating the exercise of the
Amendment 433 #
Proposal for a regulation Recital 44 (44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one
Amendment 434 #
Proposal for a regulation Recital 44 (44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to the application of this Regulation, without prejudice to the enforcement powers of
Amendment 435 #
Proposal for a regulation Recital 44 (44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority
Amendment 436 #
Proposal for a regulation Recital 48 (48) Given the need to ensure the effectiveness of the obligations imposed, Coordinating Authorities should be granted enforcement powers to address infringements of this Regulation.
Amendment 437 #
Proposal for a regulation Recital 49 (49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders
Amendment 438 #
Proposal for a regulation Recital 49 (49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders, removal orders or blocking orders that it issued, are effectively complied with in practice, each Coordinating Authority should be able to carry out searches, using the relevant indicators provided by the EU Centre and reacting in a timely manner to the evolving trends of child sexual abuse material dissemination and monetisation, to detect the dissemination of known or new child sexual abuse material through publicly available material in the hosting services of the providers concerned.
Amendment 439 #
Proposal for a regulation Recital 49 (49) In order to verify that the rules of this Regulation, in particular those on
Amendment 440 #
Proposal for a regulation Recital 49 (49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of voluntary detection orders,
Amendment 441 #
Proposal for a regulation Recital 49 a (new) (49a) Detection orders, which would require communication service providers to monitor their users' online activities for the purpose of detecting child sexual abuse material (CSAM), should only be imposed as a last resort in cases where a provider is found to be acting in bad faith and failing to cooperate with competent authorities. The use of detection orders should be proportionate, necessary, and subject to strict safeguards, and should only be authorised by a judicial authority or other independent oversight body. In any case, users should not be punished for merely using a communication service, and any measures taken to detect or remove CSAM should be implemented in a manner that respects users' privacy and other fundamental rights.
Amendment 442 #
Proposal for a regulation Recital 50 (50) With a view to ensuring that providers of hosting services are aware of the misuse made of their services and to
Amendment 443 #
Proposal for a regulation Recital 50 (50) With a view to ensuring that providers of hosting services are aware of the misuse made of their services and to afford them an opportunity to take expeditious action to remove
Amendment 444 #
Proposal for a regulation Recital 55
Amendment 445 #
Proposal for a regulation Recital 55 (55) It is essential for the proper functioning of
Amendment 446 #
Proposal for a regulation Recital 55 (55) It is essential for the proper functioning of the system of mandatory detection
Amendment 447 #
Proposal for a regulation Recital 55 a (new) (55a) All communications containing illegal material should be encrypted to state-of-the-art standards, and all access by staff to such content should be limited to what is necessary and thoroughly logged. All such logs should be stored for a minimum of ten years.
Amendment 448 #
Proposal for a regulation Recital 56
Amendment 449 #
Proposal for a regulation Recital 56 (56) With a view to ensuring that the indicators generated by the EU Centre
Amendment 450 #
Proposal for a regulation Recital 56 (56) With a view to ensuring that the indicators generated by the EU Centre for the purpose of detection are as complete as possible, the submission of relevant material and transcripts should be done proactively by the Coordinating Authorities. However, the EU Centre should also be allowed to bring certain material
Amendment 451 #
Proposal for a regulation Recital 57 (57) Certain providers of relevant information society services offer their services in several or even all Member States, whilst under this Regulation only a single Member State has jurisdiction in respect of a given provider. It is therefore imperative that the Coordinating Authority designated by the Member State having jurisdiction takes account of the interests of all users in the Union when performing its tasks and using its powers, without making any distinction depending on elements such as the users’ location or nationality, and that Coordinating Authorities cooperate with each other in an effective and efficient manner. To facilitate such cooperation, the necessary mechanisms and information- sharing systems should be provided for. That cooperation shall be without prejudice to the possibility for Member States to provide for regular exchanges of views with other public authorities where relevant for the performance of the tasks of those other authorities and of the Coordinating Authority and receive reports concerning the trends in the dissemination and monetisation of child sexual abuse material from relevant organisations acting in the public interest against child sexual abuse and other stakeholders, including service providers.
Amendment 452 #
Proposal for a regulation Recital 58 (58) In particular, in order to facilitate the cooperation needed for the proper functioning of the mechanisms set up by this Regulation, the EU Centre should establish and maintain the necessary information-sharing systems.
Amendment 453 #
Proposal for a regulation Recital 58 (58) In particular, in order to facilitate the cooperation needed for the proper functioning of the mechanisms set up by this Regulation, the EU Centre should establish and maintain the necessary secure information-sharing systems. When establishing and maintaining such systems, the EU Centre should cooperate with the European Union Agency for Law Enforcement Cooperation (‘Europol’) and national authorities to build on existing systems and best practices, where relevant.
Amendment 454 #
Proposal for a regulation Recital 59 (59) To support the implementation of this Regulation and contribute to the achievement of its objectives, the EU Centre should serve as a central facilitator, carrying out a range of specific tasks. The performance of those tasks requires strong guarantees of independence, in particular from law enforcement authorities, including Europol, as well as a governance structure ensuring the effective, efficient and coherent performance of its different tasks, and legal personality to be able to interact effectively with all relevant stakeholders. Therefore, it should be established as a decentralised Union agency.
Amendment 455 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of
Amendment 456 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available number-independent interpersonal communications services and providers of internet access services. However, for that same reason, the EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of hosting services, the provision of assistance to Coordinating Authorities, as well as the generation and sharing of knowledge and expertise related to online
Amendment 457 #
Proposal for a regulation Recital 60 (60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the
Amendment 458 #
Proposal for a regulation Recital 61 Amendment 459 #
Proposal for a regulation Recital 61 (61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself.
Amendment 460 #
Proposal for a regulation Recital 61 a (new) (61a) The EU Centre should be charged with the provision of assistance to Coordinating Authorities, as well as the generation of research, prevention techniques and sharing of knowledge, best practices and expertise related to online child sexual abuse, successful initiatives on digital skills and competences in an age appropriate manner, including media literacy, on sex education, and reacting timely to the evolving trends of child sexual abuse material dissemination.
Amendment 461 #
Proposal for a regulation Recital 62 Amendment 462 #
Proposal for a regulation Recital 63 (63) For the purpose of ensuring the traceability of the reporting process and of any follow-up activity undertaken based on reporting, as well as of allowing for the provision of feedback on reporting to providers of hosting services and providers of publicly available number independent interpersonal communications services, generating statistics concerning reports and the reliable and swift management and processing of reports, the EU Centre should create a dedicated database of such reports. To be able to fulfil the above purposes, that database should also contain relevant information relating to those reports, such as the indicators representing the material and ancillary tags, which can indicate, for example, the fact that a reported image or video is part of a series of images and videos depicting the same
Amendment 463 #
Proposal for a regulation Recital 63 (63) For the purpose of ensuring the traceability of the reporting process and of any follow-up activity undertaken based on reporting, as well as of allowing for the provision of feedback on reporting to providers of hosting services and providers of publicly available interpersonal communications services, generating statistics concerning reports and the reliable and swift management and processing of reports, the EU Centre should create a dedicated database of such reports. To be able to fulfil the above purposes, that database should also contain relevant information relating to those reports, such as the indicators representing the material and ancillary tags, which can indicate, for example, the fact that a reported image or video is part of a series of images and videos depicting the same
Amendment 464 #
Proposal for a regulation Recital 64 (64) Given the sensitivity of the data
Amendment 465 #
Proposal for a regulation Recital 64 (64) Given the sensitivity of the data concerned and with a view to avoiding any errors and possible misuse, it is necessary to lay down strict rules on the access to those databases of indicators and databases of reports, on the data contained therein and on their security. Such data should always be handled by staff specifically trained for that purpose. In particular, the data concerned should not be stored for longer than is strictly necessary. For the above reasons, access to the database of indicators should be given only to the parties and for the purposes specified in this Regulation, subject to the controls by the EU Centre, and be limited in time and in scope to what is strictly necessary for those purposes.
Amendment 466 #
Proposal for a regulation Recital 65 (65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks without receiving an overwhelming quantity of false positives, reports should pass through the EU Centre. The EU Centre should thoroughly assess those reports in order to identify those that are
Amendment 467 #
Proposal for a regulation Recital 65 (65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks, reports should pass through the EU Centre. The EU Centre should assess those reports in order to identify those that are manifestly unfounded, that is, where it is immediately evident, without any substantive legal or factual analysis, that the reported activities do not constitute online child sexual abuse. Where the report is manifestly unfounded, the EU Centre should provide feedback to the reporting provider of hosting services or provider of publicly available number independent interpersonal communications services in order to allow for improvements in the technologies and processes used and for other appropriate steps, such as reinstating material wrongly removed. As every report could be an important means to investigate and prosecute the child sexual abuse offences concerned and to rescue the victim of the abuse, reports should be processed as quickly as possible.
Amendment 468 #
Proposal for a regulation Recital 65 (65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks, reports should pass through the EU Centre. The EU
Amendment 469 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of hosting services for the dissemination of known child sexual abuse material that is publicly accessible, using the corresponding indicators. Where it identifies such material after having conducted such a search, the EU Centre should also
Amendment 470 #
Proposal for a regulation Recital 66 (66) With a view to contributing to the effective application of this Regulation and the protection of
Amendment 471 #
Proposal for a regulation Recital 67 (67) Given its central position resulting
Amendment 472 #
Proposal for a regulation Recital 67 (67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse, including lessons learned from, prevention and awareness raising campaigns. In this connection, the EU Centre should cooperate with relevant stakeholders
Amendment 473 #
Proposal for a regulation Recital 68 (68) Processing and storing certain personal data is necessary for the performance of the EU Centre’s tasks under this Regulation. In order to ensure that such personal data is adequately protected, the EU Centre should only process and store personal data if strictly necessary for the purposes detailed in this Regulation. It should do so in a secure manner, use state of the art encryption, and limit storage to what is strictly necessary for the performance of the relevant tasks. It should ensure adequate protection of its infrastructure and implement facilities access control, storage control, user control, control of data entry, data access control, communication control, input control, transport control, personnel profiles procedures, incident and recovery procedures, and ensure the reliability and integrity of its databases.
Amendment 474 #
Proposal for a regulation Recital 69 (69) In order to allow for the effective and efficient performance of its tasks, the EU Centre should closely cooperate with Coordinating Authorities,
Amendment 475 #
Proposal for a regulation Recital 69 a (new) (69a) In order to strengthen prevention measures, the EU Centre shall cooperate with relevant national authorities to identify possible new patterns regarding child sexual abuse material, as well as to develop and disseminate content to raise awareness and prevent child sexual abuse.
Amendment 476 #
Proposal for a regulation Recital 69 b (new) (69b) To make the best use of survivors' knowledge, the EU Centre shall consult with victims' organisations and helplines to set up and improve victim support mechanisms, as well as to analyse and explore new forms of online sexual abuse.
Amendment 477 #
Proposal for a regulation Recital 70 (70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. This role played by hotlines should be reinforced and they should continue to facilitate this fight. Each Member State should ensure that at least one official hotline is operating in its territory. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across
Amendment 478 #
Proposal for a regulation Recital 71 Amendment 479 #
Proposal for a regulation Recital 72 Amendment 480 #
Proposal for a regulation Recital 72 Amendment 481 #
Proposal for a regulation Recital 72 (72)
Amendment 482 #
Proposal for a regulation Recital 72 (72)
Amendment 483 #
Proposal for a regulation Recital 72 a (new) (72a) Considering that a significant amount of the child sexual abuse material present online in the internal market is produced in third countries, the EU Centre should cooperate with competent services in relevant international fora to prevent child sexual abuse at global level.
Amendment 484 #
Proposal for a regulation Recital 74 (74) In view of the need for technical expertise in order to perform its tasks,
Amendment 485 #
Proposal for a regulation Recital 74 (74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with an advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to the detection of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology, and to the evolution of those technologies and the development of new ones.
Amendment 486 #
Proposal for a regulation Recital 74 (74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with an advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU
Amendment 487 #
Proposal for a regulation Recital 74 a (new) (74a) The Technology Committee could therefore establish a certification for technologies which could be used by online service providers to detect child sexual abuse material on their request.
Amendment 488 #
Proposal for a regulation Recital 75 Amendment 489 #
Proposal for a regulation Recital 75 (75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available number independent interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and other relevant national authorities of the Member State that designated the Coordinating Authority in question in gathering that information.
Amendment 490 #
Proposal for a regulation Recital 75 (75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse information, based on
Amendment 491 #
Proposal for a regulation Recital 76 (76) In the interest of good governance and drawing on the statistics and information gathered and transparency reporting mechanisms provided for in this Regulation, the Commission should carry out an evaluation of this Regulation within
Amendment 492 #
Proposal for a regulation Recital 77 (77) The evaluation should be based on the criteria of efficiency, necessity, effectiveness, proportionality, relevance, coherence and Union added value. It should assess the functioning of the different operational and technical measures provided for by this Regulation
Amendment 493 #
Proposal for a regulation Recital 78 (78) Regulation (EU) 2021/1232 of the European Parliament and of the Council45provides for a temporary solution in respect of the use of technologies by certain providers of publicly available number independent interpersonal communications services for the purpose of combating online child sexual abuse, pending the preparation and adoption of a long-term legal framework. This Regulation provides that long-term legal framework. Regulation (EU) 2021/1232 should therefore be repealed. _________________ 45 Regulation (EU) 2021/1232 of the
Amendment 494 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse, in
Amendment 495 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to address the
Amendment 496 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the internal market
Amendment 497 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to prevent and address the misuse of relevant information society services for online child sexual abuse in the internal market.
Amendment 498 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 1 This Regulation lays down uniform rules to prevent and address the
Amendment 499 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services and providers of interpersonal communication services to detect and report online child sexual abuse where there is reasonable cause to suspect such illegal behaviour;
Amendment 500 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services and providers of publicly available number-independent interpersonal communication services to detect and report online child sexual abuse in specific cases;
Amendment 501 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services and providers of number independent interpersonal communications services to
Amendment 502 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services and providers of number- independent interpersonal communication services to detect and report online child sexual abuse;
Amendment 503 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services and providers of number- independent interpersonal communication services to detect and report online child sexual abuse;
Amendment 504 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point b (b) obligations on providers of hosting services and providers of interpersonal communication services to
Amendment 505 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point c (c) obligations on providers of hosting services to remove or disable access to known child sexual abuse material on their services;
Amendment 506 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point c (c) obligations on providers of hosting services to remove
Amendment 507 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point c (c) obligations on providers of hosting services to remove
Amendment 508 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point c (c) obligations on providers of hosting services to remove
Amendment 509 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point c (c) obligations on providers of hosting services to remove
Amendment 510 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point d Amendment 511 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point d Amendment 512 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point d Amendment 513 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point d Amendment 514 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point d (d) obligations on providers of internet access services to disable access to known child sexual abuse material;
Amendment 515 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point d a (new) (da) obligations on providers of online search engines and any other artificial intelligence systems to delist or disable specific items of child sexual abuse, or both;
Amendment 516 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point d a (new) (da) obligations on providers of online search engines to delist websites which were determined to host child sexual abuse material;
Amendment 517 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point e (e) rules on the implementation and enforcement of this Regulation, including as regards the designation and functioning of the competent authorities of the Member States
Amendment 518 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point e a (new) (ea) rules on the designation, functioning, cooperation, transparency and powers of the EU Centre on Child Sexual Abuse established in Article 40 (‘EU Centre’);
Amendment 519 #
Proposal for a regulation Article 1 – paragraph 1 – subparagraph 2 – point e a (new) (ea) Obligations on providers of online games.
Amendment 520 #
Proposal for a regulation Article 1 – paragraph 2 a (new) 2a. This Regulation shall only apply to services normally provided for remuneration.
Amendment 521 #
Proposal for a regulation Article 1 – paragraph 2 b (new) 2b. This Regulation does not apply to audio communications.
Amendment 522 #
Proposal for a regulation Article 1 – paragraph 3 – point b (b) Directive 2000/31/EC and Regulation (EU)
Amendment 523 #
Proposal for a regulation Article 1 – paragraph 3 – point b a (new) (ba) Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online;
Amendment 524 #
Proposal for a regulation Article 1 – paragraph 3 – point d (d) Regulation (EU) 2016/679, Directive 2016/680, Regulation (EU) 2018/1725
Amendment 525 #
Proposal for a regulation Article 1 – paragraph 3 – point d (d) Regulation (EU) 2016/679, Directive 2016/680, Regulation (EU) 2018/1725
Amendment 526 #
Proposal for a regulation Article 1 – paragraph 3 – point d (d) Regulation (EU) 2016/679, Directive 2016/680, Regulation (EU) 2018/1725, and
Amendment 527 #
Proposal for a regulation Article 1 – paragraph 3 – point d a (new) (da) Directive (EU) 2022/2555 of the European Parliament and of the Council of 14 December 2022 on measures for a high common level of cybersecurity across the Union, amending Regulation (EU) No 910/2014 and Directive (EU) 2018/1972, and repealing Directive (EU) 2016/1148 (NIS 2 Directive);
Amendment 528 #
Proposal for a regulation Article 1 – paragraph 3 – point d a (new) (da) Directive (EU) 2022/2555 of the European Parliament and of the Council of 14 December 2022 on measures for a high common level of cybersecurity across the Union, amending Regulation (EU) No 910/2014 and Directive (EU) 2018/1972 and repealing Directive (EU) 2016/1148 (NIS 2 Directive);
Amendment 529 #
Proposal for a regulation Article 1 – paragraph 3 – point d a (new) (da) Regulation (EU) …/… [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)];
Amendment 530 #
Proposal for a regulation Article 1 – paragraph 3 a (new) 3a. Nothing in this Regulation shall be interpreted as prohibiting, restricting or undermining, including de-facto, the provision or use of encrypted and end-to- end encrypted services. Providers shall not in particular be prohibited or discouraged from offering end-to-end encrypted services, and the provision of such services shall not be made, including de-facto, difficult, financially unsustainable, or impossible.
Amendment 531 #
Proposal for a regulation Article 1 – paragraph 3 a (new) 3a. This regulation shall not have the effect of modifying the obligation to respect the rights, freedom and principles referred to in Article 6 TEU and shall apply without prejudice to fundamental principles relating to the right to private life and family life and to freedom of expression and information;
Amendment 532 #
Proposal for a regulation Article 1 – paragraph 3 a (new) 3a. This Regulation shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to- end encryption, or be interpreted in that way.
Amendment 533 #
Proposal for a regulation Article 1 – paragraph 3 b (new) 3b. This Regulation shall be without prejudice to the rules on professional secrecy under national law, such as rules on the protection of professional communications, between doctors and their patients, between journalists and their sources, or between lawyers and their clients, in particular since the confidentiality of communications between lawyers and their clients is key to ensuring the effective exercise of the rights of the defence as an essential part of the right to a fair trial.
Amendment 534 #
Proposal for a regulation Article 1 – paragraph 3 b (new) 3b. This Regulation shall not undermine the prohibition of general monitoring under Union law or introduce general data retention obligations, or be interpreted in that way.
Amendment 535 #
Proposal for a regulation Article 1 – paragraph 3 b (new) 3b. Nothing in this Regulation shall be interpreted as prohibiting or weakening end-to-end encryption.
Amendment 536 #
Proposal for a regulation Article 1 – paragraph 3 c (new) 3c. This Regulation does not provide for a lawful basis for the processing of personal data for the sole purpose of detecting child sexual abuse on a voluntary basis.
Amendment 537 #
Proposal for a regulation Article 1 – paragraph 4 Amendment 538 #
Proposal for a regulation Article 1 – paragraph 4 4. This Regulation limits the exercise of the rights and obligations provided for in 5(1) and (3) and Article 6(1) of Directive 2002/58/EC with the sole objective of enabling providers of number- independent interpersonal communications services, without prejudice to Regulation (EU) 2016/679, to use specific technologies for the processing of personal data to the extent strictly necessary to detect and report child sexual abuse material and remove child sexual abuse material on their services insofar as necessary for the execution of the detection
Amendment 539 #
Proposal for a regulation Article 1 – paragraph 4 4. This Regulation limits the exercise of the rights and obligations provided for in 5(1) and (3) and Article 6(1) of Directive 2002/58/EC
Amendment 540 #
Proposal for a regulation Article 1 – paragraph 4 a (new) 4a. To ensure the fundamental rights laid down in the European Union's, the Council of Europe's and the United Nations' human rights charters, which are core foundations of our democratic society and the rule of law, citizens' right to privacy and private correspondence must be upheld. Therefore, detection orders can only be issued towards persons suspected of criminal activity. There shall be no general monitoring of the private messages of ordinary law-abiding citizens and users of interpersonal communication services.
Amendment 541 #
Proposal for a regulation Article 1 – paragraph 4 a (new) 4a. This Regulation does not apply to audio communications.
Amendment 542 #
Proposal for a regulation Article 1 – paragraph 4 a (new) 4a. This Regulation does not apply to audio communications.
Amendment 543 #
Proposal for a regulation Article 1 – paragraph 4 b (new) 4b. This Regulation does not apply to text communications.
Amendment 544 #
Proposal for a regulation Article 2 – paragraph 1 – point a (a) ‘hosting service’ means a
Amendment 545 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘number independent interpersonal communications service
Amendment 546 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘interpersonal communications service’ means a publicly available service as defined in Article 2, point 5, of Directive (EU) 2018/1972
Amendment 547 #
Proposal for a regulation Article 2 – paragraph 1 – point b (b) ‘number-independent interpersonal communications service’ means a publicly available service as defined in Article 2, point
Amendment 548 #
Proposal for a regulation Article 2 – paragraph 1 – point b a (new) (ba) ‘number-independent interpersonal communications service within games’ means any service as defined in Article 2, point 7, of Directive (EU) 2018/1972 which is part of a game;
Amendment 549 #
Proposal for a regulation Article 2 – paragraph 1 – point b a (new) (ba) ‘number-independent interpersonal communications service’ means a publicly available service as defined in Article 2, point 7, of Directive (EU) 2018/1972;
Amendment 550 #
Proposal for a regulation Article 2 – paragraph 1 – point d Amendment 551 #
Proposal for a regulation Article 2 – paragraph 1 – point d Amendment 552 #
Proposal for a regulation Article 2 – paragraph 1 – point e Amendment 553 #
Proposal for a regulation Article 2 – paragraph 1 – point e Amendment 554 #
Proposal for a regulation Article 2 – paragraph 1 – point e a (new) (ea) ‘artificial intelligence system’ means software as defined in Article 3(1) of Regulation (EU) …/… [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)];
Amendment 555 #
Proposal for a regulation Article 2 – paragraph 1 – point e a (new) (ea) “online search engine” means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
Amendment 556 #
Proposal for a regulation Article 2 – paragraph 1 – point e b (new) (eb) ‘intermediary service’ means a service as defined in Article 3, point (g), of Regulation (EU) 2022/2065;
Amendment 557 #
Proposal for a regulation Article 2 – paragraph 1 – point e c (new) (ec) ‘artificial intelligence system’ (AI system) means software as defined in Article 3(1) of Regulation (EU) .../... on Artificial Intelligence (Artificial Intelligence Act);
Amendment 558 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point ii (ii) a
Amendment 559 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point ii (ii) a
Amendment 560 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point ii (ii) a
Amendment 561 #
(ii)
Amendment 562 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iii Amendment 563 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iii Amendment 564 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iii Amendment 565 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iii Amendment 566 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iii a (new) (iiia) online games;
Amendment 567 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv Amendment 568 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv a (new) (iva) an artificial intelligence system;
Amendment 569 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv a (new) (iva) an online search engine;
Amendment 570 #
Proposal for a regulation Article 2 – paragraph 1 – point f – point iv b (new) (ivb) an artificial intelligence system.
Amendment 571 #
Proposal for a regulation Article 2 – paragraph 1 – point h a (new) (ha) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous reports from the public about alleged child sexual abuse material and online child sexual exploitation, which is officially recognised by the Member State of establishment as expressed in Directive 2011/93/EU and whose articles of association mention the mission of combating child sexual abuse;
Amendment 572 #
Proposal for a regulation Article 2 – paragraph 1 – point h b (new) (hb) ‘help-line’ means an organisation providing services for children in need as recognised by the Member State of establishment in line with Directive 2011/93/EU;
Amendment 573 #
Proposal for a regulation Article 2 – paragraph 1 – point i (i) ‘child’ means any natural person below the age of consent as regulated in the respective Member States, but at least below the age of 18 years;
Amendment 574 #
Proposal for a regulation Article 2 – paragraph 1 – point i a (new) (ia) "adult" means any natural person above the age of 18 years;
Amendment 575 #
Proposal for a regulation Article 2 – paragraph 1 – point j Amendment 576 #
Proposal for a regulation Article 2 – paragraph 1 – point j Amendment 577 #
Proposal for a regulation Article 2 – paragraph 1 – point j Amendment 578 #
Proposal for a regulation Article 2 – paragraph 1 – point j Amendment 579 #
Proposal for a regulation Article 2 – paragraph 1 – point j (j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of consent as regulated in the respective Member States, but at least below the age of 17 years;
Amendment 580 #
Proposal for a regulation Article 2 – paragraph 1 – point j (j) ‘child user’ means a natural person who uses a relevant information society service and who is a
Amendment 581 #
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 1
Amendment 582 #
Proposal for a regulation Article 2 – paragraph 1 – point j (j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 1
Amendment 583 #
Proposal for a regulation Article 2 – paragraph 1 – point j a (new) (ja) "adult user" means a natural person who uses a relevant information society service and who is a natural person above the age of 18 years;
Amendment 584 #
Proposal for a regulation Article 2 – paragraph 1 – point l (l) ‘child sexual abuse material’ means any material
Amendment 585 #
Proposal for a regulation Article 2 – paragraph 1 – point m (m) ‘known child sexual abuse material’ means potential child sexual abuse material detected
Amendment 586 #
Proposal for a regulation Article 2 – paragraph 1 – point n Amendment 587 #
Proposal for a regulation Article 2 – paragraph 1 – point n Amendment 588 #
Proposal for a regulation Article 2 – paragraph 1 – point o Amendment 589 #
Proposal for a regulation Article 2 – paragraph 1 – point p (p) ‘online child sexual abuse’ means the online dissemination of child sexual abuse material and the solicitation of children, including the exposure of children to pornographic content online;
Amendment 590 #
Proposal for a regulation Article 2 – paragraph 1 – point p (p) ‘online child sexual abuse’ means the online dissemination of child sexual abuse material including self-generated material disseminated without consent and the solicitation of children;
Amendment 591 #
Proposal for a regulation Article 2 – paragraph 1 – point p (p) ‘online child sexual abuse’ means the online dissemination of child sexual abuse material
Amendment 592 #
Proposal for a regulation Article 2 – paragraph 1 – point q (q) ‘child sexual abuse offences’ means offences as defined in Articles 3 to 7 of Directive 2011/93/EU, and, for the scope of this regulation, extends the offence referred to in Article 3, paragraph 2 of the same directive, to the witnessing of sexual activities online, even without having to participate;
Amendment 593 #
Proposal for a regulation Article 2 – paragraph 1 – point q a (new) (qa) ‘victim’ means a person residing in the European Union who, being under 18, suffered child sexual abuse offences. For the purpose of exercising the victim’s rights recognised in this Regulation, parents and guardians, as well as any person who was under 18 at the time the material was made, whose material has been hosted or disseminated in the European Union, are to be considered victims;
Amendment 594 #
Proposal for a regulation Article 2 – paragraph 1 – point q a (new) (qa) “person suspected of being involved in child sexual abuse” means an identified individual person about whom verifiable adequate evidence exists, which gives rise to the suspicion that that person has committed a child sexual abuse offence, attempted to commit a child sexual abuse offence, or prepared by committing a criminal offence to commit a child sexual abuse offence;
Amendment 595 #
Proposal for a regulation Article 2 – paragraph 1 – point q a (new) (qa) ‘child survivor’ means a person as defined in Article 2(1) point (a) of Directive 2011/93/EU who is below 18 years of age and suffered child sexual abuse offences;
Amendment 596 #
Proposal for a regulation Article 2 – paragraph 1 – point q b (new) (qb) 'person disqualified from exercising activities involving children' means an identified individual person, who, in line with Article 10 of Directive 2011/93/EU, is temporarily or permanently disqualified from exercising activities involving direct and regular contacts with children;
Amendment 597 #
Proposal for a regulation Article 2 – paragraph 1 – point q b (new) (qb) 'survivor' means a person as defined in Article 2(1) point (a) of Directive 2011/93/EU who suffered child sexual abuse offences;
Amendment 598 #
Proposal for a regulation Article 2 – paragraph 1 – point s Amendment 599 #
Proposal for a regulation Article 2 – paragraph 1 – point s (s) ‘content data’ means
Amendment 600 #
Proposal for a regulation Article 2 – paragraph 1 – point s (s) ‘content data’ means
Amendment 601 #
Proposal for a regulation Article 2 – paragraph 1 – point w (w) ‘main establishment’ means the
Amendment 602 #
Proposal for a regulation Article 2 – paragraph 1 – point w a (new) (wa) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous complaints from the public about alleged child sexual abuse material and online child sexual exploitation, which meets the following criteria: (a) is officially recognised by its home Member State as expressed in the Directive 2011/93/EU of the European Parliament and of the Council; (b) has the mission of combatting child sexual abuse material in its articles of association; and (c) is part of a recognised and well-established international network;
Amendment 603 #
Proposal for a regulation Article 2 – paragraph 1 – point w a (new) (wa) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous information from the public about potential child sexual abuse material and online child sexual exploitation, which is officially recognised by its home Member State as expressed in the Directive 2011/93/EU of the European Parliament and of the Council and has the mission of combatting child sexual abuse material in its articles of association;
Amendment 604 #
Proposal for a regulation Article 2 – paragraph 1 – point w a (new) (wa) 'victim' means a minor who suffered child sexual abuse offences including the non-consensual dissemination of self-generated material. For the purpose of exercising victim's rights listed in this Regulation, legal representatives shall be considered victims.
Amendment 605 #
Proposal for a regulation Article 2 – paragraph 1 – point w a (new) (wa) ‘hotline’ means an organisation officially recognised by a Member State, other than the reporting channels provided by law enforcement authorities, for receiving anonymous complaints from victims and the public about alleged child sexual abuse;
Amendment 606 #
Proposal for a regulation Article 2 – paragraph 1 – point w a (new) (wa) "online search engine" means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
Amendment 607 #
Proposal for a regulation Article 2 – paragraph 1 – point w b (new) (wb) 'hotline' means an organisation recognised by its Member State of establishment, which provides either a reporting channel provided by law enforcement authorities, or a service for receiving anonymous complaints from victims and the public about alleged child sexual abuse online.
Amendment 608 #
Proposal for a regulation Article -3 (new) Article-3 Protection of fundamental human rights and confidentiality in communications 1. Nothing in this Regulation shall prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to- end encryption or be interpreted in that way. 2. Nothing in this Regulation shall undermine the prohibition of general monitoring under Union law or introduce general data retention obligations.
Amendment 609 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of number-independent interpersonal communications services shall identify, analyse and assess, for each such service that they offer,
Amendment 610 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess
Amendment 611 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of
Amendment 612 #
1. Providers of hosting services and providers of number independent interpersonal communications services shall
Amendment 613 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess, for each such service that they offer, the risk of use of the service for the purpose of online child sexual abuse
Amendment 614 #
Proposal for a regulation Article 3 – paragraph 1 1. Providers of hosting services and providers of number independent interpersonal communications services shall identify, analyse and assess, for each such service that they offer,
Amendment 615 #
Proposal for a regulation Article 3 – paragraph 1 – subparagraph 1 (new) A hosting service provider or publicly available number-independent interpersonal communication service is exposed to online child sexual abuse where:
Amendment 616 #
Proposal for a regulation Article 3 – paragraph 1 – point a (new) Amendment 617 #
Proposal for a regulation Article 3 – paragraph 1 – point b (new) (b) the provider submitted two or more reports of potential online child sexual abuse in the previous 12 months in accordance with Article 12.
Amendment 618 #
Proposal for a regulation Article 3 – paragraph 1 a (new) 1a. Without prejudice to Regulation (EU) 2022/2065, when conducting the risk assessment, providers of hosting services and providers of interpersonal communications services shall respect and avoid any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity, respect for private and family life, the protection of personal data, freedom of expression and information, including the freedom and pluralism of the media, the prohibition of discrimination, the rights of the child and consumer protection, as enshrined in Articles 1, 7, 8, 11, 21, 24 and 38 of the Charter respectively.
Amendment 619 #
Proposal for a regulation Article 3 – paragraph 1 b (new) 1b. Risk assessment obligations shall always be strictly necessary and proportionate, and shall never entail a general monitoring obligation, an obligation to seek knowledge about the content of private communications, nor an obligation for providers to seek knowledge of illegal content.
Amendment 620 #
Proposal for a regulation Article 3 – paragraph 2 – point a (a)
Amendment 621 #
Proposal for a regulation Article 3 – paragraph 2 – point a (a)
Amendment 622 #
Proposal for a regulation Article 3 – paragraph 2 – point a (a)
Amendment 623 #
Proposal for a regulation Article 3 – paragraph 2 – point a a (new) (aa) any actual or foreseeable negative effects for the exercise of fundamental rights;
Amendment 624 #
Proposal for a regulation Article 3 – paragraph 2 – point b – introductory part (b) the existence
Amendment 625 #
Proposal for a regulation Article 3 – paragraph 2 – point b – introductory part (b) the existence and implementation by the provider of a policy and the availability of functionalities to
Amendment 626 #
Proposal for a regulation Article 3 – paragraph 2 – point b – introductory part (b) the existence and implementation by the provider of a policy and the availability and effectiveness of functionalities to address the risk referred to in paragraph 1, including through the following:
Amendment 627 #
Proposal for a regulation Article 3 – paragraph 2 – point b – introductory part (b) the existence and implementation by the provider of a policy and the availability of functionalities to prevent and address the risk referred to in paragraph 1, including through the following:
Amendment 628 #
Proposal for a regulation Article 3 – paragraph 2 – point b – introductory part (b) the existence and implementation by the provider of a policy and the availability of functionalities to prevent and address the risk referred to in paragraph 1, including through the following:
Amendment 629 #
Proposal for a regulation Article 3 – paragraph 2 – point b – introductory part (b) the existence and implementation by the provider of a policy and the availability of functionalities to address the systemic risks referred to in paragraph 1, including through the following:
Amendment 630 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 1 Amendment 631 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 1 a (new) - the availability to employ appropriate technical measures - such as parental control tools - to prevent underage access and exposure to inappropriate content or services;
Amendment 632 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 2 Amendment 633 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 2 – measures taken to enforce such prohibitions and restrictions and the amount of human and financial resources dedicated to identify, analyse and assess the presence of child sexual abuse;
Amendment 634 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 2 a (new) Amendment 635 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 Amendment 636 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 Amendment 637 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 Amendment 638 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 Amendment 639 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 – functionalities enabling
Amendment 640 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 – functionalities enabling age verification and subsequent blocking of age-restricted websites and content;
Amendment 641 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 3 – functionalities enabling age verification and parental control;
Amendment 642 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 – functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily accessible and age-appropriate; and capacity to meaningfully deal with those reports in a timely manner;
Amendment 643 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 – functionalities enabling users to flag or notify online child sexual abuse to the provider through tools that are easily accessible and age-appropriate, including already available anonymous reporting channels as provided by Directive (EU) 2019/1937;
Amendment 644 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 – functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily recognisable, accessible and age- appropriate, child- and user-friendly, including anonymous user-reporting channels;
Amendment 645 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 – functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily recognisable, accessible
Amendment 646 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 – functionalities enabling users to flag and report online child sexual abuse to the provider through tools that are easily accessible and age-appropriate with timely response;
Amendment 647 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 – functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily accessible
Amendment 648 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) - systems and mechanisms that provide child- and user-friendly resources to ensure that children can seek help swiftly, including information on how to contact national child protection organisations or national law enforcement
Amendment 649 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) - systems and mechanisms that provide child- and user-friendly resources to ensure that children can seek help swiftly, including information on how to contact national child protection organisations or national law enforcement.
Amendment 650 #
- Functionalities enabling detection of known child sexual abuse material on upload; – Functionalities preventing uploads from the dark web;
Amendment 651 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) - functionalities enabling age- appropriate parental controls, including with the use of AI;
Amendment 652 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 a (new) - functionalities enabling self- reporting by children, their parents or legal guardians.
Amendment 653 #
Proposal for a regulation Article 3 – paragraph 2 – point b – indent 4 b (new) - functionalities enabling self- reporting, including with the use of AI;
Amendment 654 #
Proposal for a regulation Article 3 – paragraph 2 – point c Amendment 655 #
Amendment 656 #
Proposal for a regulation Article 3 – paragraph 2 – point c Amendment 657 #
Proposal for a regulation Article 3 – paragraph 2 – point d Amendment 658 #
Proposal for a regulation Article 3 – paragraph 2 – point d (d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, and the impact thereof on that risk. This is without prejudice to the prohibition on general monitoring nor generalised data retention, and should not be understood as an obligation on providers of relevant information society services to break, weaken or undermine end-to-end encryption or to take other steps that compromise the security, integrity and confidentiality of communications;
Amendment 659 #
Proposal for a regulation Article 3 – paragraph 2 – point d (d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, the design of their recommender systems and any other relevant algorithmic systems and the impact thereof on that risk;
Amendment 660 #
Proposal for a regulation Article 3 – paragraph 2 – point d (d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, whether the service is available directly to end users, and the impact thereof on that risk;
Amendment 661 #
Proposal for a regulation Article 3 – paragraph 2 – point d (d) the manner in which the provider designed and operates the service, including the business model, governance, type of users targeted, and relevant systems and processes, and the impact thereof on that risk;
Amendment 662 #
Amendment 663 #
Proposal for a regulation Article 3 – paragraph 2 – point e – introductory part (e) with respect to the risk of
Amendment 664 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point i (i) the extent to which the service is used or is likely to be used by children, such as an assessment of public surfaces, behavioral signals, the frequency of user reports of online child sexual abuse, and the results of random sampling of content;
Amendment 665 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point i (i) the extent to which the service is
Amendment 666 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point i (i) the extent to which the service is
Amendment 667 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point i (i) the extent to which the service is
Amendment 668 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii Amendment 669 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii (ii) where the service is used or likely to be used by children, the different age groups or likely age groups of the child users and the r
Amendment 670 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii (ii) where the service is
Amendment 671 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point ii (ii) where the service is
Amendment 672 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii Amendment 673 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – introductory part (iii) the availability of functionalities creating or reinforcing the
Amendment 674 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – introductory part (iii) the availability of functionalities creating or reinforcing the serious systemic risk of solicitation of children, including the following functionalities:
Amendment 675 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 1 – enabling users to search for other users, including through search engines external to the service, and, in particular, for adult users to search for child users;
Amendment 676 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 1 – enabling users to search for other users and, in particular, for adult users to search for child users, in particular on services directly targeting children;
Amendment 677 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 1 – enabling users to search for other users
Amendment 678 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 1 – enabling users to search for other users
Amendment 679 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2 – enabling users to establish unsolicited contact with other users
Amendment 680 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2 – enabling users to establish contact with other users
Amendment 681 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2 – enabling users to
Amendment 682 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 2 – enabling users to establish unsolicited contact with other users directly
Amendment 683 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 – enabling users to share images or videos
Amendment 684 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 – enabling users to share unsolicited images or videos with other users, in particular through private communications.
Amendment 685 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 – enabling users to share images or videos with other users
Amendment 686 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 – enabling users to share images or videos
Amendment 687 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 – enabling users to share
Amendment 688 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 a (new) - Enabling users to create usernames that contain a representation about, or imply, the user’s age; – Enabling child users to create usernames that contain location information on child users; – Enabling users to know or infer the location of child users.
Amendment 689 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii – indent 3 a (new) - The availability for users to search and contact other users based on age or location criteria;
Amendment 690 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii a (new) (iiia) The availability for users to create usernames that imply the user’s age or location.
Amendment 691 #
Proposal for a regulation Article 3 – paragraph 2 – point e – point iii a (new) (iiia) the extent to which children have access to age-restricted content.
Amendment 692 #
Proposal for a regulation Article 3 – paragraph 2 – subparagraph 1 (new) Risk assessment obligations shall never entail a general monitoring obligation, an obligation to seek knowledge about the content of private communications, nor an obligation for providers to seek knowledge of illegal content.
Amendment 693 #
Proposal for a regulation Article 3 – paragraph 2 a (new) 2a. When providers of hosting services and providers of interpersonal communication services put forward age assurance or age verification systems as mitigating measures, they shall meet the following criteria: (a) Protect the privacy of users and do not disclose data gathered for the purposes of age assurance for any other purpose; (b) Do not collect data that is not necessary for the purposes of age assurance; (c) Be proportionate to the risks associated to the product or service that presents a risk of misuse for child sexual abuse; (d) Provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
Amendment 694 #
Proposal for a regulation Article 3 – paragraph 2 a (new) 2a. When providers of hosting services and providers of interpersonal communication services put forward age assurance or age verification systems as a mitigation measure, they shall meet the following criteria: a) Protect the privacy of users and do not disclose data gathered for the purposes of age assurance for any other purpose; b) Do not collect data that is not necessary for the purpose of age assurance; c) Be proportionate to the risks associated to the product or service that presents a risk of misuse for child sexual abuse; d) Provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
Amendment 695 #
Proposal for a regulation Article 3 – paragraph 2 a (new) 2a. The provider, where applicable, shall assess, in a separate section of its risk assessment, the voluntary use of specific technologies for the processing of personal and other data to the extent strictly necessary to detect, to report and to remove online child sexual abuse material from its services. Such voluntary use of specific technologies shall under no circumstances undermine the integrity and confidentiality of end-to-end encrypted content and communications.
Amendment 696 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 The provider may request the EU Centre
Amendment 697 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 The provider may request the EU Centre to perform an analysis of
Amendment 698 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 1 a (new) Neither this request nor its subsequent analysis that the EU Centre may perform shall exempt the provider from its obligation to conduct the risk assessment in accordance with paragraphs 1 and 2 of this Article and to comply with other obligations set out in this Regulation.
Amendment 699 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the
Amendment 700 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 2 The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those
Amendment 701 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 3 Amendment 702 #
Proposal for a regulation Article 3 – paragraph 3 – subparagraph 3 Amendment 703 #
Proposal for a regulation Article 3 – paragraph 3 a (new) 3a. Risk assessment obligations shall always be strictly necessary and proportionate, and shall never entail a general monitoring obligation, an obligation to seek knowledge about the content of private communications, nor an obligation for providers to seek knowledge of illegal content.
Amendment 704 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 1 The provider shall carry out the first risk assessment by [Date of application of this Regulation +
Amendment 705 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 Amendment 706 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – introductory part Subsequently, the provider shall update the risk assessment where necessary and at least once every
Amendment 707 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – introductory part Subsequently, the provider shall update the risk assessment where necessary and at least once every three years from the date at which it last carried out or updated the risk assessment
Amendment 708 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – point a Amendment 709 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – point a (a) for a service which is subject to a detection order issued in accordance with Article 7, the provider shall update the risk
Amendment 710 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – point b Amendment 711 #
Proposal for a regulation Article 3 – paragraph 4 – subparagraph 2 – point b Amendment 712 #
Proposal for a regulation Article 3 – paragraph 5 Amendment 713 #
Proposal for a regulation Article 3 – paragraph 5 Amendment 714 #
Proposal for a regulation Article 3 – paragraph 5 Amendment 715 #
Proposal for a regulation Article 3 – paragraph 5 Amendment 716 #
Proposal for a regulation Article 3 – paragraph 5 Amendment 717 #
Proposal for a regulation Article 3 – paragraph 6 6. The
Amendment 718 #
Proposal for a regulation Article 3 – paragraph 6 6. The Commission, in cooperation with Coordinating Authorities, the European Data Protection Board, the Fundamental Rights Agency and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by
Amendment 719 #
Proposal for a regulation Article 3 – paragraph 6 6. The Commission, in cooperation with Coordinating Authorities, and the EU Centre, after having consulted the European Data Protection Board and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
Amendment 720 #
Proposal for a regulation Article 3 – paragraph 6 6. The Commission, in cooperation with Coordinating Authorities, European Data Protection Board, Fundamental Rights Agency and the EU Centre and after having conducted a public consultation, may issue guidelines on the
Amendment 721 #
Proposal for a regulation Article 3 – paragraph 6 6. The Commission, in cooperation with Coordinating Authorities, the European Data Protection Board and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
Amendment 722 #
Proposal for a regulation Article 3 a (new) Amendment 726 #
Proposal for a regulation Article 4 – paragraph -1 (new) -1. Providers of hosting services and providers of interpersonal communications services shall have mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be online child sexual abuse. This obligation shall not be interpreted as an obligation of general monitoring or generalised data retention. Such mechanisms shall be easy to access, child-friendly, and shall allow for the submission of notices by electronic means. [By 6 months after entry into force] the Commission shall adopt a delegated act laying down design requirements for a uniform identifiable notification mechanism as referred to in this Article, including on the design of a uniform, easily recognisable, icon in the user interface. Providers of hosting services and providers of interpersonal communications services targeting children may implement the design requirements specified in the delegated act referred to in this paragraph.
Amendment 727 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number independent interpersonal communications services shall take reasonable mitigation measures, tailored to the systemic risks identified pursuant to Article 3, to minimise that risk. Such measures
Amendment 728 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number-independent interpersonal communications services shall take reasonable mitigation measures, tailored to the significant, systemic, serious risk identified pursuant to Article 3, to minimise that risk. Such targeted measures shall include some or all of the following, where applicable and technically feasible without being detrimental to the technical integrity or operating model of the provider, nor the security, integrity and confidentiality of communications:
Amendment 729 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number independent interpersonal communications services shall take reasonable mitigation measures, tailored to the systemic risks identified pursuant to Article 3, to minimise
Amendment 730 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services
Amendment 731 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall
Amendment 732 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and
Amendment 733 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall take reasonable and proportionate mitigation measures, tailored to the risk identified pursuant to Article 3 and their service, to minimise that risk. Such measures shall include some or all of the following:
Amendment 734 #
Proposal for a regulation Article 4 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures, tailored to their specific service and the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of
Amendment 735 #
Proposal for a regulation Article 4 – paragraph 1 – point a (a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, including the monitoring tools of phrases and indicators on public surfaces, its decision-
Amendment 736 #
Proposal for a regulation Article 4 – paragraph 1 – point a (a) testing and adapting, through state of the art appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-
Amendment 737 #
Proposal for a regulation Article 4 – paragraph 1 – point a (a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes
Amendment 738 #
Proposal for a regulation Article 4 – paragraph 1 – point a (a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes, the operation or functionalities of the service,
Amendment 739 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (aa) adapting the design, features and functions of their services in order to ensure a high level of privacy, data protection, safety, and security by design and by default, including some or all of the following: (a) limiting users, by default, to establish direct contact with other users, in particular through private communications; (b) limiting users, by default, to directly share images or videos on services; (c) limiting users, by default, to directly share personal contact details with other users, such as phone numbers, home addresses and e-mail addresses, via rules-based matching; (d) limiting users, by default, to create screenshots or recordings within the service; (e) limiting users, by default, to directly reforward images and videos to other users where no consent has been given; (f) allowing parents of a child or a legal representative of a child to make use of meaningful parental control tools, which protect the confidentiality of communications of the child; (g) encouraging children, prior to registering for the service, to talk to their parents about how the service works and what parental control tools are available. Services taking the measures outlined in this point may allow users to revert such measures on an individual level.
Amendment 740 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (aa) adapting the design, features and functions of their service in order to ensure the highest level of privacy, safety and security by design and by default, in particular, to protect children;
Amendment 741 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (aa) Designing educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, including child-appropriate information;
Amendment 742 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (aa) Designing educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, including child-appropriate information;
Amendment 743 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (aa) providing easily accessible and user-friendly mechanisms for users to report or flag to the provider alleged online child sexual abuse;
Amendment 744 #
Proposal for a regulation Article 4 – paragraph 1 – point a a (new) (aa) providing security by design, as a way of ensuring services that are safe and secure, especially for children;
Amendment 745 #
Proposal for a regulation Article 4 – paragraph 1 – point a b (new) (ab) providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, and that are set to the most private and secure levels by default;
Amendment 746 #
Proposal for a regulation Article 4 – paragraph 1 – point a b (new) (ab) employing appropriate age measurements, such as parental control tools, to prevent underage access and exposure to inappropriate content or services;
Amendment 747 #
Proposal for a regulation Article 4 – paragraph 1 – point a b (new) (ab) providing several reporting functions within their services, so that users of the services can report and flag content and material;
Amendment 748 #
Proposal for a regulation Article 4 – paragraph 1 – point a c (new) (ac) ask for user confirmation before allowing an unknown user to communicate and before displaying their communications;
Amendment 749 #
Proposal for a regulation Article 4 – paragraph 1 – point a d (new) (ad) optionally or by default ask for user confirmation and offer guidance before displaying or sharing certain content such as nudity where the provider ensures that no indication of the process and the content leaves the user’s device and the user is reassured of this;
Amendment 750 #
Proposal for a regulation Article 4 – paragraph 1 – point a e (new) (ae) providing tools in a prominent way on their platform that allow users to seek help from their local help-line;
Amendment 751 #
Proposal for a regulation Article 4 – paragraph 1 – point a f (new) (af) informing and reminding users and non-users, such as parents, at point of need on what constitutes online child sexual abuse and what is typical offender behaviour; offering advice on safe behaviour and the consequences of illegal behaviour in a visible, easy to find and easy to understand way;
Amendment 752 #
Proposal for a regulation Article 4 – paragraph 1 – point a g (new)
Amendment 753 #
Proposal for a regulation Article 4 – paragraph 1 – point a h (new) (ah) human moderation of publicly accessible chats, based on random checks, and human moderation of publicly accessible, specific channels at high risk of online child sexual abuse;
Amendment 754 #
Proposal for a regulation Article 4 – paragraph 1 – point a i (new) (ai) providing readily accessible mechanisms for users to block or mute other users;
Amendment 755 #
Proposal for a regulation Article 4 – paragraph 1 – point a j (new) (aj) displaying warnings and advice to users at risk of offending or victimisation where the provider ensures that no indication of the process and the content leaves the user's device and the user is reassured of this;
Amendment 756 #
Proposal for a regulation Article 4 – paragraph 1 – point a k (new) (ak) informing parents on the nature of the service and the functionalities offered as well as on how to report or flag to the provider alleged online child sexual abuse;
Amendment 757 #
Proposal for a regulation Article 4 – paragraph 1 – point a l (new) (al) any other mechanisms to increase the awareness of online child sexual abuse on its services;
Amendment 758 #
Proposal for a regulation Article 4 – paragraph 1 – point b (b) reinforcing the provider’s internal processes or the internal supervision of the functioning of the service, user testing and feedback collection;
Amendment 759 #
Proposal for a regulation Article 4 – paragraph 1 – point b a (new) (ba) Implementing and constantly innovating functionalities and protocols to prevent and reduce the risk of online child sexual abuse, and regularly assessing their effectiveness in light of the latest technological developments and trends in the dissemination and monetization of child sexual abuse material;
Amendment 760 #
Proposal for a regulation Article 4 – paragraph 1 – point b b (new) (bb) the use of specific technologies on a voluntary basis for the sole purpose of preventing and detecting online child sexual abuse in accordance with Article 4a
Amendment 761 #
Proposal for a regulation Article 4 – paragraph 1 – point c (c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of number-independent interpersonal communication services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] .
Amendment 762 #
Proposal for a regulation Article 4 – paragraph 1 – point c (c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of number-independent interpersonal communication services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] .
Amendment 763 #
Proposal for a regulation Article 4 – paragraph 1 – point c (c) initiating or adjusting cooperation, in accordance with competition law, with other providers of
Amendment 764 #
Proposal for a regulation Article 4 – paragraph 1 – point c – point 1 (new) 1) introducing a clear and easily-identifiable icon for the immediate and efficacious reporting of content deemed inappropriate under Article 1 of this Regulation.
Amendment 765 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) (ca) Setting up specific prevention measures to highlight risks related to the use of their service. Such communication shall be targeted both to minor users, through child-friendly means, and to parents.
Amendment 766 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) (ca) enabling users to flag or notify online child sexual abuse to the provider through tools that are easily accessible and age-appropriate, including anonymous reporting channels;
Amendment 767 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) (ca) reinforcing awareness-raising measures and adapting their online interface for increased user information, including child-appropriate information targeted to the risk identified;
Amendment 768 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) (ca) adapting the design, features and functions of their services in order to ensure a high level of privacy, safety, and security and data protection by design and by default
Amendment 769 #
Proposal for a regulation Article 4 – paragraph 1 – point c a (new) (ca) processing metadata, in accordance with Article 4a
Amendment 770 #
Proposal for a regulation Article 4 – paragraph 1 – point c b (new) (cb) setting up specific reporting mechanisms, child-friendly and easily accessible. Such tools should be visible and easily accessible by the user from the direct communication webpage.
Amendment 771 #
Proposal for a regulation Article 4 – paragraph 1 – point c b (new) (cb) enabling safe self-reporting capabilities for children, their parents or legal guardians.
Amendment 772 #
Proposal for a regulation Article 4 – paragraph 1 – point c b (new) (cb) including clearly visible and identifiable information on the minimum age for using the service;
Amendment 773 #
Proposal for a regulation Article 4 – paragraph 1 – point c c (new) (cc) initiating targeted measures to protect the rights of the child and tools aimed at helping users to indicate child sexual abuse material and helping children to signal abuse or obtain support;
Amendment 774 #
Proposal for a regulation Article 4 – paragraph 1 – point c c (new) (cc) Setting up mechanisms to raise awareness among adult users to warn about potential violations of this Regulation.
Amendment 775 #
Proposal for a regulation Article 4 – paragraph 1 a (new) 1a. Providers of hosting services and providers of interpersonal communications services shall continue the voluntary use of specific technologies, as mitigation measures, for the processing of personal and other data to the extent strictly necessary to detect, report and remove online child sexual abuse on their services and to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment conducted or updated in accordance with Article 3 and prior authorization from the Coordinating Authority;
Amendment 776 #
Proposal for a regulation Article 4 – paragraph 1 a (new) 1a. Providers of hosting services and providers of interpersonal communications services shall continue the voluntary use of specific technologies, as mitigation measures, for the processing of personal and other data to the extent strictly necessary to detect, report and remove online sexual abuse on their services and to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment conducted or updated in accordance with Article 3 and prior authorization from the Coordinating Authority;
Amendment 777 #
Proposal for a regulation Article 4 – paragraph 1 a (new) 1a. Providers of hosting services and providers of interpersonal communications services directly targeting children shall implement the design requirements as specified in the delegated act referred to in paragraph -1 and shall take all mitigation measures as outlined in paragraph 1, point (aa), of this Article to minimise this risk. Such services shall allow users to revert mitigation measures on an individual level.
Amendment 778 #
Proposal for a regulation Article 4 – paragraph 1 a (new)
Amendment 779 #
Proposal for a regulation Article 4 – paragraph 1 b (new) 1b. The Coordinating Authority shall decide whether to proceed according to paragraph 1a no later than three months from the provider’s request.
Amendment 780 #
Proposal for a regulation Article 4 – paragraph 1 b (new) 1b. The Coordinating Authority shall decide whether to proceed according to paragraph 1a no later than three months from the provider's request.
Amendment 781 #
Proposal for a regulation Article 4 – paragraph 2 – introductory part 2. The
Amendment 782 #
Proposal for a regulation Article 4 – paragraph 2 – point a (a)
Amendment 783 #
Proposal for a regulation Article 4 – paragraph 2 – point a (a) effective in mitigating the identified significant, systemic, and serious risk;
Amendment 784 #
Proposal for a regulation Article 4 – paragraph 2 – point a (a) effective in mitigating the identified serious systemic risk;
Amendment 785 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk, ensuring that the interference with the fundamental right to privacy and the other rights laid down in the Charter is limited to what is strictly necessary as well as the provider’s financial and technological capabilities and the number of users
Amendment 786 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that risk
Amendment 787 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) they shall be targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk
Amendment 788 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that serious systemic risk, taking into account, in particular, the seriousness of the risk as well as the provider’s financial and technological
Amendment 789 #
Proposal for a regulation Article 4 – paragraph 2 – point b (b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk as well as
Amendment 790 #
Proposal for a regulation Article 4 – paragraph 2 – point c (c) applied in a diligent and non-discriminatory manner, with full assessment, having due regard, in all circumstances, to the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected and in particular of the rights to privacy, data protection and freedom of expression, and for the protection of the integrity and security of platforms and services, including those that are end-to-end encrypted;
Amendment 791 #
Proposal for a regulation Article 4 – paragraph 2 – point c (c) applied in a diligent and non-discriminatory manner,
Amendment 792 #
Proposal for a regulation Article 4 – paragraph 2 – point c (c) applied in a diligent and non-discriminatory manner, having due regard, in all circumstances, to the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected
Amendment 793 #
Proposal for a regulation Article 4 – paragraph 2 – point c (c) applied in a diligent and non-discriminatory manner, having due regard with full respect, in all circumstances, to the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected;
Amendment 794 #
Proposal for a regulation Article 4 – paragraph 2 – point c (c) they shall be applied in a diligent and non-
Amendment 795 #
Proposal for a regulation Article 4 – paragraph 2 – point c a (new) (ca) done in a way that does not compromise end-to-end encryption;
Amendment 796 #
Proposal for a regulation Article 4 – paragraph 2 – point d (d) introduced, reviewed, discontinued or expanded, as appropriate, each time the risk assessment is conducted or updated pursuant to Article 3(4), as soon as possible and in any case within
Amendment 797 #
Proposal for a regulation Article 4 – paragraph 2 – point d (d) they shall be introduced, reviewed, discontinued or expanded, as appropriate, each time the risk assessment is conducted or updated pursuant to Article 3(4), within three months from the date referred to therein.
Amendment 798 #
Proposal for a regulation Article 4 – paragraph 2 – point d a (new)
Amendment 799 #
Proposal for a regulation Article 4 – paragraph 2 a (new) 2a. If the risk assessment conducted or updated in accordance with Article 3 identifies that there is a risk of the service being used to disseminate, store or make available verified child sexual abuse material, reasonable mitigation measures may include voluntary measures to detect and remove such material in accordance with Article 4(a).
Amendment 800 #
Proposal for a regulation Article 4 – paragraph 3
Amendment 801 #
Proposal for a regulation Article 4 – paragraph 3
Amendment 802 #
Proposal for a regulation Article 4 – paragraph 3
Amendment 803 #
Proposal for a regulation Article 4 – paragraph 3
Amendment 804 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary and proportionate age verification and age assessment measures to reliably
Amendment 805 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children,
Amendment 806 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures and to put in place effective measures to block the access of children to websites that fall under an age- restriction applicable under national law.
Amendment 807 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take
Amendment 808 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary a
Amendment 809 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary
Amendment 810 #
Proposal for a regulation Article 4 – paragraph 3 3. Providers of number independent interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a systemic risk of use of their services for the purpose of the solicitation of children,
Amendment 811 #
Proposal for a regulation Article 4 – paragraph 3 a (new) 3a. Any requirement to take specific measures shall be without prejudice to Article 8 of Regulation (EU) 2022/2065 [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] and shall entail neither a general obligation for hosting services providers to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity. Any requirement to take specific measures shall not include an obligation to use ex-ante control measures based on automated tools or upload-filtering of information, to interfere with the secrecy of communications or to restrict the possibility to use a service anonymously.
Amendment 812 #
Proposal for a regulation Article 4 – paragraph 3 a (new) 3a. Risk mitigation measures shall always be strictly necessary and proportionate, and shall never entail a general monitoring obligation, an obligation to seek knowledge about the content of private communications, contrary to Article 5 of the ePrivacy Directive, nor an obligation for providers to seek knowledge of illegal content.
Amendment 813 #
Proposal for a regulation Article 4 – paragraph 3 b (new) 3b. Nothing in this regulation shall be construed as prohibiting, restricting, circumventing or undermining the provision or the use of encrypted services.
Amendment 814 #
Proposal for a regulation Article 4 – paragraph 4 4. Providers of hosting services and providers of interpersonal communications services shall clearly describe in their terms and conditions the mitigation measures that they have taken.
Amendment 815 #
Proposal for a regulation Article 4 – paragraph 4 4.
Amendment 816 #
Proposal for a regulation Article 4 – paragraph 4 4. Providers of hosting services and providers of number-independent interpersonal communications services shall clearly describe in their terms and conditions the mitigation measures that they have taken. That description shall not include information that may reduce the effectiveness of the mitigation measures.
Amendment 817 #
Proposal for a regulation Article 4 – paragraph 4 4. Providers of hosting services and providers of number independent interpersonal communications services shall clearly describe in their terms and conditions the mitigation
Amendment 818 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation with Coordinating Authorities, the European Data Protection Board, the Fundamental Rights Agency and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments and to the manner in which the services covered by those provisions are offered and used.
Amendment 819 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation with Coordinating Authorities
Amendment 820 #
Proposal for a regulation Article 4 – paragraph 5 5. The
Amendment 821 #
Proposal for a regulation Article 4 – paragraph 5 5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1
Amendment 822 #
Proposal for a regulation Article 4 – paragraph 5 a (new) 5a. The European Data Protection Board (EDPB) shall issue guidelines regarding the compliance with the General Data Protection Regulation of existing and future technologies that are used for the detection of child sexual abuse material in encrypted and non- encrypted environments. Data Protection Authorities shall be in charge of the supervision of the application of the EDPB guidelines and they shall assess any technologies currently used or that will be used to scan the content of communications with the aim of detecting CSAM or any other type of content in light of the Regulation (EU) 2016/679 (General Data Protection Regulation) and the Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications).
Amendment 823 #
Proposal for a regulation Article 4 – paragraph 5 a (new) 5a. Prior to the deployment of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
Amendment 824 #
Proposal for a regulation Article 4 – paragraph 5 b (new) 5b. Where the mitigating measures by a provider in accordance with Paragraph 1 prove to be ineffective or insufficient, the Coordinating Authority shall have the power to order the provider to comply with this Article, including by ordering the provider to take specific mitigating measures in accordance with this Article.
Amendment 825 #
Proposal for a regulation Article 4 a (new)
Amendment 826 #
Proposal for a regulation Article 4 a (new) Article 4a Legal basis for the risk mitigation through metadata processing 1. On the basis of the risk assessment submitted and, where applicable, further information, the Coordinating Authority of establishment shall have the power to authorise or require a provider of hosting services or a provider of interpersonal communications services to process metadata to the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, as a mitigation measure in accordance with Article 4. When assessing whether to request the processing of metadata, the Coordinating Authority shall take into account any interference with the rights to privacy and data protection of the users of the service that such a processing entails and determine whether, in that case, the processing of metadata would be effective in mitigating the risk of use of the service for the purpose of child sexual abuse, and that it is strictly necessary and proportionate. 2. If they process metadata as a risk mitigation measure, providers shall inform their users of such processing in their terms and conditions, including information on the possibility to submit complaints to the competent DPA concerning the relevant processing, in accordance with Regulation (EU) 2016/679, and on the avenues for judicial redress.
Amendment 827 #
Proposal for a regulation Article 4 a (new) Article 4a Specific measures for platforms primarily used for the dissemination of pornographic content Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure: a. user-friendly reporting mechanisms to report alleged child sexual abuse material; b. adequate professional human content moderation to rapidly process notices of alleged child sexual abuse material; c. automatic mechanisms and interface design elements to inform users about external resources in the user’s region on preventing child sexual abuse, counselling by specialist helplines, victim support and educational resources by hotlines and child protection organisations; d. automatic detection of searches for child sexual abuse material, warning and advice alerts displayed to users doing such searches, and flagging of the search and the user for human moderation;
Amendment 828 #
Proposal for a regulation Article 4 b (new) Article 4b Specific measures for number-independent interpersonal communications services within games Providers of online games that operate number-independent interpersonal communications services within their games, and which are exposed to a substantial amount of online child sexual abuse, shall take all of the following specific measures in addition to the requirements referred to in Article 4: 1. prevent users from initiating unsolicited contact with other users; 2. facilitate user-friendly reporting of alleged child sexual abuse material; 3. provide technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety and that are set to the most private and secure levels by default; 4. provide tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line.
Amendment 829 #
Proposal for a regulation Article 5
Amendment 831 #
Proposal for a regulation Article 5 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number-independent interpersonal communications services to which Article 3 applies shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
Amendment 832 #
Proposal for a regulation Article 5 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number independent interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
Amendment 833 #
Proposal for a regulation Article 5 – paragraph 1 – introductory part 1. Providers of hosting services and providers of number-independent interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
Amendment 834 #
Proposal for a regulation Article 5 – paragraph 1 – introductory part 1. Providers of hosting services and providers of interpersonal communications services shall transmit, by
Amendment 835 #
Proposal for a regulation Article 5 – paragraph 1 – point a (a) the process and the results of the risk assessment conducted or updated pursuant to Article 3
Amendment 836 #
Proposal for a regulation Article 5 – paragraph 1 – point a (a) the process and the results of the risk assessment conducted or updated
Amendment 837 #
Proposal for a regulation Article 5 – paragraph 1 – point a (a)
Amendment 838 #
Proposal for a regulation Article 5 – paragraph 1 – point b (b) any mitigation measures taken and those that require prior authorisation pursuant to Article 4.
Amendment 839 #
Proposal for a regulation Article 5 – paragraph 1 – point b (b) any
Amendment 840 #
Proposal for a regulation Article 5 – paragraph 2 2. Within three months after receiving the report, the Coordinating Authority of establishment shall assess it and determine, on that basis and taking into account any other relevant information available to it, whether the risk assessment has been carried out or updated and the
Amendment 841 #
Proposal for a regulation Article 5 – paragraph 2 2. Within
Amendment 842 #
Proposal for a regulation Article 5 – paragraph 3 – subparagraph 1 Where necessary for that assessment, that Coordinating Authority may require further information from the provider,
Amendment 843 #
Proposal for a regulation Article 5 – paragraph 3 – subparagraph 2 Amendment 844 #
Proposal for a regulation Article 5 – paragraph 4 4. Without prejudice to Articles 7 and 27 to 29, where the requirements of Articles 3 and 4 have not been met, that Coordinating Authority shall require the provider to
Amendment 845 #
Proposal for a regulation Article 5 – paragraph 4 – point a (new) (a) Where the Coordinating Authority considers that the mitigation measures taken do not comply with Article 4, it shall address a decision to the provider requiring it to take the necessary measures so as to ensure that Article 4 is complied with.
Amendment 846 #
Proposal for a regulation Article 5 – paragraph 4 a (new) 4a. The provider may, at any time, request the competent Coordinating Authority to review and, where appropriate, amend or revoke a decision as referred to in paragraph 4. The authority shall, within three months of receipt of the request, adopt a reasoned decision on the request based on objective factors and notify the provider of that decision.
Amendment 847 #
Proposal for a regulation Article 5 – paragraph 6 Amendment 848 #
Proposal for a regulation Article 5 – paragraph 6 Amendment 849 #
Proposal for a regulation Article 5 – paragraph 6 Amendment 850 #
Proposal for a regulation Article 5 – paragraph 6 – subparagraph 1 (new) Service providers, the EU Centre and all European and national authorities managing the personal data of children or adults are required to comply with the GDPR.
Amendment 851 #
Proposal for a regulation Article 6 Amendment 852 #
Proposal for a regulation Article 6 Amendment 853 #
Proposal for a regulation Article 6 Amendment 854 #
Proposal for a regulation Article 6 Amendment 855 #
Proposal for a regulation Article 6 Amendment 856 #
Proposal for a regulation Article 6 Amendment 857 #
Proposal for a regulation Article 6 Amendment 858 #
Proposal for a regulation Article 6 – paragraph 1 – introductory part 1. Providers of software application stores considered as gatekeepers under the Digital Markets Act (EU) 2022/1925 shall:
Amendment 859 #
Proposal for a regulation Article 6 – paragraph 1 – point a (a)
Amendment 860 #
Proposal for a regulation Article 6 – paragraph 1 – point b (b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the
Amendment 861 #
Proposal for a regulation Article 6 – paragraph 1 – point b (b)
Amendment 862 #
Proposal for a regulation Article 6 – paragraph 1 – point b (b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the
Amendment 863 #
Proposal for a regulation Article 6 – paragraph 1 – point b (b)
Amendment 864 #
Proposal for a regulation Article 6 – paragraph 1 – point b – point i (new) i) the developer of the software application has decided and informed the software application store that its terms and conditions of use do not permit child users,
Amendment 865 #
Proposal for a regulation Article 6 – paragraph 1 – point b – point ii (new) ii) the software application has an appropriate age rating model in place, or
Amendment 866 #
Proposal for a regulation Article 6 – paragraph 1 – point b – point iii (new) iii) the developer of the software application has requested the software application store not to allow child users to download its software applications.
Amendment 867 #
Proposal for a regulation Article 6 – paragraph 1 – point c Amendment 868 #
Proposal for a regulation Article 6 – paragraph 1 – point c Amendment 869 #
Proposal for a regulation Article 6 – paragraph 1 – point c a (new) (ca) indicate, based on the information provided by the applications developers, the minimum age for using an application, as set out in the terms and conditions of the provider of the application;
Amendment 870 #
Proposal for a regulation Article 6 – paragraph 1 a (new) 1a. Providers of software applications who have been informed that in relation to their software applications a significant risk of use of the service concerned for the purpose of the solicitation of children has been identified, shall take reasonable and proportionate mitigation measures.
Amendment 871 #
Proposal for a regulation Article 6 – paragraph 2 Amendment 872 #
Proposal for a regulation Article 6 – paragraph 3 Amendment 873 #
Proposal for a regulation Article 6 – paragraph 4 4. The Commission, in cooperation with Coordinating Authorities
Amendment 874 #
Article 6a Encrypted services and metadata processing 1. Nothing in this Regulation shall be interpreted as prohibiting or weakening end-to-end encryption. 2. On the basis of the risk assessment submitted and, where applicable, further information, the Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to authorise a provider of hosting services or a provider of interpersonal communications services to process metadata to the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse. When assessing whether to request the processing of metadata, the Coordinating Authority shall take into account any interference with the rights to privacy and data protection of the users of the service that such a processing entails and determine whether, in that case, the processing of metadata would be effective in mitigating the risk of use of the service for the purpose of child sexual abuse, and that it is strictly necessary and proportionate. 3. Without prejudice to Regulation (EU) 2016/679, providers shall inform the users of such processing in their terms and conditions, including information on the possibility to submit complaints to the competent data protection authorities concerning the relevant processing and on the avenues for judicial redress.
Amendment 875 #
Proposal for a regulation Article 6 a (new) Article 6a End-to-end encrypted services Nothing in this Regulation shall be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. As compromising the integrity of end-to-end encrypted content and communications shall be understood the processing of any data that would compromise or put at risk the integrity and confidentiality of the content and communications in the end-to-end encryption. Nothing in this regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communications services provides third party actors access to the end-to-end encrypted content.
Amendment 876 #
Proposal for a regulation Article 6 a (new) Article 6a End-to-end encrypted services Nothing in this Regulation shall be interpreted as prohibiting, weakening or compromising the integrity and confidentiality of end-to-end encrypted content and communications. Nothing in this regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors access to end-to-end encrypted content. No provider of a hosting service or provider of interpersonal communication services shall be compelled to enable or create access to communications by means of bypassing user authentication or encryption under the scope of this regulation.
Amendment 877 #
Proposal for a regulation Article 6 a (new) Article 6a Encrypted services and metadata processing 1. Nothing in this Regulation shall be interpreted as prohibiting or weakening end-to-end encryption.
Amendment 878 #
Proposal for a regulation Article 6 b (new) Article 6b Software application stores 1. Providers of software application stores shall: (a) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
Amendment 885 #
Proposal for a regulation Article 7 – paragraph 1 Amendment 886 #
Proposal for a regulation Article 7 – paragraph 1 1.
Amendment 887 #
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 888 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 889 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 890 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 891 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request
Amendment 892 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 893 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it
Amendment 894 #
Proposal for a regulation Article 7 – paragraph 1 1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of
Amendment 895 #
Proposal for a regulation Article 7 – paragraph 1 a (new) 1a. The Coordinating Authority of establishment shall have the power to authorise the provider to make voluntary use of specific technologies for the processing of personal data and other data to the extent strictly necessary to detect, report and remove online child sexual abuse on their services and to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, following a risk assessment performed by the provider pursuant to Article 3 of this Regulation. It shall have the power to define the terms of authorisation for the provider to take measures specified in Article 10 to detect online child sexual abuse on a specific service.
Amendment 896 #
Proposal for a regulation Article 7 – paragraph 1 a (new) 1a. Detection orders shall only target providers of hosting services or providers of number-independent interpersonal communications services that fail to comply with the requirements outlined in Articles 3, 4 and 5 of this Regulation. They shall only be issued once all the measures in the abovementioned articles have been exhausted and target providers that can reasonably be expected to have the technical and operational ability to act.
Amendment 897 #
Proposal for a regulation Article 7 – paragraph 1 a (new) 1a. Interpersonal communications to which end-to-end encryption is, has been or will be applied, shall not be subject to the measures specified in Article 10.
Amendment 898 #
Proposal for a regulation Article 7 – paragraph 1 a (new) 1a. Such a detection order shall as far as possible be restricted and specified, not calling for mass detection across the whole service.
Amendment 899 #
Proposal for a regulation Article 7 – paragraph 2 Amendment 900 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 The request of the Coordinating Authority of establishment
Amendment 901 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 The Coordinating Authority of establishment shall, before requesting the issuance of a detection
Amendment 902 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 1 The Coordinating Authority of establishment shall, before requesting the issuance of a detection order, carry out the
Amendment 903 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 2 To that end, it may, where appropriate, require the provider to submit
Amendment 904 #
Proposal for a regulation Article 7 – paragraph 2 – subparagraph 2 To that end, it may, where appropriate, require the provider to submit the necessary information, additional to the
Amendment 905 #
Proposal for a regulation Article 7 – paragraph 2 a (new) 2a. The grounds for issuing the order shall outweigh the negative consequences for the rights and legitimate interests of all the parties concerned, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties. The order shall be a measure of last resort and shall be issued on the basis of a case-by-case analysis.
Amendment 908 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – introductory part Where the Coordinating Authority of establishment takes the
Amendment 909 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point a (a) establish a draft request to the competent judicial authority of the Member State that designated it for the issuance of a detection
Amendment 910 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point a (a) establish a draft request to the competent judicial authority of the Member State that designated it for the issuance of a detection
Amendment 911 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point a (a) establish a draft request for the issuance of a detection order, specifying the factual and legal grounds upon which the request is based, the main elements of the content of the detection order it intends to request and the reasons for requesting it;
Amendment 912 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point a (a) establish a draft request for the issuance of a detection order, specifying targeted suspects and or activities, the main elements of the content of the detection order it intends to request and the reasons for requesting it;
Amendment 913 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point b Amendment 914 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point b (b) submit the draft request to the
Amendment 915 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point c Amendment 916 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point c Amendment 917 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point c Amendment 918 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point d Amendment 919 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point d Amendment 920 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point d (d) invite the EU Centre to provide its opinion on the draft request, within a time period of
Amendment 921 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 1 – point d a (new) (da) request the supervisory authorities designated pursuant to Chapter VI, Section 1, of Regulation (EU) 2016/679 to perform their tasks within the competence pursuant to Chapter VI, Section 2 of Regulation (EU) 2016/679 and provide their opinion on the draft request, within a reasonable time period set by that Coordinating Authority;
Amendment 922 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 Amendment 923 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 Amendment 924 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – introductory part Where, having regard to the
Amendment 925 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – introductory part Where, having regard to the comments of the provider and the opinion of the EU Centre, and in particular taking into account the assessment of the EU Centre's Technical Committee as referred to in Article 66(6)(a NEW), that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall re-submit the draft request, adjusted where appropriate, to the provider. In that case, the provider shall do all of the following, within a reasonable time period set by that Coordinating Authority:
Amendment 926 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – introductory part Where, having regard to the comments of the
Amendment 927 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a Amendment 928 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a (a) draft an implementation plan setting out the measures it envisages taking to execute the intended detection order, including detailed information regarding the envisaged technologies and safeguards; the implementation plan shall explicitly set out the specific measures that the provider intends to take to counteract potential security risks that might be linked to the execution of the detection order on its services. The provider may consult the EU Centre, and in particular its Technology Committee, to obtain support in identifying appropriate measures in this respect;
Amendment 929 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a (a) draft an implementation plan setting out the specific person or persons the authority intends to investigate, the measures it envisages taking to execute the intended detection order, including detailed information regarding the envisaged technologies and safeguards;
Amendment 930 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point a (a) draft an implementation plan setting out the measures it envisages taking to execute the intended detection
Amendment 931 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b Amendment 932 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b)
Amendment 933 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b) where the draft implementation plan concerns an intended detection order concerning the
Amendment 934 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b) where the draft implementation plan concerns an intended detection order concerning the solicitation of children other than the renewal of a previously issued detection order without any substantive changes, conduct a data protection impact assessment, a child rights impact assessment of child sexual abuse risks and a prior consultation procedure as referred to in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to the measures set out in the implementation
Amendment 935 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point b (b) where the draft implementation plan concerns an intended detection order concerning new child sexual abuse material and the solicitation of children other than the renewal of a previously issued detection order without any substantive changes, conduct a data protection impact assessment and a prior consultation procedure as referred to in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to the measures set out in the implementation plan;
Amendment 936 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c Amendment 937 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c)
Amendment 938 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the data protection impact assessment, child rights impact assessment of child sexual abuse risks and in order to take into account the opinion of the data protection authority provided in
Amendment 939 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point c (c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary, in view of the outcome of the data protection impact assessment and in order to take
Amendment 940 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point d Amendment 941 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point d (d) submit to that
Amendment 942 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 2 – point d (d) submit to that Coordinating Authority the implementation plan, where applicable attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted
Amendment 943 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Amendment 944 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the
Amendment 945 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the
Amendment 946 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall submit the request for the issuance of the detection, adjusted
Amendment 947 #
Proposal for a regulation Article 7 – paragraph 3 – subparagraph 3 Where, having regard to the implementation plan of the provider and having utmost regard to the opinion of the data protection authority, that Coordinating Authority
Amendment 948 #
Proposal for a regulation Article 7 – paragraph 4 Amendment 949 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 Amendment 950 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority
Amendment 951 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority
Amendment 952 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part The Coordinating Authority of establishment shall request the issuance of the detection
Amendment 953 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part The Coordinating Authority of establishment shall request the issuance of the targeted detection order, and the competent judicial authority
Amendment 954 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – introductory part Amendment 955 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a Amendment 956 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a Amendment 957 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a (a) there is clear evidence of a s
Amendment 958 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a (a) there is evidence
Amendment 959 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a (a) there is
Amendment 960 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a (a) there is
Amendment 961 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a (a) there is evidence of
Amendment 962 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point a a (new) (aa) the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter;
Amendment 963 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b Amendment 964 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b Amendment 965 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b (b)
Amendment 966 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b (b) the reasons for issuing the detection
Amendment 967 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b a (new) (ba) the provider has failed to take all reasonable and proportionate mitigation measures within the meaning of Article 4 to prevent and minimise the risk of the service being used for the purpose of online child sexual abuse;
Amendment 968 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b a (new) (ba) The detection warrant does not affect the security and confidentiality of communications on a general scale.
Amendment 969 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b a (new) (ba) The detection order does not affect the security and confidentiality of communications on a general scale;
Amendment 970 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b b (new) (bb) The technology used to protect the communication, such as any kind of encryption, shall not be affected or undermined by the detection warrant.
Amendment 971 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b b (new) (bb) The technology used to protect the communication, such as any kind of encryption, shall not be affected or undermined by the detection order;
Amendment 972 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b c (new) (bc) All measures outlined in articles 3, 4 and 5 have been exhausted.
Amendment 973 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 1 – point b d (new) (bd) Nothing in the order can be construed as requiring or encouraging the provider to weaken, break, circumvent or otherwise undermine or limit the encryption, security, or other means of protecting the confidentiality of communications, of the platform or service of the provider.
Amendment 974 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 Amendment 975 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 Amendment 976 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 Amendment 977 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point -a (new) (-a) the availability of information to adequately describe the specific purpose and scope of the order, including the legal basis for the suspicion;
Amendment 978 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point a Amendment 979 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point a a (new) (aa) whether or not the prosecution or judge would have sufficient information to issue the warrant with instructions describing the specific purpose and scope regarding the envisaged technologies to execute the warrant, including the basis upon which the individuals concerned are suspects within the meaning of Union or national law;
Amendment 980 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point b Amendment 981 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point b (b) any additional information obtained pursuant to paragraph 2 or any other relevant information available to it, in particular regarding the use, design and operation of the service, regarding the provider’s financial and technological capabilities and size and regarding the potential consequences of the measures to be taken to execute the detection
Amendment 982 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point c Amendment 983 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point c Amendment 984 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point c (c) the views, including on the technical feasibility, and the implementation plan of the provider submitted in accordance with paragraph 3;
Amendment 985 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point d Amendment 986 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point d (d) the opinions of the
Amendment 987 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 2 – point d (d) the opinion
Amendment 988 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 3 Amendment 989 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 3 Amendment 990 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 3 As regards the second subparagraph, point (d), where that Coordinating Authority substantially deviates from the opinion of the
Amendment 991 #
Proposal for a regulation Article 7 – paragraph 4 – subparagraph 3 Amendment 992 #
Proposal for a regulation Article 7 – paragraph 5 Amendment 993 #
Proposal for a regulation Article 7 – paragraph 5 Amendment 994 #
Proposal for a regulation Article 7 – paragraph 5 Amendment 995 #
Proposal for a regulation Article 7 – paragraph 5 Amendment 996 #
Proposal for a regulation Article 7 – paragraph 5 Amendment 997 #
Proposal for a regulation Article 7 – paragraph 5 Amendment 998 #
Proposal for a regulation Article 7 – paragraph 5 – point a Amendment 999 #
Proposal for a regulation Article 7 – paragraph 5 – point b source: 749.192
|
History
(these mark the time of scraping, not the official date of the change)
committees/0 |
committees/0 |
committees/1 |
committees/1 |
committees/1/rapporteur |
committees/2 |
committees/2 |
committees/2/rapporteur |
committees/3 |
committees/3 |
committees/3/rapporteur |
committees/4 |
committees/4 |
committees/4/rapporteur |
committees/5 |
committees/6 |
committees/7 |
committees/8 |
committees/9 |
docs/3 |
docs/4 |
docs/5 |
docs/6 |
docs/7 |
docs/8 |
docs/9 |
docs/10 |
docs/11 |
docs/12 |
events/3 |
events/4 |
procedure/Other legal basis | Old: Rules of Procedure EP 159 → New: Rules of Procedure EP 165 |
procedure/dossier_of_the_committee/0 | LIBE/10/00194 |
procedure/dossier_of_the_committee/0 | LIBE/9/09061 |
procedure/legal_basis/0 | Rules of Procedure EP 57_o |
procedure/legal_basis/0 | Rules of Procedure EP 57 |
procedure/stage_reached | Old: Awaiting Parliament's position in 1st reading → New: Awaiting committee decision |
docs/13 |
events/5/summary |
events/7 |
events/6 |
docs/13 |
events/5 |
procedure/stage_reached | Old: Awaiting committee decision → New: Awaiting Parliament's position in 1st reading |
events/3 |
events/4 |
procedure/Other legal basis | Rules of Procedure EP 159 |
forecasts |
docs/14 |
docs/13/date | Old: 2023-05-14T00:00:00 → New: 2023-05-15T00:00:00 |
docs/14/date | Old: 2022-09-11T00:00:00 → New: 2022-09-12T00:00:00 |
docs/15/date | Old: 2023-02-21T00:00:00 → New: 2023-02-22T00:00:00 |
docs/16/date | Old: 2022-11-14T00:00:00 → New: 2022-11-15T00:00:00 |
docs/17/date | Old: 2023-07-17T00:00:00 → New: 2023-07-18T00:00:00 |
docs/18/date | Old: 2022-10-05T00:00:00 → New: 2022-10-06T00:00:00 |
docs/19/date | Old: 2023-03-29T00:00:00 → New: 2023-03-30T00:00:00 |
forecasts |
docs/17 |
docs/9 |
docs/9 |
docs/9/date | Old: 2023-07-27T00:00:00 → New: 2023-07-28T00:00:00 |
docs/10 |
docs/10 |
docs/10/date | Old: 2023-07-27T00:00:00 → New: 2023-07-28T00:00:00 |
docs/11 |
docs/11 |
docs/11/date | Old: 2023-07-27T00:00:00 → New: 2023-07-28T00:00:00 |
docs/5 |
docs/6 |
docs/7 |
docs/8 |
docs/8 |
docs/8/date | Old: 2023-05-30T00:00:00 → New: 2023-07-27T00:00:00 |
docs/9 |
docs/9 |
docs/9/date | Old: 2023-05-30T00:00:00 → New: 2023-07-27T00:00:00 |
docs/10 |
docs/10/date | Old: 2023-05-30T00:00:00 → New: 2023-07-27T00:00:00 |
docs/11 |
docs/11/date | Old: 2023-05-30T00:00:00 → New: 2023-07-27T00:00:00 |
docs/12 |
docs/12/date | Old: 2023-05-30T00:00:00 → New: 2023-07-28T00:00:00 |
docs/12 |
docs/11 |