100 Amendments of Rob ROOKEN related to 2022/0155(COD)
Amendment 277 #
Proposal for a regulation
–
The European Parliament rejects the Commission proposal (COM(2022)0209).
Amendment 279 #
Proposal for a regulation
–
The European Parliament rejects the Commission proposal (COM(2022)0209).
Amendment 311 #
Proposal for a regulation
Recital 5
(5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available number independent interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting, are equally at risk of misuse, they should also be covered by this Regulation. However, given the inherent differences between the various relevant information society services covered by this Regulation, the related varying risks that those services are misused for the purpose of online child sexual abuse and the varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate manner.
Amendment 326 #
Proposal for a regulation
Recital 14
(14) With a view to minimising the risk that their services are misused for the dissemination of known or new child sexual abuse material or the solicitation of children, providers of hosting services and providers of publicly available number independent interpersonal communications services should assess such risk for each of the services that they offer in the Union. To guide their risk assessment, a non-exhaustive list of elements to be taken into account should be provided. To allow for a full consideration of the specific characteristics of the services they offer, providers should be allowed to take account of additional elements where relevant. As risks evolve over time, as a function of developments such as those related to technology and the manners in which the services in question are offered and used, it is appropriate to ensure that the risk assessment is updated regularly and when needed for particular reasons.
Amendment 338 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number independent interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
Amendment 348 #
Proposal for a regulation
Recital 18
(18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available number independent interpersonal communications services should, when designing and implementing the mitigation measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the financial and technological capabilities and the size of the provider concerned. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such as those based on industry best practices, including as established through self-regulatory cooperation, and those contained in guidelines from the Commission. When no risk has been detected after a diligently conducted or updated risk assessment, providers should not be required to take any mitigation measures.
Amendment 354 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, and where the provider refuses to cooperate with Coordinating Authorities and the Centre, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders as a last resort. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services.
Amendment 374 #
Proposal for a regulation
Recital 23
(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted and specified so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available number independent interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
Amendment 382 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available number independent interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be tailored to the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an essential tool to guarantee the security and confidentiality of the communications of users, including those of children. Detection orders should not under any circumstances be interpreted as prohibiting, weakening or breaking (including de facto) encryption or (including de facto) leading to the creation of any kind of backdoor. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
If the provider reasonably believes that complying with a detection order will inevitably lead to undermining the security and confidentiality of the communications of its users, it should suspend its execution and challenge it in accordance with the procedure outlined in Article 9.
Amendment 383 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. In accordance with Article 6a, nothing in this Regulation shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content or communications through client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
Amendment 389 #
Proposal for a regulation
Recital 26 a (new)
(26a) End-to-end encryption is an essential tool to guarantee the security, privacy and confidentiality of the communications between users, including those of children. Any weakening of end-to-end encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. The processing of any data that would compromise or put at risk the integrity and confidentiality of such end-to-end encrypted content shall be understood as compromising the integrity of end-to-end encrypted content and communications. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications.
Amendment 392 #
Proposal for a regulation
Recital 26 a (new)
(26a) The act of ‘breaking’ encryption refers to the act of defeating or bypassing the encryption protocol used to secure a communication. Any access by a third party that was not meant to access, read or edit the content of a communication intended to be private and secure should be considered as bypassing encryption.
Amendment 406 #
Proposal for a regulation
Recital 28
(28) With a view to constantly assessing the performance of the detection technologies and ensuring that they are sufficiently reliable, as well as to identifying false positives and avoiding, to the extent possible, erroneous reporting to the EU Centre, providers should ensure human oversight and, where necessary, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular assessment of the rates of false negatives and positives generated by the technologies, based on an analysis of anonymised representative data samples. In particular where the detection of the solicitation of children in publicly available number independent interpersonal communications is concerned, service providers should ensure regular, specific and detailed human oversight and human verification of conversations identified by the technologies as involving potential solicitation of children.
Amendment 409 #
Proposal for a regulation
Recital 29
(29) Providers of hosting services and providers of publicly available number independent interpersonal communications services should report on online child sexual abuse on their services, whenever they become aware of it, that is, when there are serious grounds to believe that a particular activity may constitute online child sexual abuse. Where such serious grounds exist, doubts about the potential victim’s age should not prevent those providers from submitting reports. In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. Such awareness could, for example, be obtained through the execution of detection orders, information flagged by users or organisations acting in the public interest against child sexual abuse, or activities conducted on the providers’ own initiative. Wherever possible, those providers should report a minimum of information, as specified in this Regulation, for competent law enforcement authorities to be able to assess whether to initiate an investigation, where relevant, and should ensure that the reports are as complete as possible before submitting them.
Amendment 425 #
Proposal for a regulation
Recital 35
(35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims should therefore have the right to obtain, upon request, from the EU Centre via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available number independent interpersonal communications services in accordance with this Regulation.
Amendment 456 #
Proposal for a regulation
Recital 60
(60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available number independent interpersonal communications services and providers of internet access services. However, for that same reason, the EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of hosting services, the provision of assistance to Coordinating Authorities, as well as the generation and sharing of knowledge and expertise related to online child sexual abuse.
Amendment 462 #
Proposal for a regulation
Recital 63
(63) For the purpose of ensuring the traceability of the reporting process and of any follow-up activity undertaken based on reporting, as well as of allowing for the provision of feedback on reporting to providers of hosting services and providers of publicly available number independent interpersonal communications services, generating statistics concerning reports and the reliable and swift management and processing of reports, the EU Centre should create a dedicated database of such reports. To be able to fulfil the above purposes, that database should also contain relevant information relating to those reports, such as the indicators representing the material and ancillary tags, which can indicate, for example, the fact that a reported image or video is part of a series of images and videos depicting the same victim or victims.
Amendment 468 #
Proposal for a regulation
Recital 65
(65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks, reports should pass through the EU Centre. The EU Centre should assess those reports in order to identify those that are manifestly unfounded, that is, where it is immediately evident, without any substantive legal or factual analysis, that the reported activities do not constitute online child sexual abuse. Where the report is manifestly unfounded, the EU Centre should provide feedback to the reporting provider of hosting services or provider of publicly available number independent interpersonal communications services in order to allow for improvements in the technologies and processes used and for other appropriate steps, such as reinstating material wrongly removed. As every report could be an important means to investigate and prosecute the child sexual abuse offences concerned and to rescue the victim of the abuse, reports should be processed as quickly as possible.
Amendment 489 #
Proposal for a regulation
Recital 75
(75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available number independent interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and other relevant national authorities of the Member State that designated the Coordinating Authority in question in gathering that information.
Amendment 493 #
Proposal for a regulation
Recital 78
(78) Regulation (EU) 2021/1232 of the European Parliament and of the Council45 provides for a temporary solution in respect of the use of technologies by certain providers of publicly available number independent interpersonal communications services for the purpose of combating online child sexual abuse, pending the preparation and adoption of a long-term legal framework. This Regulation provides that long-term legal framework. Regulation (EU) 2021/1232 should therefore be repealed. _________________ 45 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (OJ L 274, 30.7.2021, p. 41).
Amendment 499 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of interpersonal communication services to detect and report online child sexual abuse where there is reasonable cause to suspect such illegal behaviour;
Amendment 504 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of interpersonal communication services to cooperate with the EU Centre;
Amendment 505 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point c
(c) obligations on providers of hosting services to remove or disable access to known child sexual abuse material on their services;
Amendment 514 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d
(d) obligations on providers of internet access services to disable access to known child sexual abuse material;
Amendment 540 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
4a. In order to ensure the fundamental rights laid down in the human rights charters of the European Union, the Council of Europe and the United Nations, which are core fundaments of our democratic society and the rule of law, citizens' right to privacy and private correspondence must be upheld. Therefore, detection orders may only be issued towards persons suspected of criminal activity. There shall be no general monitoring of the private messages of ordinary law-abiding citizens and users of interpersonal communication services.
Amendment 614 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of number independent interpersonal communications services shall identify, analyse and assess, for each such service that they offer, any recurrent systemic risk of use of the service for the purpose of online child sexual abuse.
Amendment 619 #
Proposal for a regulation
Article 3 – paragraph 1 b (new)
1b. Risk assessment obligations shall always be strictly necessary and proportionate, and shall never entail a general monitoring obligation, an obligation to seek knowledge about the content of private communications, nor an obligation for providers to seek knowledge of illegal content.
Amendment 621 #
Proposal for a regulation
Article 3 – paragraph 2 – point a
(a) any recurrent systemic risks and identified instances of use of its services for the purpose of online child sexual abuse;
Amendment 629 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability of functionalities to address the systemic risks referred to in paragraph 1, including through the following:
Amendment 637 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
Amendment 647 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily accessible and age-appropriate;
Amendment 667 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point i
(i) the extent to which the service is targeting child users;
Amendment 671 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point ii
(ii) where the service is targeting child users, the different age groups of the child users and the risk of solicitation of children in relation to those age groups;
Amendment 672 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii
Amendment 678 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 1
– enabling users to search for other users on services directly targeting child users;
Amendment 680 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 2
– enabling users to establish contact with other users on services directly targeting child users, in particular through private communications;
Amendment 683 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 3
– enabling users to share images or videos with other users on services directly targeting child users, in particular through private communications.
Amendment 715 #
Proposal for a regulation
Article 3 – paragraph 5
Amendment 721 #
Proposal for a regulation
Article 3 – paragraph 6
6. The Commission, in cooperation with Coordinating Authorities, the European Data Protection Board and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
Amendment 729 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number independent interpersonal communications services shall take reasonable mitigation measures, tailored to the systemic risks identified pursuant to Article 3, to minimise such risks. Such measures, where applicable and technically feasible without being detrimental to the technical integrity or operating model of the platform or service, and without being detrimental to the confidentiality of the communications on that service, shall include some or all of the following:
Amendment 778 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
Amendment 791 #
Proposal for a regulation
Article 4 – paragraph 2 – point c
(c) applied in a diligent and non-discriminatory manner, with full assessment, in all circumstances, of the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected, ensuring in particular that they respect rights to privacy, data protection and freedom of expression and protect the integrity and security of platforms and services, including those that are end-to-end encrypted;
Amendment 802 #
Proposal for a regulation
Article 4 – paragraph 3
Amendment 810 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of number independent interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a systemic risk of use of their services for the purpose of the solicitation of children, may take proportionate measures to reliably identify child users on their services or to give the child user the opportunity to consensually identify themselves.
Amendment 817 #
Proposal for a regulation
Article 4 – paragraph 4
4. Providers of hosting services and providers of number independent interpersonal communications services shall clearly describe in their terms and conditions the mitigation measures that they have taken. That description shall not include information that may reduce the effectiveness of the mitigation measures.
Amendment 832 #
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number independent interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
Amendment 851 #
Proposal for a regulation
Article 6
Amendment 856 #
Proposal for a regulation
Article 6
Amendment 875 #
Proposal for a regulation
Article 6 a (new)
Article 6a
End-to-end encrypted services
Nothing in this Regulation shall be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. The processing of any data that would compromise or put at risk the integrity and confidentiality of the content and communications under end-to-end encryption shall be understood as compromising that integrity. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communications services provides third party actors with access to the end-to-end encrypted content.
Amendment 877 #
Proposal for a regulation
Article 6 a (new)
Article 6a
Encrypted services and metadata processing
1. Nothing in this Regulation shall be interpreted as prohibiting or weakening end-to-end encryption.
Amendment 879 #
Proposal for a regulation
Article 7
Amendment 896 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
1a. Detection orders shall only target providers of hosting services or providers of number independent interpersonal communications services that fail to comply with the requirements outlined in Articles 3, 4 and 5 of this Regulation. They shall only be issued once all the measures in the abovementioned Articles have been exhausted, and shall only target providers that can reasonably be expected to have the technical and operational ability to act.
Amendment 957 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a
(a) there is clear evidence of a systemic risk that the service is being used for the purpose of online child sexual abuse, within the meaning of paragraphs 5, 6 and 7, as applicable;
Amendment 969 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b a (new)
Article 7 – paragraph 4 – subparagraph 1 – point b a (new)
(ba) The detection order does not affect the security and confidentiality of communications on a general scale;
Amendment 971 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b b (new)
Article 7 – paragraph 4 – subparagraph 1 – point b b (new)
(bb) The technology used to protect the communication, such as any kind of encryption, shall not be affected or undermined by the detection order;
Amendment 972 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b c (new)
Article 7 – paragraph 4 – subparagraph 1 – point b c (new)
(bc) All measures outlined in Articles 3, 4 and 5 have been exhausted.
Amendment 973 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b d (new)
Article 7 – paragraph 4 – subparagraph 1 – point b d (new)
(bd) Nothing in the order can be construed as requiring or encouraging the provider to weaken, break, circumvent or otherwise undermine or limit the encryption, security, or other means of protecting the confidentiality of communications, of the platform or service of the provider.
Amendment 996 #
Proposal for a regulation
Article 7 – paragraph 5
Article 7 – paragraph 5
Amendment 1002 #
Proposal for a regulation
Article 7 – paragraph 6
Article 7 – paragraph 6
Amendment 1008 #
Proposal for a regulation
Article 7 – paragraph 7
Article 7 – paragraph 7
Amendment 1011 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1
Article 7 – paragraph 7 – subparagraph 1
Amendment 1052 #
Proposal for a regulation
Article 8
Article 8
Amendment 1075 #
Proposal for a regulation
Article 8 – paragraph 1 – point e a (new)
Article 8 – paragraph 1 – point e a (new)
(ea) the person or group of persons covered by the detection order and the specifics of the suspicion of illegal activities;
Amendment 1099 #
Proposal for a regulation
Article 9
Article 9
Amendment 1104 #
Proposal for a regulation
Article 9 – paragraph 1
Article 9 – paragraph 1
1. Providers of hosting services and providers of number independent interpersonal communications services that have received a detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the detection order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection order.
Amendment 1124 #
Proposal for a regulation
Article 10
Article 10
Amendment 1157 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
Article 10 – paragraph 3 – point d a (new)
(da) focused on communications where there is an established suspicion of illegal activity, and the technologies shall not lead to general monitoring of private communications;
Amendment 1167 #
Proposal for a regulation
Article 10 – paragraph 4
Article 10 – paragraph 4
Amendment 1206 #
Proposal for a regulation
Article 11
Article 11
Amendment 1218 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of number independent interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of any information indicating potential online child sexual abuse on its services, it shall promptly submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
Amendment 1236 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
Article 13 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 1274 #
Proposal for a regulation
Article 14 – paragraph 2
Article 14 – paragraph 2
2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium-sized enterprises, including open source providers, the removal order shall allow additional time, proportionate to the size and resources of the provider.
Amendment 1298 #
Proposal for a regulation
Article 16
Article 16
Amendment 1308 #
Proposal for a regulation
Article 17
Article 17
Amendment 1324 #
Proposal for a regulation
Article 18
Article 18
Amendment 1327 #
Proposal for a regulation
Article 19 – paragraph 1
Article 19 – paragraph 1
Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, or because of the voluntary measures they take to remove or disable access to child sexual abuse material on their services.
Amendment 1379 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Providers of hosting services and providers of number independent interpersonal communications services shall preserve the content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
Amendment 1527 #
Proposal for a regulation
Chapter IV – title
Chapter IV – title
IV JOINT CENTRE TO PREVENT AND COMBAT CHILD SEXUAL ABUSE
Amendment 1529 #
Proposal for a regulation
Article 40 – title
Article 40 – title
Establishment and scope of action of the Joint Centre
Amendment 1530 #
Proposal for a regulation
Article 40 – paragraph 1
Article 40 – paragraph 1
1. An intergovernmental Agency to prevent and combat child sexual abuse, the Joint Centre on Child Sexual Abuse, is established.
Amendment 1533 #
Proposal for a regulation
Article 40 – paragraph 2
Article 40 – paragraph 2
2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse and gather and share information and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online. Its remit and powers shall not be expanded without prior evaluation and unanimous decision by Member States.
Amendment 1534 #
Proposal for a regulation
Article 40 – paragraph 2
Article 40 – paragraph 2
2. The Joint Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse and gather and share information and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online.
Amendment 1537 #
Proposal for a regulation
Article 41 – paragraph 1
Article 41 – paragraph 1
1. The Joint Centre shall be an intergovernmental body with legal personality in a Member State.
Amendment 1538 #
Proposal for a regulation
Article 41 – paragraph 2
Article 41 – paragraph 2
2. In each of the Member States the EU Centre shall fully comply with and respect their laws. It may, with the consent of the Member State concerned, acquire and dispose of movable and immovable property and be party to legal proceedings.
Amendment 1557 #
Proposal for a regulation
Article 43 – paragraph 1 – point 2 – point c
Article 43 – paragraph 1 – point 2 – point c
(c) giving providers of hosting services and providers of number independent interpersonal communications services that received a detection order access to the relevant databases of indicators in accordance with Article 46;
Amendment 1608 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
Article 44 – paragraph 2 – point a
(a) relevant indicators, consisting of digital identifiers to be used to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, on hosting services and number independent interpersonal communications services, generated by the EU Centre in accordance with paragraph 3;
Amendment 1619 #
Proposal for a regulation
Article 45 – paragraph 1
Article 45 – paragraph 1
1. The EU Centre shall create, maintain and operate a database for the reports submitted to it by providers of hosting services and providers of number independent interpersonal communications services in accordance with Article 12(1) and assessed and processed in accordance with Article 48.
Amendment 1624 #
Proposal for a regulation
Article 45 – paragraph 2 – point c
Article 45 – paragraph 2 – point c
Amendment 1635 #
Proposal for a regulation
Article 46 – paragraph 2
Article 46 – paragraph 2
2. The EU Centre shall give providers of hosting services, providers of number independent interpersonal communications services and providers of internet access services access to the databases of indicators referred to in Article 44, where and to the extent necessary for them to execute the detection or blocking orders that they received in accordance with Articles 7 or 16. It shall take measures to ensure that such access remains limited to what is strictly necessary for the period of application of the detection or blocking orders concerned and that such access does not in any way endanger the proper operation of those databases and the accuracy and security of the data contained therein.
Amendment 1663 #
Proposal for a regulation
Article 48 – paragraph 1
Article 48 – paragraph 1
1. The EU Centre shall expeditiously assess and process reports submitted by providers of hosting services and providers of number independent interpersonal communications services in accordance with Article 12 to determine whether the reports are manifestly unfounded or are to be forwarded.
Amendment 1684 #
Proposal for a regulation
Article 49
Article 49
Amendment 1699 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available technologies that providers of hosting services and providers of number independent interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1).
Amendment 1798 #
Proposal for a regulation
Article 66 – paragraph 1
Article 66 – paragraph 1
1. The Technology Committee shall consist of technical, privacy and data protection experts appointed by the Management Board in view of their excellence and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union.
Amendment 1808 #
Proposal for a regulation
Article 83 – paragraph 1 – introductory part
Article 83 – paragraph 1 – introductory part
1. Providers of hosting services, providers of number independent interpersonal communications services and providers of internet access services shall collect data on the following topics and make that information available to the EU Centre upon request:
Amendment 1836 #
Proposal for a regulation
Article 83 – paragraph 2 – point b
Article 83 – paragraph 2 – point b
(b) the most important and recurrent risks of online child sexual abuse, as reported by providers of hosting services and providers of number independent interpersonal communications services in accordance with Article 3 or identified through other information available to the Coordinating Authority;
Amendment 1841 #
Proposal for a regulation
Article 83 – paragraph 2 – point c
Article 83 – paragraph 2 – point c
(c) a list of the providers of hosting services and providers of number independent interpersonal communications services to which the Coordinating Authority addressed a detection order in accordance with Article 7;
Amendment 1860 #
Proposal for a regulation
Article 83 – paragraph 3 – point c
Article 83 – paragraph 3 – point c
(c) the total number of reports submitted to the EU Centre in accordance with Article 12, broken down by provider of hosting services and provider of number independent interpersonal communications services that submitted the report and by Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3);
Amendment 1874 #
Proposal for a regulation
Article 83 – paragraph 4
Article 83 – paragraph 4
4. The providers of hosting services, providers of number independent interpersonal communications services and providers of internet access services, the Coordinating Authorities and the EU Centre shall ensure that the data referred to in paragraphs 1, 2 and 3, respectively, is stored no longer than is necessary for the transparency reporting referred to in Article 84. The data stored shall not contain any personal data.
Amendment 1894 #
Proposal for a regulation
Annex I – Section 4 – paragraph 3
Annex I – Section 4 – paragraph 3
Where the detection order concerns the solicitation of children, in accordance with Article 7(7), last subparagraph, of the Regulation, the detection order applies only to publicly available number independent interpersonal communications where one of the users is a child user, as defined in Article 2, point (i), of the Regulation.
Amendment 1905 #
Proposal for a regulation
Annex III – Section 2 – point 4
Annex III – Section 2 – point 4