
100 Amendments of Rob ROOKEN related to 2022/0155(COD)

Amendment 277 #
Proposal for a regulation
The European Parliament rejects the Commission proposal (COM(2022)0209).
2023/07/28
Committee: LIBE
Amendment 279 #
Proposal for a regulation
The European Parliament rejects the Commission proposal (COM(2022)0209).
2023/07/28
Committee: LIBE
Amendment 311 #
Proposal for a regulation
Recital 5
(5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available number independent interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting are equally at risk of misuse, they should also be covered by this Regulation. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate manner.
2023/07/28
Committee: LIBE
Amendment 326 #
Proposal for a regulation
Recital 14
(14) With a view to minimising the risk that their services are misused for the dissemination of known or new child sexual abuse material or the solicitation of children, providers of hosting services and providers of publicly available number independent interpersonal communications services should assess such risk for each of the services that they offer in the Union. To guide their risk assessment, a non-exhaustive list of elements to be taken into account should be provided. To allow for a full consideration of the specific characteristics of the services they offer, providers should be allowed to take account of additional elements where relevant. As risks evolve over time, in function of developments such as those related to technology and the manners in which the services in question are offered and used, it is appropriate to ensure that the risk assessment is updated regularly and when needed for particular reasons.
2023/07/28
Committee: LIBE
Amendment 338 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number independent interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
2023/07/28
Committee: LIBE
Amendment 348 #
Proposal for a regulation
Recital 18
(18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available number independent interpersonal communications services should, when designing and implementing the mitigation measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the financial and technological capabilities and the size of the provider concerned. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such as those based on industry best practices, including as established through self-regulatory cooperation, and those contained in guidelines from the Commission. When no risk has been detected after a diligently conducted or updated risk assessment, providers should not be required to take any mitigation measures.
2023/07/28
Committee: LIBE
Amendment 354 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, and where the provider refuses to cooperate with Coordinating Authorities and the Centre, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders as a last resort. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services.
2023/07/28
Committee: LIBE
Amendment 374 #
Proposal for a regulation
Recital 23
(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted and specified so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available number independent interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
2023/07/28
Committee: LIBE
Amendment 382 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available number independent interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be tailored to the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an essential tool to guarantee the security and confidentiality of the communications of users, including those of children. Detection orders should not under any circumstances be interpreted as prohibiting, weakening and breaking (including de facto) encryption or (including de facto) leading to the creation of any kind of backdoor. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
If the provider reasonably believes that complying with a detection order will inevitably lead to undermining the security and confidentiality of the communications of its users, it should suspend its execution and challenge it in accordance with the procedure outlined in Article 9.
2023/07/28
Committee: LIBE
Amendment 383 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. In accordance with Article 6a, nothing in this Regulation shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content or communications through client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
2023/07/28
Committee: LIBE
Amendment 389 #
Proposal for a regulation
Recital 26 a (new)
(26a) End-to-end encryption is an essential tool to guarantee the security, privacy and confidentiality of the communications between users, including those of children. Any weakening of the end-to-end encryption's effect could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. As compromising the integrity of end-to-end encrypted content and communications shall be understood the processing of any data that would compromise or put at risk the integrity and confidentiality of the aforementioned end-to-end encrypted content. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors access to the end-to-end encrypted content and communications.
2023/07/28
Committee: LIBE
Amendment 392 #
Proposal for a regulation
Recital 26 a (new)
(26a) The act of ‘breaking’ encryption refers to the act of defeating or bypassing the encryption protocol used to secure a communication. Any access by any third party that was not meant to access, read or edit the content of that communication that was supposed to be private and secure should be considered as bypassing encryption.
2023/07/28
Committee: LIBE
Amendment 406 #
Proposal for a regulation
Recital 28
(28) With a view to constantly assess the performance of the detection technologies and ensure that they are sufficiently reliable, as well as to identify false positives and avoid to the extent possible erroneous reporting to the EU Centre, providers should ensure human oversight and, where necessary, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular assessment of the rates of false negatives and positives generated by the technologies, based on an analysis of anonymised representative data samples. In particular where the detection of the solicitation of children in publicly available number independent interpersonal communications is concerned, service providers should ensure regular, specific and detailed human oversight and human verification of conversations identified by the technologies as involving potential solicitation of children.
2023/07/28
Committee: LIBE
Amendment 409 #
Proposal for a regulation
Recital 29
(29) Providers of hosting services and providers of publicly available interpersonal communications services are uniquely positioned to detect potential online child sexual abuse involving their services. The information that they may obtain when offering their services is often indispensable to effectively investigate and prosecute child sexual abuse offences. Therefore, number independent interpersonal communications services should report on online child sexual abuse on their services, whenever they become aware of it, that is, when there are serious grounds to believe that a particular activity may constitute online child sexual abuse. Where such reasonable grounds exist, doubts about the potential victim’s age should not prevent those providers from submitting reports. In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. Such awareness could, for example, be obtained through the execution of detection orders, information flagged by users or organisations acting in the public interest against child sexual abuse, or activities conducted on the providers’ own initiative. Wherever possible, those providers should report a minimum of information, as specified in this Regulation, for competent law enforcement authorities to be able to assess whether to initiate an investigation, where relevant, and should ensure that the reports are as complete as possible before submitting them.
2023/07/28
Committee: LIBE
Amendment 425 #
Proposal for a regulation
Recital 35
(35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available number independent interpersonal communications services in accordance with this Regulation.
2023/07/28
Committee: LIBE
Amendment 456 #
Proposal for a regulation
Recital 60
(60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available number independent interpersonal communications services and providers of internet access services. However, for that same reason, the EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of hosting services, the provision of assistance to Coordinating Authorities, as well as the generation and sharing of knowledge and expertise related to online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 462 #
Proposal for a regulation
Recital 63
(63) For the purpose of ensuring the traceability of the reporting process and of any follow-up activity undertaken based on reporting, as well as of allowing for the provision of feedback on reporting to providers of hosting services and providers of publicly available number independent interpersonal communications services, generating statistics concerning reports and the reliable and swift management and processing of reports, the EU Centre should create a dedicated database of such reports. To be able to fulfil the above purposes, that database should also contain relevant information relating to those reports, such as the indicators representing the material and ancillary tags, which can indicate, for example, the fact that a reported image or video is part of a series of images and videos depicting the same victim or victims.
2023/07/28
Committee: LIBE
Amendment 468 #
Proposal for a regulation
Recital 65
(65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks, reports should pass through the EU Centre. The EU Centre should assess those reports in order to identify those that are manifestly unfounded, that is, where it is immediately evident, without any substantive legal or factual analysis, that the reported activities do not constitute online child sexual abuse. Where the report is manifestly unfounded, the EU Centre should provide feedback to the reporting provider of hosting services or provider of publicly available number independent interpersonal communications services in order to allow for improvements in the technologies and processes used and for other appropriate steps, such as reinstating material wrongly removed. As every report could be an important means to investigate and prosecute the child sexual abuse offences concerned and to rescue the victim of the abuse, reports should be processed as quickly as possible.
2023/07/28
Committee: LIBE
Amendment 489 #
Proposal for a regulation
Recital 75
(75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available number independent interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and other relevant national authorities of the Member State that designated the Coordinating Authority in question in gathering that information.
2023/07/28
Committee: LIBE
Amendment 493 #
Proposal for a regulation
Recital 78
(78) Regulation (EU) 2021/1232 of the European Parliament and of the Council45 provides for a temporary solution in respect of the use of technologies by certain providers of publicly available number independent interpersonal communications services for the purpose of combating online child sexual abuse, pending the preparation and adoption of a long-term legal framework. This Regulation provides that long-term legal framework. Regulation (EU) 2021/1232 should therefore be repealed. _________________ 45 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (OJ L 274, 30.7.2021, p. 41).
2023/07/28
Committee: LIBE
Amendment 499 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of interpersonal communication services to detect and report online child sexual abuse where there is reasonable cause to suspect such illegal behaviour;
2023/07/28
Committee: LIBE
Amendment 504 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of interpersonal communication services to cooperate with the EU Centre;
2023/07/28
Committee: LIBE
Amendment 505 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point c
(c) obligations on providers of hosting services to remove or disable access to known child sexual abuse material on their services;
2023/07/28
Committee: LIBE
Amendment 514 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d
(d) obligations on providers of internet access services to disable access to known child sexual abuse material;
2023/07/28
Committee: LIBE
Amendment 540 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
4a. To ensure the fundamental rights laid down in the European Union's, the Council of Europe's and the United Nations' human rights charters, core foundations of our democratic society and the rule of law, citizens' right to privacy and private correspondence must be upheld. Therefore, detection orders can only be issued towards persons suspected of criminal activity. There shall be no general monitoring of the private messages of ordinary law-abiding citizens and users of interpersonal communication services.
2023/07/28
Committee: LIBE
Amendment 614 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of number independent interpersonal communications services shall identify, analyse and assess, for each such service that they offer, any recurrent systemic risk of use of the service for the purpose of online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 619 #
Proposal for a regulation
Article 3 – paragraph 1 b (new)
1b. Risk assessment obligations shall always be strictly necessary and proportionate, and shall never entail a general monitoring obligation, an obligation to seek knowledge about the content of private communications, nor an obligation for providers to seek knowledge of illegal content.
2023/07/28
Committee: LIBE
Amendment 621 #
Proposal for a regulation
Article 3 – paragraph 2 – point a
(a) any recurrent systemic risks and identified instances of use of its services for the purpose of online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 629 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability of functionalities to address the systemic risks referred to in paragraph 1, including through the following:
2023/07/28
Committee: LIBE
Amendment 637 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
– functionalities enabling age verification;deleted
2023/07/28
Committee: LIBE
Amendment 647 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily accessible and age-appropriate;
2023/07/28
Committee: LIBE
Amendment 667 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point i
(i) the extent to which the service is targeting child users;
2023/07/28
Committee: LIBE
Amendment 671 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point ii
(ii) where the service is targeting child users, the different age groups of the child users and the risk of solicitation of children in relation to those age groups;
2023/07/28
Committee: LIBE
Amendment 672 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii
(iii) the availability of functionalities creating or reinforcing the risk of solicitation of children, including the following functionalities: — enabling users to search for other users and, in particular, for adult users to search for child users; — enabling users to establish contact with other users directly, in particular through private communications; — enabling users to share images or videos with other users, in particular through private communications.deleted
2023/07/28
Committee: LIBE
Amendment 678 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 1
– enabling users to search for other users on services directly targeting child users;
2023/07/28
Committee: LIBE
Amendment 680 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 2
– enabling users to establish contact with other users on services directly targeting child users, in particular through private communications;
2023/07/28
Committee: LIBE
Amendment 683 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 3
– enabling users to share images or videos on services directly targeting child users, in particular through private communications.
2023/07/28
Committee: LIBE
Amendment 715 #
Proposal for a regulation
Article 3 – paragraph 5
5. The risk assessment shall include an assessment of any potential remaining risk that, after taking the mitigation measures pursuant to Article 4, the service is used for the purpose of online child sexual abuse.deleted
2023/07/28
Committee: LIBE
Amendment 721 #
Proposal for a regulation
Article 3 – paragraph 6
6. The Commission, in cooperation with Coordinating Authorities, the European Data Protection Board and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
2023/07/28
Committee: LIBE
Amendment 729 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number independent interpersonal communications services shall take reasonable mitigation measures, tailored to the systemic risks identified pursuant to Article 3, to minimise such risks. Such measures, where applicable and technically feasible without being detrimental to the technical integrity or operating model of the platform or service, and without being detrimental to the confidentiality of the communications on that service, shall include some or all of the following:
2023/07/28
Committee: LIBE
Amendment 778 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
1a. Risk mitigation obligations shall always be strictly necessary and proportionate, and shall never entail a general monitoring obligation, an obligation to seek knowledge about the content of private communications, contrary to Article 5 of the ePrivacy Directive, nor an obligation for providers to seek knowledge of illegal content.
2023/07/28
Committee: LIBE
Amendment 791 #
Proposal for a regulation
Article 4 – paragraph 2 – point c
(c) applied in a diligent and non-discriminatory manner, with full assessment, in all circumstances, of the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected and in particular, that they respect rights to privacy, data protection and freedom of expression and protect the integrity and security of platforms and services, including those that are end-to-end encrypted;
2023/07/28
Committee: LIBE
Amendment 802 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures.deleted
2023/07/28
Committee: LIBE
Amendment 810 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of number independent interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a systemic risk of use of their services for the purpose of the solicitation of children, may take proportionate measures to reliably identify child users on their services, enabling them to take the mitigation measures, or to give the child user the opportunity to consensually identify themselves.
2023/07/28
Committee: LIBE
Amendment 817 #
Proposal for a regulation
Article 4 – paragraph 4
4. Providers of hosting services and providers of number independent interpersonal communications services shall clearly describe in their terms and conditions the mitigation measures that they have taken. That description shall not include information that may reduce the effectiveness of the mitigation measures.
2023/07/28
Committee: LIBE
Amendment 832 #
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number independent interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
2023/07/28
Committee: LIBE
Amendment 851 #
Proposal for a regulation
Article 6
Article 6 deleted
Obligations for software application stores
1. Providers of software application stores shall: (a) make reasonable efforts to assess, where possible together with the providers of software applications, whether each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children; (b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children; (c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b). 2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3. 3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures. 4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
2023/07/28
Committee: LIBE
Amendment 856 #
Proposal for a regulation
Article 6
Obligations for software application stores
1. Providers of software application stores shall: (a) make reasonable efforts to assess, where possible together with the providers of software applications, whether each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children; (b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children; (c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b). 2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3. 3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures. 4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
deleted
2023/07/28
Committee: LIBE
Amendment 875 #
Proposal for a regulation
Article 6 a (new)
Article 6a
End-to-end encrypted services
Nothing in this Regulation shall be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. Compromising the integrity of end-to-end encrypted content and communications shall be understood as the processing of any data that would compromise or put at risk the integrity and confidentiality of the content and communications under end-to-end encryption. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communications services provides third-party actors access to the end-to-end encrypted content.
2023/07/28
Committee: LIBE
Amendment 877 #
Proposal for a regulation
Article 6 a (new)
Article 6a
Encrypted services and metadata processing
1. Nothing in this Regulation shall be interpreted as prohibiting or weakening end-to-end encryption.
2023/07/28
Committee: LIBE
Amendment 879 #
Proposal for a regulation
Article 7
[...]deleted
2023/07/28
Committee: LIBE
Amendment 896 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
1a. Detection orders shall only target providers of hosting services or providers of number independent interpersonal communications services that fail to comply with the requirements outlined in Articles 3, 4 and 5 of this Regulation. They shall only be issued once all the measures in the abovementioned Articles have been exhausted, and shall target only providers that can reasonably be expected to have the technical and operational ability to act.
2023/07/28
Committee: LIBE
Amendment 957 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a
(a) there is clear evidence of a systemic risk that the service is being used for the purpose of online child sexual abuse, within the meaning of paragraphs 5, 6 and 7, as applicable;
2023/07/28
Committee: LIBE
Amendment 969 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b a (new)
(ba) The detection order does not affect the security and confidentiality of communications on a general scale;
2023/07/28
Committee: LIBE
Amendment 971 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b b (new)
(bb) The technology used to protect the communication, such as any kind of encryption, shall not be affected or undermined by the detection order;
2023/07/28
Committee: LIBE
Amendment 972 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b c (new)
(bc) All measures outlined in Articles 3, 4 and 5 have been exhausted.
2023/07/28
Committee: LIBE
Amendment 973 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b d (new)
(bd) Nothing in the order can be construed as requiring or encouraging the provider to weaken, break, circumvent or otherwise undermine or limit the encryption, security, or other means of protecting the confidentiality of communications, of the platform or service of the provider.
2023/07/28
Committee: LIBE
Amendment 996 #
Proposal for a regulation
Article 7 – paragraph 5
5. As regards detection orders concerning the dissemination of known child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) it is likely, despite any mitigation measures that the provider may have taken or will take, that the service is used, to an appreciable extent, for the dissemination of known child sexual abuse material; (b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent for the dissemination of known child sexual abuse material.
deleted
2023/07/28
Committee: LIBE
Amendment 1002 #
Proposal for a regulation
Article 7 – paragraph 6
6. As regards detection orders concerning the dissemination of new child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the dissemination of new child sexual abuse material; (b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the dissemination of new child sexual abuse material; (c) for services other than those enabling the live transmission of pornographic performances as defined in Article 2, point (e), of Directive 2011/93/EU: (1) a detection order concerning the dissemination of known child sexual abuse material has been issued in respect of the service; (2) the provider submitted a significant number of reports concerning known child sexual abuse material, detected through the measures taken to execute the detection order referred to in point (1), pursuant to Article 12.
deleted
2023/07/28
Committee: LIBE
Amendment 1008 #
Proposal for a regulation
Article 7 – paragraph 7
7. As regards detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) the provider qualifies as a provider of interpersonal communication services; (b) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the solicitation of children; (c) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the solicitation of children. The detection orders concerning the solicitation of children shall apply only to interpersonal communications where one of the users is a child user.
deleted
2023/07/28
Committee: LIBE
Amendment 1011 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1
As regards detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) the provider qualifies as a provider of interpersonal communication services; (b) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the solicitation of children; (c) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the solicitation of children.
deleted
2023/07/28
Committee: LIBE
Amendment 1052 #
Proposal for a regulation
Article 8
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1075 #
Proposal for a regulation
Article 8 – paragraph 1 – point e a (new)
(ea) the person or group of persons covered by the detection order and specifics of the suspicion of illegal activities;
2023/07/28
Committee: LIBE
Amendment 1099 #
Proposal for a regulation
Article 9
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1104 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of hosting services and providers of number independent interpersonal communications services that have received a detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the detection order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection order.
2023/07/28
Committee: LIBE
Amendment 1124 #
Proposal for a regulation
Article 10
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1157 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) focused on communications where there is an established suspicion of illegal activity, and the technologies shall not lead to general monitoring of private communications;
2023/07/28
Committee: LIBE
Amendment 1167 #
Proposal for a regulation
Article 10 – paragraph 4
4. The provider shall: (a) take all the necessary measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly necessary to execute the detection orders addressed to them; (b) establish effective internal procedures to prevent and, where necessary, detect and remedy any misuse of the technologies, indicators and personal data and other data referred to in point (a), including unauthorised access to, and unauthorised transfers of, such personal data and other data; (c) ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, human intervention; (d) establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner; (e) inform the Coordinating Authority, at the latest one month before the start date specified in the detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3); (f) regularly review the functioning of the measures referred to in points (a), (b), (c) and (d) of this paragraph and adjust them where necessary to ensure that the requirements set out therein are met, as well as document the review process and the outcomes thereof and include that information in the report referred to in Article 9(3).
deleted
2023/07/28
Committee: LIBE
Amendment 1206 #
Proposal for a regulation
Article 11
Article 11
Guidelines regarding detection obligations
The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of Articles 7 to 10, having due regard in particular to relevant technological developments and the manners in which the services covered by those provisions are offered and used.
deleted
2023/07/28
Committee: LIBE
Amendment 1218 #
Proposal for a regulation
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of number independent interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of any information indicating potential online child sexual abuse on its services, it shall promptly submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
2023/07/28
Committee: LIBE
Amendment 1236 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
2023/07/28
Committee: LIBE
Amendment 1274 #
Proposal for a regulation
Article 14 – paragraph 2
2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium-sized enterprises, including open-source providers, the removal order shall allow additional time, proportionate to the size and resources of the provider.
2023/07/28
Committee: LIBE
Amendment 1298 #
Proposal for a regulation
Article 16
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1308 #
Proposal for a regulation
Article 17
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1324 #
Proposal for a regulation
Article 18
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1327 #
Proposal for a regulation
Article 19 – paragraph 1
Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, or because of the voluntary measures they take to remove or disable access to child sexual abuse material on their services.
2023/07/28
Committee: LIBE
Amendment 1379 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Providers of hosting services and providers of number independent interpersonal communications services shall preserve the content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
2023/07/28
Committee: LIBE
Amendment 1527 #
Proposal for a regulation
Chapter IV – title
IV JOINT CENTRE TO PREVENT AND COMBAT CHILD SEXUAL ABUSE
2023/07/28
Committee: LIBE
Amendment 1529 #
Proposal for a regulation
Article 40 – title
Establishment and scope of action of the Joint Centre
2023/07/28
Committee: LIBE
Amendment 1530 #
Article 40 – paragraph 1
1. An intergovernmental Agency to prevent and combat child sexual abuse, the Joint Centre on Child Sexual Abuse, is established.
2023/07/28
Committee: LIBE
Amendment 1533 #
Proposal for a regulation
Article 40 – paragraph 2
2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse and gather and share information and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online. Its remit and powers shall not be expanded without prior evaluation and unanimous decision by Member States.
2023/07/28
Committee: LIBE
Amendment 1534 #
Proposal for a regulation
Article 40 – paragraph 2
2. The Joint Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse and gather and share information and expertise and facilitate cooperation between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online.
2023/07/28
Committee: LIBE
Amendment 1537 #
Proposal for a regulation
Article 41 – paragraph 1
1. The Joint Centre shall be an intergovernmental body with legal personality in a Member State.
2023/07/28
Committee: LIBE
Amendment 1538 #
Proposal for a regulation
Article 41 – paragraph 2
2. In each of the Member States the EU Centre shall fully comply with and respect their laws. It may, with the consent of the Member State concerned, acquire and dispose of movable and immovable property and be party to legal proceedings.
2023/07/28
Committee: LIBE
Amendment 1557 #
Proposal for a regulation
Article 43 – paragraph 1 – point 2 – point c
(c) giving providers of hosting services and providers of number independent interpersonal communications services that received a detection order access to the relevant databases of indicators in accordance with Article 46;
2023/07/28
Committee: LIBE
Amendment 1608 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
(a) relevant indicators, consisting of digital identifiers to be used to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, on hosting services and number independent interpersonal communications services, generated by the EU Centre in accordance with paragraph 3;
2023/07/28
Committee: LIBE
Amendment 1619 #
Proposal for a regulation
Article 45 – paragraph 1
1. The EU Centre shall create, maintain and operate a database for the reports submitted to it by providers of hosting services and providers of number independent interpersonal communications services in accordance with Article 12(1) and assessed and processed in accordance with Article 48.
2023/07/28
Committee: LIBE
Amendment 1624 #
Proposal for a regulation
Article 45 – paragraph 2 – point c
(c) where the EU Centre forwarded the report in accordance with Article 48(3), the date and time of such forwarding and the name of the competent law enforcement authority or authorities to which it forwarded the report or, where applicable, information on the reasons for forwarding the report solely to Europol for further analysis;
deleted
2023/07/28
Committee: LIBE
Amendment 1635 #
Proposal for a regulation
Article 46 – paragraph 2
2. The EU Centre shall give providers of hosting services, providers of number independent interpersonal communications services and providers of internet access services access to the databases of indicators referred to in Article 44, where and to the extent necessary for them to execute the detection or blocking orders that they received in accordance with Articles 7 or 16. It shall take measures to ensure that such access remains limited to what is strictly necessary for the period of application of the detection or blocking orders concerned and that such access does not in any way endanger the proper operation of those databases and the accuracy and security of the data contained therein.
2023/07/28
Committee: LIBE
Amendment 1663 #
Proposal for a regulation
Article 48 – paragraph 1
1. The EU Centre shall expeditiously assess and process reports submitted by providers of hosting services and providers of number independent interpersonal communications services in accordance with Article 12 to determine whether the reports are manifestly unfounded or are to be forwarded.
2023/07/28
Committee: LIBE
Amendment 1684 #
Proposal for a regulation
Article 49
Article 49
Searches and notification
1. The EU Centre shall have the power to conduct searches on hosting services for the dissemination of publicly accessible child sexual abuse material, using the relevant indicators from the database of indicators referred to in Article 44(1), points (a) and (b), in the following situations: (a) where so requested to support a victim by verifying whether the provider of hosting services removed or disabled access to one or more specific items of known child sexual abuse material depicting the victim, in accordance with Article 21(4), point (c); (b) where so requested to assist a Coordinating Authority by verifying the possible need for the issuance of a detection order or a removal order in respect of a specific service or the effectiveness of a detection order or a removal order that the Coordinating Authority issued, in accordance with Article 25(7), points (c) and (d), respectively. 2. The EU Centre shall have the power to notify, after having conducted the searches referred to in paragraph 1, providers of hosting services of the presence of one or more specific items of known child sexual abuse material on their services and request them to remove or disable access to that item or those items, for the providers’ voluntary consideration. The request shall clearly set out the identification details of the EU Centre and a contact point, the necessary information for the identification of the item or items, as well as the reasons for the request. The request shall also clearly state that it is for the provider’s voluntary consideration. 3. Where so requested by a competent law enforcement authority of a Member State in order to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences, the EU Centre shall not submit a notice, for as long as necessary to avoid such interference but no longer than 18 months.
deleted
2023/07/28
Committee: LIBE
Amendment 1699 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available technologies that providers of hosting services and providers of number independent interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1).
2023/07/28
Committee: LIBE
Amendment 1798 #
Proposal for a regulation
Article 66 – paragraph 1
1. The Technology Committee shall consist of technical, privacy and data protection experts appointed by the Management Board in view of their excellence and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union.
2023/07/28
Committee: LIBE
Amendment 1808 #
Proposal for a regulation
Article 83 – paragraph 1 – introductory part
1. Providers of hosting services, providers of number independent interpersonal communications services and providers of internet access services shall collect data on the following topics and make that information available to the EU Centre upon request:
2023/07/28
Committee: LIBE
Amendment 1836 #
Proposal for a regulation
Article 83 – paragraph 2 – point b
(b) the most important and recurrent risks of online child sexual abuse, as reported by providers of hosting services and providers of number independent interpersonal communications services in accordance with Article 3 or identified through other information available to the Coordinating Authority;
2023/07/28
Committee: LIBE
Amendment 1841 #
Proposal for a regulation
Article 83 – paragraph 2 – point c
(c) a list of the providers of hosting services and providers of number independent interpersonal communications services to which the Coordinating Authority addressed a detection order in accordance with Article 7;
2023/07/28
Committee: LIBE
Amendment 1860 #
Proposal for a regulation
Article 83 – paragraph 3 – point c
(c) the total number of reports submitted to the EU Centre in accordance with Article 12, broken down by provider of hosting services and provider of number independent interpersonal communications services that submitted the report and by Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3);
2023/07/28
Committee: LIBE
Amendment 1874 #
Proposal for a regulation
Article 83 – paragraph 4
4. The providers of hosting services, providers of number independent interpersonal communications services and providers of internet access services, the Coordinating Authorities and the EU Centre shall ensure that the data referred to in paragraphs 1, 2 and 3, respectively, is stored no longer than is necessary for the transparency reporting referred to in Article 84. The data stored shall not contain any personal data.
2023/07/28
Committee: LIBE
Amendment 1894 #
Proposal for a regulation
Annex I – Section 4 – paragraph 3
Where the detection order concerns the solicitation of children, in accordance with Article 7(7), last subparagraph, of the Regulation, the detection order applies only to publicly available number independent interpersonal communications where one of the users is a child user, as defined in Article 2, point (i), of the Regulation.
2023/07/28
Committee: LIBE
Amendment 1905 #
Proposal for a regulation
Annex III – Section 2 – point 4
4) Other available data related to the reported potential online child sexual abuse, including metadata related to media files (date, time, time zone): (Text – attach data as necessary)
deleted
2023/07/28
Committee: LIBE