
35 Amendments of Moritz KÖRNER related to 2022/0155(COD)

Amendment 192 #
Proposal for a regulation
Recital 17 a (new)
(17 a) Fundamental rights in the digital sphere have to be guaranteed to the same extent as in the offline world. Safety and privacy need to be ensured, amongst others through end-to-end encryption in private online communication and the protection of private content against any kind of general or targeted surveillance, be it by public or private actors.
2023/03/09
Committee: IMCO
Amendment 274 #
Proposal for a regulation
Article 2 a (new)
Article 2 a
End-to-End Encryption and Prohibition on General Monitoring
1. End-to-end encryption is essential to guarantee the security and confidentiality of the communications of users, including those of children. Any restriction of encryption could lead to abuse by malicious actors. Nothing in this Regulation should be interpreted as prohibiting providers of information society services from providing their services applying end-to-end encryption, or as restricting or undermining such encryption. Member States should not prevent providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and security of digital services, and effectively prevents unauthorised third-party access.
2. Nothing in this Regulation should undermine the prohibition of general monitoring under EU law.
2023/03/09
Committee: IMCO
Amendment 280 #
Proposal for a regulation
The European Parliament rejects the Commission proposal (COM(2022)0209).
2023/07/28
Committee: LIBE
Amendment 282 #
Proposal for a regulation
Citation 1
Having regard to the Treaty on the Functioning of the European Union, and in particular Article 16 and Article 114 thereof,
2023/07/28
Committee: LIBE
Amendment 287 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
— functionalities enabling protection of children and preventing online child sexual abuse;
2023/03/09
Committee: IMCO
Amendment 349 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures.
deleted
2023/03/09
Committee: IMCO
Amendment 370 #
Proposal for a regulation
Recital 22
(22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighed, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the likelihood and seriousness of any potential negative consequences for other parties affected. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
2023/07/28
Committee: LIBE
Amendment 374 #
Proposal for a regulation
Article 5 a (new)
Article 5 a
User notification mechanism
1. Without prejudice to Article 16 of Regulation (EU) 2022/2065, relevant information society service providers shall establish mechanisms, or use existing mechanisms, to allow any individual or entity to notify them of the presence on their service of potential online child sexual abuse, in particular of new child sexual abuse material and solicitation of children for sexual purposes. Those mechanisms shall be easy to access, user- and child-friendly, and allow for the submission of the notification exclusively by electronic means. Providers shall ensure that sufficient human and financial resources are allocated to ensure that notifications are effectively processed in a timely manner.
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of notifications flagging to the provider of a relevant information society service potential online child sexual abuse on the service, allowing that provider to identify alleged online child sexual abuse without a detailed legal examination, and containing a clear indication of the exact electronic location of that information and, where necessary and possible, additional information enabling the identification of the illegal content, adapted to the type of content.
3. Where the notification contains electronic contact information of the individual or entity that submitted it, the provider of the relevant information society service shall, without undue delay, send a confirmation of receipt of the notification and inform that individual or entity of its decision and the actions taken in relation to the notification.
2023/03/09
Committee: IMCO
Amendment 375 #
Proposal for a regulation
Article 6
Article 6
Obligations for software application stores
1. Providers of software application stores shall:
(a) make reasonable efforts to assess, where possible together with the providers of software applications, whether each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children;
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).
2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3.
3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures.
4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
deleted
2023/03/09
Committee: IMCO
Amendment 390 #
Proposal for a regulation
Chapter II – Section 2
[...] deleted
2023/03/09
Committee: IMCO
Amendment 392 #
Proposal for a regulation
Article 7
[...] deleted
2023/03/09
Committee: IMCO
Amendment 417 #
Proposal for a regulation
Recital 30
(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
2023/07/28
Committee: LIBE
Amendment 458 #
[...] deleted
2023/03/09
Committee: IMCO
Amendment 485 #
Proposal for a regulation
Article 9
[...] deleted
2023/03/09
Committee: IMCO
Amendment 496 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the internal market by persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
2023/07/28
Committee: LIBE
Amendment 503 #
Proposal for a regulation
Article 10
[...] deleted
2023/03/09
Committee: IMCO
Amendment 530 #
Proposal for a regulation
Article 11
Article 11
Guidelines regarding detection obligations
The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of Articles 7 to 10, having due regard in particular to relevant technological developments and the manners in which the services covered by those provisions are offered and used.
deleted
2023/03/09
Committee: IMCO
Amendment 553 #
Proposal for a regulation
Article 14 – paragraph 1
1. Removal orders shall be issued by judicial authorities in line with Article 9 on orders to act against illegal content of Regulation (EU) 2022/2065.
2023/03/09
Committee: IMCO
Amendment 556 #
Proposal for a regulation
Article 14 – paragraph 2
2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof.
deleted
2023/03/09
Committee: IMCO
Amendment 594 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) “person suspected of being involved in child sexual abuse” means an identified individual person about whom adequate verifiable evidence exists which gives rise to the suspicion that that person has committed a child sexual abuse offence, attempted to commit a child sexual abuse offence, or prepared, by committing a criminal offence, to commit a child sexual abuse offence;
2023/07/28
Committee: LIBE
Amendment 596 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) “person disqualified from exercising activities involving children” means an identified individual person who, in line with Article 10 of Directive 2011/93/EU, is temporarily or permanently disqualified from exercising activities involving direct and regular contacts with children;
2023/07/28
Committee: LIBE
Amendment 807 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take reasonable and proportionate mitigation measures.
2023/07/28
Committee: LIBE
Amendment 861 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) inform the software application provider concerned and the EU Centre about the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
2023/07/28
Committee: LIBE
Amendment 868 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).
deleted
2023/07/28
Committee: LIBE
Amendment 870 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
1a. Providers of software applications who have been informed that in relation to their software applications a significant risk of use of the service concerned for the purpose of the solicitation of children has been identified, shall take reasonable and proportionate mitigation measures.
2023/07/28
Committee: LIBE
Amendment 890 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific service in the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
2023/07/28
Committee: LIBE
Amendment 892 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific service.
2023/07/28
Committee: LIBE
Amendment 951 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – introductory part
The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority or independent administrative authority shall issue the detection order where it considers that the following conditions are met:
2023/07/28
Committee: LIBE
Amendment 1128 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communication services that have received a detection order concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
2023/07/28
Committee: LIBE
Amendment 1266 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States of one or more specific items of material that, after a diligent assessment, courts identified as constituting child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1269 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States of one or more specific items of material that, after a diligent assessment, the Coordinating Authority or the courts or other independent administrative authorities referred to in Article 36(1) identified as constituting child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1294 #
Proposal for a regulation
Chapter II – Section 5
[...] deleted
2023/07/28
Committee: LIBE
Amendment 1332 #
Proposal for a regulation
Article 19 a (new)
Article 19a
Respect for Privacy
Nothing in this Regulation shall be interpreted as a requirement to:
1. break cryptography;
2. scan content on users’ devices;
3. restrict anonymous access to online services and software applications.
2023/07/28
Committee: LIBE
Amendment 1698 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1) concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
2023/07/28
Committee: LIBE
Amendment 1701 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
To that aim, the EU Centre shall compile lists of such technologies, having regard to the requirements of this Regulation and in particular those of Article 10(2) and Article 19a (new).
2023/07/28
Committee: LIBE