35 Amendments of Moritz KÖRNER related to 2022/0155(COD)
Amendment 192 #
Proposal for a regulation
Recital 17 a (new)
(17 a) Fundamental rights in the digital sphere have to be guaranteed to the same extent as in the offline world. Safety and privacy need to be ensured, inter alia through end-to-end encryption in private online communication and the protection of private content against any kind of general or targeted surveillance, be it by public or private actors.
Amendment 274 #
Proposal for a regulation
Article 2 a (new)
Article 2 a
End-to-End Encryption and Prohibition on General Monitoring
1. End-to-end encryption is essential to guarantee the security and confidentiality of the communications of users, including those of children. Any restriction of encryption could lead to abuse by malicious actors. Nothing in this Regulation should be interpreted as prohibiting providers of information society services from providing their services applying end-to-end encryption, or as restricting or undermining such encryption. Member States should not prevent providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and security of digital services and effectively prevents unauthorised third-party access.
2. Nothing in this Regulation should undermine the prohibition of general monitoring under EU law.
Amendment 280 #
Proposal for a regulation
–
The European Parliament rejects the Commission proposal (COM(2022)0209).
Amendment 282 #
Proposal for a regulation
Citation 1
Having regard to the Treaty on the Functioning of the European Union, and in particular Article 16 and Article 114 thereof,
Amendment 287 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
— functionalities enabling protection of children and preventing online child sexual abuse;
Amendment 349 #
Proposal for a regulation
Article 4 – paragraph 3
Amendment 370 #
Proposal for a regulation
Recital 22
(22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighed, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the likelihood and seriousness of any potential negative consequences for other parties affected. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
Amendment 374 #
Proposal for a regulation
Article 5 a (new)
Amendment 375 #
Proposal for a regulation
Article 6
Amendment 390 #
Proposal for a regulation
Chapter II – Section 2
Amendment 392 #
Proposal for a regulation
Article 7
Amendment 417 #
Proposal for a regulation
Recital 30
(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
Amendment 458 #
Amendment 485 #
Proposal for a regulation
Article 9
Amendment 496 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the internal market by persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
Amendment 503 #
Proposal for a regulation
Article 10
Amendment 530 #
Proposal for a regulation
Article 11
Amendment 553 #
Proposal for a regulation
Article 14 – paragraph 1
1. Removal orders shall be issued by judicial authorities in line with Article 9 (Orders to act against illegal content) of Regulation (EU) 2022/2065.
Amendment 556 #
Proposal for a regulation
Article 14 – paragraph 2
Amendment 594 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) “person suspected of being involved in child sexual abuse” means an identified individual person about whom verifiable, adequate evidence exists which gives rise to the suspicion that that person has committed a child sexual abuse offence, attempted to commit a child sexual abuse offence, or prepared, by committing a criminal offence, to commit a child sexual abuse offence;
Amendment 596 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) 'person disqualified from exercising activities involving children' means an identified individual person who, in line with Article 10 of Directive 2011/93/EU, is temporarily or permanently disqualified from exercising activities involving direct and regular contacts with children;
Amendment 807 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take reasonable and proportionate mitigation measures.
Amendment 861 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) inform the software application provider concerned and the EU Centre about the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
Amendment 868 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
Amendment 870 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
1a. Providers of software applications that have been informed that a significant risk of use of the service concerned for the purpose of the solicitation of children has been identified in relation to their software applications shall take reasonable and proportionate mitigation measures.
Amendment 890 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific service in the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
Amendment 892 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific service.
Amendment 951 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – introductory part
The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority or independent administrative authority shall issue the detection order where it considers that the following conditions are met:
Amendment 1128 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communication services that have received a detection order concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
Amendment 1266 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States of one or more specific items of material that, after a diligent assessment, courts identified as constituting child sexual abuse material.
Amendment 1269 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States of one or more specific items of material that, after a diligent assessment, the Coordinating Authority or the courts or other independent administrative authorities referred to in Article 36(1) identified as constituting child sexual abuse material.
Amendment 1294 #
Proposal for a regulation
Chapter II – Section 5
Amendment 1332 #
Proposal for a regulation
Article 19 a (new)
Article 19a
Respect for Privacy
Nothing in this Regulation shall be interpreted as a requirement to:
1. break cryptography;
2. scan content on users’ devices;
3. restrict anonymous access to online services and software applications.
Amendment 1698 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1) concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
Amendment 1701 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
To that aim, the EU Centre shall compile lists of such technologies, having regard to the requirements of this Regulation and in particular those of Article 10(2) and Article 19a (new).