24 Amendments of Maria GRAPINI related to 2022/0155(COD)
Amendment 237 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in EU Member States.
Amendment 284 #
Proposal for a regulation
Recital 1
(1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), and to protect society at large. Users of such services offered in the Union must have the assurance that they are protected against any type of sexual abuse in the online environment, and should be able to trust that the services concerned can be used safely, especially by children.
Amendment 286 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 2
— concrete measures taken to enforce such prohibitions and restrictions;
Amendment 291 #
Proposal for a regulation
Recital 2
(2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures that offer users greater security and minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services.
Amendment 293 #
Proposal for a regulation
Recital 2
(2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable, reliable and tangible measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent, combat and report such abuse. The measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services.
Amendment 295 #
Proposal for a regulation
Recital 3
(3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level, offering internet users greater security in terms of combating online child sexual abuse.
Amendment 304 #
Proposal for a regulation
Recital 4
(4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent, combat and report child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in a technology-neutral and future-proof manner, so as not to hamper innovation.
Amendment 324 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall take concrete mitigation measures, tailored to the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:
Amendment 342 #
Proposal for a regulation
Article 4 – paragraph 2 – point a
(a) effective and efficient in mitigating the identified risk;
Amendment 354 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall immediately take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures.
Amendment 364 #
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall transmit, within 30 days of the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
Amendment 368 #
Proposal for a regulation
Article 5 – paragraph 2
2. Within 60 days after receiving the report, the Coordinating Authority of establishment shall assess it and determine, on that basis and taking into account any other relevant information available to it, whether the risk assessment has been carried out or updated and the mitigation measures have been taken in accordance with the requirements of Articles 3 and 4.
Amendment 518 #
Proposal for a regulation
Article 10 – paragraph 4 – point c
(c) ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, human intervention;
Amendment 557 #
Proposal for a regulation
Article 14 – paragraph 2
2. The provider shall execute the removal order as soon as possible and in any event within no more than 24 hours of receipt thereof.
Amendment 701 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 3
Amendment 706 #
Proposal for a regulation
Article 3 – paragraph 4 – subparagraph 2 – introductory part
Subsequently, the provider shall update the risk assessment where necessary and at least once every three years from the date at which it last carried out or updated the risk assessment. However:
Amendment 789 #
Proposal for a regulation
Article 4 – paragraph 2 – point b
(b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk as well as the provider’s financial capabilities and the number of users;
Amendment 834 #
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall transmit, by one month from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
Amendment 841 #
Proposal for a regulation
Article 5 – paragraph 2
2. Within one month after receiving the report, the Coordinating Authority of establishment shall assess it and determine, on that basis and taking into account any other relevant information available to it, whether the risk assessment has been carried out or updated and the mitigation measures have been taken in accordance with the requirements of Articles 3 and 4.
Amendment 850 #
Proposal for a regulation
Article 5 – paragraph 6 – subparagraph 1 (new)
Service providers, the EU Centre and all European and national authorities managing the personal data of children or adults are required to comply with the GDPR.
Amendment 920 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point d
(d) invite the EU Centre to provide its opinion on the draft request, within a time period of two weeks from the date of receiving the draft request.
Amendment 1116 #
Proposal for a regulation
Article 9 – paragraph 3 – subparagraph 1
Where the period of application of the detection order exceeds six months, or three months in the case of a detection order concerning the solicitation of children, the Coordinating Authority of establishment shall require the provider to report to it on the execution of the detection order at least once, halfway through the period of application.
Amendment 1177 #
Proposal for a regulation
Article 10 – paragraph 4 – point c
(c) ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, human intervention; ensure that any malfunctions or defects in the technologies used are remedied within a matter of hours;
Amendment 1194 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point a
(a) the fact that it operates technologies to detect online child sexual abuse to execute the detection order, the ways in which it operates those technologies and the impact on the confidentiality of users’ communications and on personal data protection;