19 Amendments of Dragoş TUDORACHE related to 2020/0259(COD)
Amendment 62 #
Proposal for a regulation
Citation 4 a (new)
having regard to the principles established by the 1989 United Nations Convention on the Rights of the Child and its Optional Protocol on the sale of children, child prostitution and child pornography,
Amendment 64 #
Proposal for a regulation
Recital 3
(3) In accordance with Article 6(1) of the Treaty on European Union, the Union recognises the rights, freedoms and principles set out in the Charter of Fundamental Rights of the European Union. Article 7 of the Charter of Fundamental Rights of the European Union (“the Charter”) protects the fundamental right of everyone to the respect for his or her private and family life, home and communications, which includes the confidentiality of communications. Article 8 of the Charter contains the right to protection of personal data. Article 3(1) of the 1989 United Nations Convention on the Rights of the Child ("UNCRC") and Article 24(2) of the Charter provide that, in all actions relating to children, whether taken by public authorities or private institutions, the child’s best interests must be a primary consideration. Articles 3(3) of the UNCRC and 24(1) of the Charter furthermore evoke the right of children to protection and care as is necessary for their well-being.
Amendment 74 #
Proposal for a regulation
Recital 5
(5) Certain providers of number-independent interpersonal communications services, such as webmail and messaging services, are already using specific technologies to detect and report child sexual abuse online to law enforcement authorities and to organisations acting in the public interest against child sexual abuse and child sexual exploitation, or to remove child sexual abuse and child sexual exploitation material, on a voluntary basis. Those organisations refer to national hotlines for reporting child sexual abuse and child sexual exploitation material, as well as to organisations whose purpose is to reduce child sexual abuse and child sexual exploitation, and prevent child victimisation, located both within the Union and in third countries. Collectively, those voluntary activities play a valuable role in enabling the identification and rescue of victims, and reducing the further dissemination of child sexual abuse material and child sexual exploitation material, which constitutes a gross violation of the right to privacy of the child, while also contributing to the identification and investigation of offenders, and the prevention of child sexual abuse and child sexual exploitation offences.
Amendment 109 #
Proposal for a regulation
Recital 11
(11) Since the sole objective of this Regulation is to enable the continuation of certain existing activities aimed at combating child sexual abuse online, the derogation provided for by this Regulation should be limited to well-established technology that is regularly used by number-independent interpersonal communications services for the purpose of detecting and reporting child sexual abuse online and removing child sexual abuse material before the entry into force of this Regulation. The reference to the technology includes where necessary any human review directly relating to the use of the technology and overseeing it. The use of the technology in question should therefore be common in the industry, without it necessarily being required that all providers use the technology and without precluding the further evolution of the technology in a privacy-friendly manner. In this respect, it should be immaterial whether or not a particular provider that seeks to rely on this derogation itself already uses such technology on the date of entry into force of this Regulation. The types of technologies deployed should be the least privacy-intrusive in accordance with the state of the art in the industry and should not include systematic filtering and scanning of communications containing text but only look into specific communications in case of concrete elements of suspicion of child sexual abuse online. The technologies deployed should not be able to understand the content of the communications but solely be able to detect patterns of possible child sexual abuse.
Amendment 123 #
Proposal for a regulation
Recital 16
(16) This Regulation restricts the right to protection of the confidentiality of communications and derogates from the decision taken in Directive (EU) 2018/1972 to subject number-independent interpersonal communications services to the same rules as all other electronic communications services as regards privacy. The period of application of this Regulation should, therefore, be limited until 31 December 2025, that is to say for a time period reasonably required for the adoption of a new long-term legal framework, with more elaborate safeguards. This new legal framework will provide a new legal basis and mandatory requirements for companies to detect and report child sexual abuse online and remove child sexual abuse and child sexual exploitation material online. The new legal framework should also incorporate more elaborate safeguards, as well as the creation of a European Centre to prevent and counter child sexual abuse, to improve transparency and accountability. If the long-term legislation is adopted and enters into force before that date, that legislation should repeal this Regulation.
Amendment 136 #
Proposal for a regulation
Recital 18 a (new)
(18a) The images and videos depicting child sexual abuse concern the child's intimacy, and are therefore special categories of data whose processing to enable their dissemination is unlawful. Companies should not be prevented from taking measures to prevent that processing and ensure that their services are not abused for the purpose of disseminating images and videos of child sexual abuse.
Amendment 143 #
Proposal for a regulation
Article 1 – paragraph 1
This Regulation lays down temporary and strictly limited rules derogating from certain obligations laid down in Directive 2002/58/EC, with the sole objective of enabling providers of number-independent interpersonal communications services to continue the use of technologies for the processing of personal and other data to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse and child sexual exploitation material on their services.
Amendment 149 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point b – introductory part
(b) solicitation of children for the purpose of engaging in sexual activities with a child or of producing child pornography as:
Amendment 151 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point b – point i
(i) the proposal by an adult to meet a child for the purpose of committing any of the offences referred to in Articles 3(4) and 5(6) of Directive 2011/93/EU;
Amendment 152 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point b – point ii
(ii) an attempt to commit the offences provided for in Article 5(2) and (3) of Directive 2011/93/EU by an adult soliciting a child to provide child pornography depicting that child;
Amendment 153 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point b – point iii
Amendment 156 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point c a (new)
(ca) 'child prostitution' as defined in Article 2(d) of Directive 2011/93/EU.
Amendment 167 #
Proposal for a regulation
Article 3 – paragraph 1 – point a
(a) the processing is proportionate and limited to well-established technologies regularly used by providers of number-independent interpersonal communications services for that purpose before the entry into force of this Regulation, and that are in accordance with the state of the art used in the industry and are the least privacy-intrusive;
Amendment 170 #
Proposal for a regulation
Article 3 – paragraph 1 – point a a (new)
(aa) the provider clarifies, in its annual reporting, the legal basis for the processing of personal data pursuant to Regulation (EU) 2016/679;
Amendment 172 #
Proposal for a regulation
Article 3 – paragraph 1 – point b
(b) the technology used is in itself sufficiently reliable in that it limits to the maximum extent possible the rate of errors regarding the detection of child sexual abuse online;
Amendment 175 #
Proposal for a regulation
Article 3 – paragraph 1 – point b a (new)
(ba) the provider puts in place redress mechanisms to ensure that users who believe that they have been wrongfully included in a report of child sexual abuse online can refer their cases to the provider for review, and, where an error has occurred, its consequences are rectified without delay;
Amendment 176 #
Proposal for a regulation
Article 3 – paragraph 1 – point b b (new)
(bb) all the instances of possible child sexual abuse online, after accurate human review, are reported to law enforcement authorities and to organisations acting in the public interest against child sexual abuse, to enable the identification of the child victims and as a safeguard to identify detection errors by the providers;
Amendment 202 #
Proposal for a regulation
Article 3 a (new)
Article 3a
Obligation for a data protection impact assessment
In order to rely on the derogation provided for by this Regulation, providers of number-independent interpersonal communications services shall conduct a data protection impact assessment where required by Article 35 of Regulation (EU) 2016/679 where:
(a) processing falling within the requirements of Article 3 is already underway, by ... [three months after the date of entry into force of this Regulation]; or
(b) processing falling within the requirements of Article 3 is not already underway, prior to commencing such processing.
Point (a) shall not apply where a data protection impact assessment has been conducted prior to the entry into force of this Regulation. Point (a) shall not have the effect of requiring the suspension of such processing while the data protection impact assessment is conducted.
Amendment 207 #
Proposal for a regulation
Article 3 b (new)
Article 3b
Public interest and legitimate interest of providers
For the purposes of this Regulation, the detection and reporting of child sexual abuse online and the removal of child sexual abuse material online shall be considered to be
(a) a legitimate interest of providers of number-independent interpersonal communications services, within the meaning of point (f) of Article 6(1) of Regulation (EU) 2016/679; and
(b) a task carried out in the public interest, within the meaning of point (e) of Article 6(1) of Regulation (EU) 2016/679.