Activities of Hilde VAUTMANS related to 2020/0259(COD)

Plenary speeches (1)

Use of technologies for the processing of data for the purpose of combating online child sexual abuse (temporary derogation from Directive 2002/58/EC) (debate)
2021/07/05
Dossiers: 2020/0259(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online
2020/12/02
Committee: FEMM
Dossiers: 2020/0259(COD)
Documents: PDF(208 KB) DOC(171 KB)
Authors: Christine ANDERSON (MEP ID 197475)

Amendments (27)

Amendment 6 #
Proposal for a regulation
Recital 4
(4) Sexual abuse and sexual exploitation of children constitute serious violations of human rights, in particular of the rights of children to be protected from all forms of violence, abuse and neglect, maltreatment or exploitation, including sexual abuse, as provided for by the 1989 United Nations Convention on the Rights of the Child and by the Charter. In addition, the Istanbul Convention recognises that girls are often exposed to serious forms of gender-based violence, including cyberviolence. Digitisation has brought about many benefits for society and the economy, but also challenges, notably increased child sexual abuse and child sexual exploitation online, which has been exacerbated during the COVID-19 pandemic, resulting from broader access to potential victims and a sharp rise in the exchange of child sexual abuse material between child sexual offenders. There is also a growing number of cases of grooming during the COVID-19 pandemic, including an increase of self-generated content. Moreover, the increased misuse of privacy-enhancing technologies by offenders to disguise their horrendous actions has made it more difficult for law-enforcement authorities to prevent, detect, investigate and prosecute child sexual exploitation online. According to Europol, the proliferation of anonymisation tools and the higher amount of child sexual abuse material may also lead to a higher risk of repeat victimisation8a. The protection of children online is one of the Union's priorities, as children are the most vulnerable in our society and not able to defend themselves. _________________ 8a Europol report "Exploiting isolation: Offenders and victims of online child sexual abuse during the Covid-19 pandemic", published on 19 June 2020.
2020/11/13
Committee: FEMM
Amendment 7 #
Proposal for a regulation
Recital 4 a (new)
(4 a) Girls and young women are particularly exposed to the risks of sexual abuse, as well as sexual exploitation, and account for the overwhelming majority of cases of child sexual abuse online. According to THORN and the Canadian Centre for Child Protection, 80 % of the child victims of sexual abuse were girls. Figures from a 2019 report from INHOPE show that 91 % of victims were girls, 7 % were boys, and the median age of victims is decreasing, with 92 % of victims under the age of 13. According to the End Child Prostitution, Child Pornography & Trafficking of Children for Sexual Purposes (ECPAT) international report from 2017, child sexual offenders are predominantly male10a, which is relevant when it comes to the definition of key indicators. It is therefore important that girls and boys have access to safe, accessible and age-appropriate channels to report the abuse without fear, in particular when the abuser is in the inner circle of the victim, since in such instances the reporting is low. _________________ 10a ECPAT Journal “End Child Sexual Exploitation international report”, published in April 2017; https://www.ecpat.org/wp-content/uploads/2017/04/Journal_No12-ebook.pdf
2020/11/13
Committee: FEMM
Amendment 8 #
Proposal for a regulation
Recital 4 b (new)
(4 b) On 24 July 2020, the Commission adopted an EU strategy for a more effective fight against child sexual abuse9b (“the Strategy”), which aims to provide an effective response, at Union level, to the crime of child sexual abuse, with due regard to the different forms of sexual abuse experienced by girls and boys. As part of the Strategy, the Commission announced that it will propose sector-specific legislation including “clear mandatory obligations to detect and report child and young girls sexual abuse online to bring more clarity and certainty to the work of both law enforcement and relevant actors in the private sector to tackle online abuse”. Notwithstanding the Strategy, there is a great need for preventive measures and a more targeted approach to take into account the specific circumstances and needs of various vulnerable groups of children, in particular girls. _________________ 9b Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, EU strategy for a more effective fight against child sexual abuse, 24.7.2020 COM(2020)0607 final.
2020/11/13
Committee: FEMM
Amendment 9 #
Proposal for a regulation
Recital 5
(5) Number-independent communication services have a major role to play in detecting cases of child sexual abuse online and in removing at source child sexual abuse material from their networks to avoid further victimisation, as every new visualisation of the material is harmful for the victim. Underage children must have access to safe, accessible and age-appropriate channels to report the abuse without fear, in particular when the abuser is in the inner circle of the victim. Certain providers of number-independent interpersonal communications services, such as webmail and messaging services, are already using specific technologies to detect and report child sexual abuse online to law enforcement authorities and to organisations acting in the public interest against child sexual abuse and child exploitation, or to detect, remove and report child sexual abuse material in their services, on a voluntary basis. To enable the identification of the child victims and to properly identify detection errors by the providers, all the instances of possible child sexual abuse online should be reported to law enforcement authorities and to organisations acting in the public interest against child sexual abuse. Those organisations refer to national hotlines for reporting child sexual abuse material, as well as to organisations whose purpose is to reduce child sexual exploitation and prevent child victimisation, located both within the Union and in third countries. Collectively, those voluntary activities play a valuable role in enabling the identification and rescue of victims, and reducing the further dissemination of child sexual abuse and child sexual exploitation material, while also contributing to the identification and investigation of offenders, and the prevention of child sexual abuse and child sexual exploitation offences.
2020/11/13
Committee: FEMM
Amendment 10 #
Proposal for a regulation
Recital 6
(6) Until 20 December 2020, the processing of personal data by providers of number-independent interpersonal communications services by means of voluntary measures for the purpose of detecting and reporting child sexual abuse online and removing child sexual abuse and child sexual exploitation material is governed by Regulation (EU) 2016/679.
2020/11/13
Committee: FEMM
Amendment 11 #
Proposal for a regulation
Recital 7
(7) Directive 2002/58/EC does not contain any specific provisions concerning the processing of personal and other data in connection with the provision of electronic communication services for the purpose of detecting and reporting child sexual abuse online and removing child sexual abuse material. However, pursuant to Article 15(1) of Directive 2002/58/EC, Member States may adopt legislative measures to restrict the scope of the rights and obligations provided for in, inter alia, Articles 5 and 6 of that Directive, which concern confidentiality of communications and traffic data, for the purpose of prevention, investigation, detection and prosecution of criminal offences linked to child sexual abuse. In the absence of such national legislative measures, and pending the adoption of a new longer-term legal framework to tackle child sexual abuse effectively at Union level as announced in the Strategy, there would be no legal basis for providers of number-independent interpersonal communications services to continue to detect and report child sexual abuse online and to detect, remove and report child sexual abuse material in their services beyond 21 December 2020.
2020/11/13
Committee: FEMM
Amendment 12 #
Proposal for a regulation
Recital 8
(8) This Regulation therefore provides for a temporary derogation from Article 5(1) and Article 6 of Directive 2002/58/EC, which protect the confidentiality of communications and traffic data. Voluntary measures by providers offering number-independent interpersonal communications services in the Union applied for the sole purpose of detecting and reporting child sexual abuse online and detecting, removing and reporting child sexual abuse material therefore become subject to the safeguards and conditions set out in this Regulation. Since Directive 2002/58/EC was adopted on the basis of Article 114 of the Treaty on the Functioning of the European Union, it is appropriate to adopt this Regulation on the same legal basis. Moreover, not all Member States have adopted legislative measures at national level to restrict the scope of the rights and obligations provided for in those provisions in accordance with Article 15(1) of Directive 2002/58/EC, and the adoption of such measures involves a significant risk of fragmentation likely to negatively affect the internal market and the protection of fundamental rights, notably the rights of children who fall victim to child sexual abuse online across the Union.
2020/11/13
Committee: FEMM
Amendment 13 #
Proposal for a regulation
Recital 11
(11) Since the sole objective of this Regulation is to enable the continuation of certain existing activities aimed at combating child sexual abuse online, the derogation provided for by this Regulation should be limited to well-established technology that is regularly used by number-independent interpersonal communications services for the purpose of detecting and reporting child sexual abuse online and removing child sexual abuse material before the entry into force of this Regulation. The reference to the technology includes where necessary any human review directly relating to the use of the technology and overseeing it. The use of the technology in question should therefore be common in the industry, without it necessarily being required that all providers use the technology and without precluding the further evolution of the technology in a privacy-friendly manner. In this respect, it should be immaterial whether or not a particular provider that seeks to rely on this derogation itself already uses such technology on the date of entry into force of this Regulation. The types of technologies deployed should be the least privacy-intrusive in accordance with the state of the art in the industry and should not include systematic filtering and scanning of communications containing text, but only look into specific communications in case of concrete elements of suspicion of child sexual abuse. The technologies deployed must not be able to understand the content of the communications but solely be able to detect patterns of possible child sexual abuse.
2020/11/13
Committee: FEMM
Amendment 14 #
Proposal for a regulation
Recital 11
(11) Since the sole objective of this Regulation is to enable the continuation of certain existing activities aimed at combating child sexual abuse online, the derogation provided for by this Regulation should be limited to well-established technology that is regularly used by number-independent interpersonal communications services for the purpose of detecting and reporting child sexual abuse online and removing child sexual abuse material before the entry into force of this Regulation. The reference to the technology includes where necessary any human review directly relating to the use of the technology and overseeing it. The use of the technology in question should therefore be common in the industry, without it necessarily being required that all providers use the technology and without precluding the further evolution of the technology in a privacy-friendly manner. In this respect, it should be immaterial whether or not a particular provider that seeks to rely on this derogation itself already uses such technology on the date of entry into force of this Regulation. The types of technologies deployed should be the least privacy-intrusive in accordance with the state of the art in the industry and should not include systematic filtering and scanning of communications containing text but only look into specific communications in case of concrete elements of suspicion of child sexual abuse.
2020/11/13
Committee: FEMM
Amendment 16 #
Proposal for a regulation
Recital 14
(14) In order to ensure transparency and accountability in respect of the activities undertaken pursuant to the derogation, the providers should publish reports on an annual basis on the processing falling within the scope of this Regulation, including on the type and volumes of data processed, the number of cases of child sexual abuse identified, with gender-disaggregated data when possible, measures applied to select and improve key indicators, the numbers and ratios of errors (false positives) of the different technologies deployed, measures applied to limit the error rate and the error rate achieved, the retention policy and the data protection safeguards applied.
2020/11/13
Committee: FEMM
Amendment 17 #
Proposal for a regulation
Article 1 – paragraph 1
This Regulation lays down temporary and strictly limited rules derogating from certain obligations laid down in Directive 2002/58/EC, with the sole objective of enabling providers of number-independent interpersonal communications services to use technologies for the processing of personal and other data, to the extent necessary and proportionate, to detect and report child sexual abuse online and to detect, report and remove child sexual abuse material on their services.
2020/11/13
Committee: FEMM
Amendment 18 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point a
(a) material constituting child pornography as defined in Article 2, point (c), of Directive 2011/93/EU of the European Parliament and of the Council;
deleted
2020/11/13
Committee: FEMM
Amendment 20 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point a a (new)
(a a) ‘solicitation’ as:
(i) the proposal by an adult to meet a child who has not reached the age of sexual consent, for the purpose of committing any of the offences referred to in Article 3(4) and Article 5(6) of Directive 2011/93/EU;
(ii) an attempt to commit the offences provided for in Article 5(2) and (3) by an adult soliciting a child who has not reached the age of sexual consent to provide child pornography depicting that child.
2020/11/13
Committee: FEMM
Amendment 21 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point b
(b) solicitation of children for the purpose of engaging in sexual activities with a child or of producing child pornography by any of the following:
(i) luring the child by means of offering gifts or other advantages;
(ii) threatening the child with a negative consequence likely to have a significant impact on the child;
(iii) presenting the child with pornographic materials or making them available to the child.
deleted
2020/11/13
Committee: FEMM
Amendment 22 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point c
(c) ‘pornographic performance’ as defined in Article 2(e) of Directive 2011/93/EU, including revenge porn.
2020/11/13
Committee: FEMM
Amendment 23 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 – point c a (new)
(c a) ‘sex extortion’
2020/11/13
Committee: FEMM
Amendment 24 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 a (new)
(2 a) ‘child’ means any person below the age of sexual consent;
2020/11/13
Committee: FEMM
Amendment 25 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2 b (new)
(2 b) ‘child sexual abuse material’ means:
(a) material constituting child pornography as defined in Article 2, point (c), of Directive 2011/93/EU of the European Parliament and of the Council;
(b) material constituting ‘child prostitution’ as defined in Article 2, point (d), of Directive 2011/93/EU of the European Parliament and of the Council.
2020/11/13
Committee: FEMM
Amendment 26 #
Proposal for a regulation
Article 3 – paragraph 1 – introductory part
The specific obligations set out in Article 5(1) and Article 6 of Directive 2002/58/EC shall not apply to the processing of personal and other data in connection with the provision of number-independent interpersonal communications services strictly necessary for the use of technology for the sole purpose of detecting and removing child sexual abuse material, detecting child sexual abuse online, and reporting both to law enforcement authorities and to organisations acting in the public interest against child sexual abuse, provided that:
2020/11/13
Committee: FEMM
Amendment 27 #
Proposal for a regulation
Article 3 – paragraph 1 – point a
(a) the processing is proportionate and limited to well-established technologies regularly used by providers of number-independent interpersonal communications services for that purpose before the entry into force of this Regulation, and that are in accordance with the state of the art used in the industry and are the least privacy-intrusive;
2020/11/13
Committee: FEMM
Amendment 28 #
Proposal for a regulation
Article 3 – paragraph 1 – point d
(d) the processing is limited to what is strictly necessary for the purpose of detection and reporting of child sexual abuse online and detection, reporting and removal of child sexual abuse material. Where no child sexual abuse online has been detected and confirmed as such, the relevant data shall be erased immediately; otherwise, the relevant data shall be retained solely for the following purpose and only for the time period necessary:
2020/11/13
Committee: FEMM
Amendment 29 #
Proposal for a regulation
Article 3 – paragraph 1 – point d – indent 1 (new)
- for its reporting and to respond to proportionate requests by law enforcement and other relevant public authorities;
2020/11/13
Committee: FEMM
Amendment 30 #
Proposal for a regulation
Article 3 – paragraph 1 – point d – indent 2 (new)
- for the blocking of the concerned user’s account;
2020/11/13
Committee: FEMM
Amendment 31 #
Proposal for a regulation
Article 3 – paragraph 1 – point d – indent 3 (new)
- in relation to data reliably identified as child pornography, for the creation of a unique, non-reconvertible digital signature (‘hash’);
2020/11/13
Committee: FEMM
Amendment 32 #
Proposal for a regulation
Article 3 – paragraph 1 – point d – indent 4 (new)
- for proceedings of administrative or judicial review or remedy.
2020/11/13
Committee: FEMM
Amendment 33 #
Proposal for a regulation
Article 3 – paragraph 1 – point e
(e) the provider annually publishes a report on its related processing, including on the type and volumes of data processed, the number of cases of child sexual abuse and child sexual abuse material identified, reported and removed, showing gender-disaggregated data when possible, measures applied to select and improve key indicators, numbers and ratios of errors (false positives) of the different technologies deployed, measures applied to limit the error rate and the error rate achieved, the retention policy and the data protection safeguards applied.
2020/11/13
Committee: FEMM
Amendment 34 #
Proposal for a regulation
Article 3 – paragraph 2
As regards point (d), where child sexual abuse online has been detected and confirmed as such, the relevant data may be retained solely for the following purposes and only for the time period necessary:
— for its reporting and to respond to proportionate requests by law enforcement and other relevant public authorities;
— for the blocking of the concerned user’s account;
— in relation to data reliably identified as child pornography, for the creation of a unique, non-reconvertible digital signature (‘hash’).
deleted
2020/11/13
Committee: FEMM