
Activities of Paul TANG related to 2022/0155(COD)

Shadow reports (1)

REPORT on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
2023/11/16
Committee: LIBE
Dossiers: 2022/0155(COD)
Documents: PDF(1 MB) DOC(555 KB)
Authors: Javier ZARZALEJOS (MEP ID 197606)

Amendments (206)

Amendment 337 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
2023/07/28
Committee: LIBE
Amendment 357 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection warrants. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in number-independent publicly available interpersonal communications services, it should only be possible to address detection warrants to providers of such services.
2023/07/28
Committee: LIBE
Amendment 358 #
Proposal for a regulation
Recital 20 a (new)
(20a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be prohibiting or weakening end-to-end encryption or be interpreted in that way.
2023/07/28
Committee: LIBE
Amendment 359 #
Proposal for a regulation
Recital 20 b (new)
(20b) The use of end-to-end encryption should be promoted and, where necessary, be mandatory in accordance with the principles of security and privacy by design. Member States should not impose any obligation on encryption providers, on providers of electronic communications services or on any other organisations, at any level of the supply chain, that would result in the weakening of the security of their networks and services, such as the creation or facilitation of backdoors or any other functionality allowing disclosure of communications content to third parties.
2023/07/28
Committee: LIBE
Amendment 360 #
Proposal for a regulation
Recital 20 c (new)
(20c) The act of breaking encryption refers to the act of defeating or bypassing the encryption protocol used to secure a communication. Any access by any third party that was not meant to access, read or edit the content of that communication that was supposed to be private and secure should be considered as undermining encryption.
2023/07/28
Committee: LIBE
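
Recitals 20a to 20c above describe end-to-end encryption in functional terms: only the communicating endpoints can read the content, and any third-party access amounts to undermining the encryption. The sketch below is an editorial illustration only, not part of any amendment text; it assumes the PyNaCl library, and the key names and message are purely hypothetical.

```python
# Illustrative sketch only: a minimal end-to-end encrypted exchange using
# PyNaCl (pip install pynacl). Only the two endpoints hold private keys, so
# no intermediary relaying the ciphertext can read the message, which is the
# property Recitals 20a-20c describe.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sender_box = Box(alice_private, bob_private.public_key)
ciphertext = sender_box.encrypt(b"hello, this stays between us")

# The relaying service only ever sees ciphertext; decrypting it requires
# Bob's private key, so access by any other party would mean the encryption
# had been bypassed ("broken") in the sense of Recital 20c.
receiver_box = Box(bob_private, alice_private.public_key)
assert receiver_box.decrypt(ciphertext) == b"hello, this stays between us"
```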
Amendment 361 #
Proposal for a regulation
Recital 20 d (new)
(20d) The technologies used for the purpose of executing detection warrants should be in accordance with the state of the art in the industry and be the least privacy-intrusive, including with regard to the principle of data protection by design and by default pursuant to Regulation (EU) 2016/679.
2023/07/28
Committee: LIBE
Amendment 364 #
Proposal for a regulation
Recital 21
(21) Furthermore, as parts of those limits and safeguards, detection warrants should only be issued by a judicial authority and only with the purpose to detect known online child sexual abuse material related to a specific device or user account, where there is a reasonable suspicion such content is stored on that device or in that user account. One of the main elements to be taken into account in this regard is the existence of evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 368 #
Proposal for a regulation
Recital 22
(22) However, the existence of evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse should in itself be insufficient to justify the issuance of a detection warrant, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection warrants can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority having objectively and diligently assessed, identified and weighted, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
2023/07/28
Committee: LIBE
Amendment 495 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in order to contribute to the proper functioning of the internal market and to create a safe, predictable and trusted online environment where fundamental rights enshrined in the Charter are effectively protected.
2023/07/28
Committee: LIBE
Amendment 502 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of number-independent interpersonal communication services to detect and report online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 508 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point c
(c) obligations on providers of hosting services to remove or disable access to child sexual abuse material on their services;
2023/07/28
Committee: LIBE
Amendment 511 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d
(d) obligations on providers of internet access services to disable access to child sexual abuse material; [deleted]
2023/07/28
Committee: LIBE
Amendment 517 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point e
(e) rules on the implementation and enforcement of this Regulation, including as regards the designation and functioning of the competent authorities of the Member States, the EU Centre on Child Sexual Abuse established in Article 40 (‘EU Centre’) and cooperation and transparency;
2023/07/28
Committee: LIBE
Amendment 518 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point e a (new)
(ea) rules on the designation, functioning, cooperation, transparency and powers of the EU Centre on Child Sexual Abuse established in Article 40 (‘EU Centre’);
2023/07/28
Committee: LIBE
Amendment 524 #
Proposal for a regulation
Article 1 – paragraph 3 – point d
(d) Regulation (EU) 2016/679, Directive 2016/680, Regulation (EU) 2018/1725, and, subject to paragraph 4 of this Article, Directive 2002/58/EC.
2023/07/28
Committee: LIBE
Amendment 529 #
Proposal for a regulation
Article 1 – paragraph 3 – point d a (new)
(da) Regulation (EU) …/… [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)];
2023/07/28
Committee: LIBE
Amendment 532 #
Proposal for a regulation
Article 1 – paragraph 3 a (new)
3a. This Regulation shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
2023/07/28
Committee: LIBE
Amendment 534 #
Proposal for a regulation
Article 1 – paragraph 3 b (new)
3b. This Regulation shall not undermine the prohibition of general monitoring under Union law or introduce general data retention obligations, or be interpreted in that way.
2023/07/28
Committee: LIBE
Amendment 539 #
Proposal for a regulation
Article 1 – paragraph 4
4. This Regulation limits the exercise of the rights and obligations provided for in Article 5(1) and (3) and Article 6(1) of Directive 2002/58/EC with the sole objective of enabling a provider of hosting services, a provider of number-independent interpersonal communications services or a provider of an artificial intelligence system to use specific technologies for the processing of personal data to the extent strictly necessary to detect and report online child sexual abuse and remove child sexual abuse material on their services, following a detection warrant issued in accordance with Section 2 of Chapter 1 of this Regulation.
2023/07/28
Committee: LIBE
Amendment 542 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
4a. This Regulation does not apply to audio communications.
2023/07/28
Committee: LIBE
Amendment 549 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(ba) ‘number-independent interpersonal communications service’ means a publicly available service as defined in Article 2, point 7, of Directive (EU) 2018/1972;
2023/07/28
Committee: LIBE
Amendment 554 #
Proposal for a regulation
Article 2 – paragraph 1 – point e a (new)
(ea) ‘artificial intelligence system’ means software as defined in Article 3(1) of Regulation (EU) …/… [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)];
2023/07/28
Committee: LIBE
Amendment 560 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point ii
(ii) a number-independent interpersonal communications service;
2023/07/28
Committee: LIBE
Amendment 563 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iii
(iii) a software applications store; [deleted]
2023/07/28
Committee: LIBE
Amendment 568 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv a (new)
(iva) an artificial intelligence system;
2023/07/28
Committee: LIBE
Amendment 577 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 17 years; [deleted]
2023/07/28
Committee: LIBE
Amendment 595 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘child survivor’ means a person as defined in Article 2(1) point (a) of Directive 2011/93/EU who is below 18 years of age and suffered child sexual abuse offences;
2023/07/28
Committee: LIBE
Amendment 597 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) 'survivor' means a person as defined in Article 2(1) point (a) of Directive 2011/93/EU who suffered child sexual abuse offences;
2023/07/28
Committee: LIBE
Amendment 599 #
Proposal for a regulation
Article 2 – paragraph 1 – point s
(s) ‘content data’ means videos and images in a digital format;
2023/07/28
Committee: LIBE
Amendment 605 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(wa) ‘hotline’ means an organisation officially recognised by a Member State, other than the reporting channels provided by law enforcement authorities, for receiving anonymous complaints from victims and the public about alleged child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 608 #
Proposal for a regulation
Article -3 (new)
Article -3 Protection of fundamental human rights and confidentiality in communications 1. Nothing in this Regulation shall prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption or be interpreted in that way. 2. Nothing in this Regulation shall undermine the prohibition of general monitoring under Union law or introduce general data retention obligations.
2023/07/28
Committee: LIBE
Amendment 610 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess any serious systemic risk stemming from the functioning and use of their services for the purpose of online child sexual abuse. That risk assessment shall be specific to the services they offer and proportionate to the serious systemic risk considering its severity and probability. To this end, providers subject to an obligation to conduct a risk assessment under Regulation (EU) 2022/2065 may draw on that risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 618 #
Proposal for a regulation
Article 3 – paragraph 1 a (new)
1a. Without prejudice to Regulation (EU) 2022/2065, when conducting the risk assessment, providers of hosting services and providers of interpersonal communications services shall respect and avoid any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity, respect for private and family life, the protection of personal data, freedom of expression and information, including the freedom and pluralism of the media, the prohibition of discrimination, the rights of the child and consumer protection, as enshrined in Articles 1, 7, 8, 11, 21, 24 and 38 of the Charter respectively.
2023/07/28
Committee: LIBE
Amendment 622 #
Proposal for a regulation
Article 3 – paragraph 2 – point a
(a) any serious systemic risks and identified instances of use of its services for the purpose of online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 636 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
– functionalities enabling age verification; [deleted]
2023/07/28
Committee: LIBE
Amendment 645 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily recognisable, accessible, age-appropriate and child- and user-friendly, including anonymous reporting channels;
2023/07/28
Committee: LIBE
Amendment 649 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- systems and mechanisms that provide child- and user-friendly resources to ensure that children can seek help swiftly, including information on how to contact national child protection organisations or national law enforcement.
2023/07/28
Committee: LIBE
Amendment 661 #
Proposal for a regulation
Article 3 – paragraph 2 – point d
(d) the manner in which the provider designed and operates the service, including the business model, governance, type of users targeted, and relevant systems and processes, and the impact thereof on that risk;
2023/07/28
Committee: LIBE
Amendment 665 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point i
(i) the extent to which the service is directly targeting children;
2023/07/28
Committee: LIBE
Amendment 670 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point ii
(ii) where the service is directly targeting children, the different age groups of the children the service is targeting;
2023/07/28
Committee: LIBE
Amendment 674 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – introductory part
(iii) the availability of functionalities creating or reinforcing the serious systemic risk of solicitation of children, including the following functionalities:
2023/07/28
Committee: LIBE
Amendment 676 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 1
– enabling users to search for other users and, in particular, for adult users to search for child users, in particular on services directly targeting children;
2023/07/28
Committee: LIBE
Amendment 679 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 2
– enabling users to establish unsolicited contact with other users directly and for users to engage and connect with children, in particular on services directly targeting children;
2023/07/28
Committee: LIBE
Amendment 686 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 3
– enabling users to share images or videos with other users, in particular on services directly targeting children;
2023/07/28
Committee: LIBE
Amendment 694 #
Proposal for a regulation
Article 3 – paragraph 2 a (new)
2a. When providers of hosting services and providers of interpersonal communication services put forward age assurance or age verification systems as a mitigation measure, they shall meet the following criteria: a) Protect the privacy of users and do not disclose data gathered for the purposes of age assurance for any other purpose; b) Do not collect data that is not necessary for the purpose of age assurance; c) Be proportionate to the risks associated with the product or service that presents a risk of misuse for child sexual abuse; d) Provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
2023/07/28
Committee: LIBE
Amendment 698 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 1 a (new)
Neither this request nor its subsequent analysis that the EU Centre may perform shall exempt the provider from its obligation to conduct the risk assessment in accordance with paragraphs 1 and 2 of this Article and to comply with other obligations set out in this Regulation.
2023/07/28
Committee: LIBE
Amendment 714 #
Proposal for a regulation
Article 3 – paragraph 5
5. The risk assessment shall include an assessment of any potential remaining risk that, after taking the mitigation measures pursuant to Article 4, the service is used for the purpose of online child sexual abuse. [deleted]
2023/07/28
Committee: LIBE
Amendment 718 #
Proposal for a regulation
Article 3 – paragraph 6
6. The Commission, in cooperation with Coordinating Authorities, the European Data Protection Board, the Fundamental Rights Agency and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
2023/07/28
Committee: LIBE
Amendment 724 #
Proposal for a regulation
Article 4 – title
4 Safety-by-design and risk mitigation
2023/07/28
Committee: LIBE
Amendment 726 #
Proposal for a regulation
Article 4 – paragraph -1 (new)
-1. Providers of hosting services and providers of interpersonal communications services shall have mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be online child sexual abuse. This obligation shall not be interpreted as an obligation of general monitoring or generalised data retention. Such mechanisms shall be easy to access, child-friendly, and shall allow for the submission of notices by electronic means. [By 6 months after entry into force] the Commission shall adopt a delegated act laying down design requirements for a uniform identifiable notification mechanism as referred to in this Article, including on the design of a uniform, easily recognisable, icon in the user interface. Providers of hosting services and providers of interpersonal communications services targeting children may implement the design requirements specified in the delegated act referred to in this paragraph.
2023/07/28
Committee: LIBE
Amendment 731 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall put in place reasonable, proportionate and targeted mitigation measures, tailored to their services and the serious systemic risk identified pursuant to Article 3, with the aim of mitigating that risk. Such measures shall never entail a general monitoring obligation or generalised data retention obligation and shall include some or all of the following:
2023/07/28
Committee: LIBE
Amendment 736 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) testing and adapting, through state of the art appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision- making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions, including the speed and quality of processing notices and reports related to online child sexual abuse and, where appropriate, the expeditious removal of the content notified;
2023/07/28
Committee: LIBE
Amendment 739 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) adapting the design, features and functions of their services in order to ensure a high level of privacy, data protection, safety, and security by design and by default, including some or all of the following: (a) limiting users, by default, to establish direct contact with other users, in particular through private communications; (b) limiting users, by default, to directly share images or videos on services; (c) limiting users, by default, to directly share personal contact details with other users, such as phone numbers, home addresses and e-mail addresses, via rules-based matching; (d) limiting users, by default, to create screenshots or recordings within the service; (e) limiting users, by default, to directly re-forward images and videos to other users where no consent has been given; (f) allowing parents of a child or a legal representative of a child to make use of meaningful parental control tools, which protect the confidentiality of communications of the child; (g) encouraging children, prior to registering for the service, to talk to their parents about how the service works and what parental control tools are available. Services taking the measures outlined in this point may allow users to revert such measures on an individual level.
2023/07/28
Committee: LIBE
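
Point (c) of Amendment 739 above refers to limiting the sharing of personal contact details "via rules-based matching". As an editorial illustration only, and not a mechanism prescribed by the Regulation or the amendment, a provider could apply simple pattern rules of the following kind before content is shared by default; the rule set and helper name are hypothetical.

```python
import re

# Hypothetical rule set: crude patterns for e-mail addresses and phone numbers.
# A real deployment would need far more robust rules and a review step.
CONTACT_RULES = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b(?:\+?\d[\s-]?){7,15}\b"),
}

def contains_contact_details(message: str) -> bool:
    """Return True if any rule matches, i.e. the message appears to share
    personal contact details and would be held back by default."""
    return any(rule.search(message) for rule in CONTACT_RULES.values())

assert contains_contact_details("write me at jane.doe@example.org")
assert not contains_contact_details("see you at the match tomorrow")
```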
Amendment 763 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of relevant information society services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].
2023/07/28
Committee: LIBE
Amendment 767 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) reinforcing awareness-raising measures and adapting their online interface for increased user information, including child-appropriate information targeted to the risk identified;
2023/07/28
Committee: LIBE
Amendment 772 #
Proposal for a regulation
Article 4 – paragraph 1 – point c b (new)
(cb) including clearly visible and identifiable information on the minimum age for using the service;
2023/07/28
Committee: LIBE
Amendment 773 #
Proposal for a regulation
Article 4 – paragraph 1 – point c c (new)
(cc) initiating targeted measures to protect the rights of the child and tools aimed at helping users to indicate child sexual abuse material and helping children to signal abuse or obtain support;
2023/07/28
Committee: LIBE
Amendment 777 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
1a. Providers of hosting services and providers of interpersonal communications services directly targeting children shall implement the design requirements as specified in the delegated act referred to in paragraph -1 and shall take all mitigation measures as outlined in paragraph 1, point (aa), of this Article to minimise this risk. Such services shall allow users to revert mitigation measures on an individual level.
2023/07/28
Committee: LIBE
Amendment 784 #
Proposal for a regulation
Article 4 – paragraph 2 – point a
(a) effective in mitigating the identified serious systemic risk;
2023/07/28
Committee: LIBE
Amendment 788 #
Proposal for a regulation
Article 4 – paragraph 2 – point b
(b) targeted and proportionate in relation to that serious systemic risk, taking into account, in particular, the seriousness of the risk as well as the provider’s financial and technological limitations and the number of users;
2023/07/28
Committee: LIBE
Amendment 792 #
Proposal for a regulation
Article 4 – paragraph 2 – point c
(c) applied in a diligent and non-discriminatory manner, having due regard, in all circumstances, to the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected, in particular the rights to privacy, protection of data and freedom of expression.
2023/07/28
Committee: LIBE
Amendment 805 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably differentiate between child and adult users on their services, enabling them to take the mitigation measures. Age assurance or age verification systems as mitigation measures shall be implemented only if they meet the criteria set in Article 3, paragraph 2a, of this Regulation.
2023/07/28
Committee: LIBE
Amendment 809 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the safety-by-design measures, including those mentioned in Article 4, paragraph 1a.
2023/07/28
Committee: LIBE
Amendment 819 #
Proposal for a regulation
Article 4 – paragraph 5
5. The Commission, in cooperation with Coordinating Authorities, the EU Centre, the European Data Protection Board and the Fundamental Rights Agency, and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used. The European Commission, along with the European Data Protection Board and the Fundamental Rights Agency, shall issue guidelines on how providers may implement age verification and age assessment measures, in particular based on selective disclosure of attributes, with full respect for the Charter of Fundamental Rights and Regulation (EU) 2016/679.
2023/07/28
Committee: LIBE
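
Amendment 819 above refers to age verification "based on selective disclosure of attributes", meaning the service learns only whether a user meets an age threshold rather than the underlying identity data. The sketch below is an editorial illustration under that assumption; the credential format, issuer name and signature check are hypothetical placeholders, not a mechanism set out in the Regulation or the amendment.

```python
from dataclasses import dataclass

# Hypothetical attribute credential: an issuer attests to a derived attribute
# only. The service never sees a birth date, name or other identity data.
@dataclass(frozen=True)
class AgeAttestation:
    is_over_18: bool          # the single disclosed attribute
    issuer: str               # who attested it
    signature: bytes          # placeholder for a cryptographic signature

TRUSTED_ISSUERS = {"example-eu-wallet"}  # hypothetical trust list

def verify_signature(attestation: AgeAttestation) -> bool:
    # Placeholder: a real system would verify a cryptographic signature here.
    return bool(attestation.signature)

def meets_age_threshold(attestation: AgeAttestation) -> bool:
    """Accept the user only if a trusted issuer attests the disclosed attribute."""
    return (attestation.issuer in TRUSTED_ISSUERS
            and verify_signature(attestation)
            and attestation.is_over_18)

proof = AgeAttestation(is_over_18=True, issuer="example-eu-wallet", signature=b"sig")
assert meets_age_threshold(proof)
```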
Amendment 823 #
Proposal for a regulation
Article 4 – paragraph 5 a (new)
5a. Prior to the deployment of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
2023/07/28
Committee: LIBE
Amendment 835 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) the process and the results of the risk assessment conducted or updated pursuant to Article 3, including the assessment of any potential remaining risk referred to in Article 3(5);
2023/07/28
Committee: LIBE
Amendment 848 #
Proposal for a regulation
Article 5 – paragraph 6
6. Providers shall, upon request, transmit the report to the providers of software application stores, insofar as necessary for the assessment referred to in Article 6(2). Where necessary, they may remove confidential information from the reports. [deleted]
2023/07/28
Committee: LIBE
Amendment 858 #
Proposal for a regulation
Article 6 – paragraph 1 – introductory part
1. Providers of software application stores considered as gatekeepers under the Digital Markets Act (EU) 2022/1925 shall:
2023/07/28
Committee: LIBE
Amendment 859 #
Proposal for a regulation
Article 6 – paragraph 1 – point a
(a) indicate, based on the information provided by the applications developers, if applications contain features that could pose a risk to children;
2023/07/28
Committee: LIBE
Amendment 863 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) indicate, based on the information provided by the applications developers, if measures have been taken by the application to mitigate risks for children, and which measures have been taken;
2023/07/28
Committee: LIBE
Amendment 867 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b). [deleted]
2023/07/28
Committee: LIBE
Amendment 869 #
Proposal for a regulation
Article 6 – paragraph 1 – point c a (new)
(ca) indicate, based on the information provided by the applications developers, the minimum age for using an application, as set out in the terms and conditions of the provider of the application;
2023/07/28
Committee: LIBE
Amendment 871 #
Proposal for a regulation
Article 6 – paragraph 2
2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3. [deleted]
2023/07/28
Committee: LIBE
Amendment 872 #
Proposal for a regulation
Article 6 – paragraph 3
3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures. [deleted]
2023/07/28
Committee: LIBE
Amendment 873 #
Proposal for a regulation
Article 6 – paragraph 4
4. The Commission, in cooperation with Coordinating Authorities, the EU Centre, the European Data Protection Board and the Fundamental Rights Agency, and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
2023/07/28
Committee: LIBE
Amendment 883 #
Proposal for a regulation
Article 7 – title
Issuance of detection warrants
2023/07/28
Committee: LIBE
Amendment 886 #
Proposal for a regulation
Article 7 – paragraph 1
1. A competent judicial authority may issue, following a request by the Coordinating Authority of the Member State that designated the judicial authority, a detection warrant requiring a provider of hosting services or a provider of number-independent interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse material related to specific terminal equipment or a specific user account, where there is a reasonable suspicion such content is stored on that terminal equipment or in that user account.
2023/07/28
Committee: LIBE
Amendment 904 #
Proposal for a regulation
Article 7 – paragraph 2 – subparagraph 2
To that end, it may, where appropriate, require the provider to submit the necessary information, additional to the report and the further information referred to in Article 5(1) and (3), respectively, within a reasonable time period set by that Coordinating Authority, or request the EU Centre, another public authority or relevant experts or entities to provide the necessary additional information.
2023/07/28
Committee: LIBE
Amendment 910 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point a
(a) establish a draft request to the competent judicial authority of the Member State that designated it for the issuance of a detection warrant, specifying the main elements of the content of the detection warrant it intends to request and the reasons for requesting it;
2023/07/28
Committee: LIBE
Amendment 913 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point b
(b) submit the draft request to the provider and the EU Centre; [deleted]
2023/07/28
Committee: LIBE
Amendment 917 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point c
(c) afford the provider an opportunity to comment on the draft request, within a reasonable time period set by that Coordinating Authority; [deleted]
2023/07/28
Committee: LIBE
Amendment 918 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point d
(d) invite the EU Centre to provide its opinion on the draft request, within a time period of four weeks from the date of receiving the draft request. [deleted]
2023/07/28
Committee: LIBE
Amendment 921 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point d a (new)
(da) request the supervisory authorities designated pursuant to Chapter VI, Section 1, of Regulation (EU) 2016/679 to perform their tasks within the competence pursuant to Chapter VI, Section 2, of Regulation (EU) 2016/679 and provide their opinion on the draft request, within a reasonable time period set by that Coordinating Authority;
2023/07/28
Committee: LIBE
Amendment 923 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2
Where, having regard to the comments of the provider and the opinion of the EU Centre, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall re-submit the draft request, adjusted where appropriate, to the provider. In that case, the provider shall do all of the following, within a reasonable time period set by that Coordinating Authority: (a) draft an implementation plan setting out the measures it envisages taking to execute the intended detection order, including detailed information regarding the envisaged technologies and safeguards; (b) where the draft implementation plan concerns an intended detection order concerning the solicitation of children other than the renewal of a previously issued detection order without any substantive changes, conduct a data protection impact assessment and a prior consultation procedure as referred to in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to the measures set out in the implementation plan; (c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the data protection impact assessment and in order to take into account the opinion of the data protection authority provided in response to the prior consultation; (d) submit to that Coordinating Authority the implementation plan, where applicable attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted in view of the outcome of the data protection impact assessment and of that opinion. [deleted]
2023/07/28
Committee: LIBE
Amendment 944 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 3
Where, having regard to the implementation plan of the provider and the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall submit the request for the issuance of the detection warrant, adjusted where appropriate, to the competent judicial authority or independent administrative authority. It shall attach the opinions of the EU Centre and the data protection authority to that request.
2023/07/28
Committee: LIBE
Amendment 959 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a
(a) there is substantive evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse, within the meaning of paragraphs 5, 6 and 7, as applicable;
2023/07/28
Committee: LIBE
Amendment 962 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a a (new)
(aa) the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter;
2023/07/28
Committee: LIBE
Amendment 968 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b a (new)
(ba) The detection warrant does not affect the security and confidentiality of communications on a general scale.
2023/07/28
Committee: LIBE
Amendment 970 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b b (new)
(bb) The technology used to protect the communication, such as any kind of encryption, shall not be affected or undermined by the detection warrant.
2023/07/28
Committee: LIBE
Amendment 977 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point -a (new)
(-a) the availability of information to adequately describe the specific purpose and scope of the order, including the legal basis for the suspicion;
2023/07/28
Committee: LIBE
Amendment 984 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point c
(c) the views, including on the technical feasibility, and the implementation plan of the provider submitted in accordance with paragraph 3;
2023/07/28
Committee: LIBE
Amendment 986 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point d
(d) the opinions of the data protection authority submitted in accordance with paragraph 3 and, where applicable, the opinion of the Coordinating Authority submitted in accordance with Article 5, paragraph 4b.
2023/07/28
Committee: LIBE
Amendment 991 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 3
Where that Coordinating Authority substantially deviates from the opinion of the data protection authorities, it shall inform the data protection authorities and the Commission thereof, specifying the points at which it deviated and the main reasons for the deviation.
2023/07/28
Committee: LIBE
Amendment 995 #
Proposal for a regulation
Article 7 – paragraph 5
5. As regards detection orders concerning the dissemination of known child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) it is likely, despite any mitigation measures that the provider may have taken or will take, that the service is used, to an appreciable extent for the dissemination of known child sexual abuse material; (b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent for the dissemination of known child sexual abuse material. [deleted]
2023/07/28
Committee: LIBE
Amendment 1004 #
Proposal for a regulation
Article 7 – paragraph 6
6. As regards detection orders concerning the dissemination of new child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the dissemination of new child sexual abuse material; (b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the dissemination of new child sexual abuse material; (c) for services other than those enabling the live transmission of pornographic performances as defined in Article 2, point (e), of Directive 2011/93/EU: (1) a detection order concerning the dissemination of known child sexual abuse material has been issued in respect of the service; (2) the provider submitted a significant number of reports concerning known child sexual abuse material, detected through the measures taken to execute the detection order referred to in point (1), pursuant to Article 12. [deleted]
2023/07/28
Committee: LIBE
Amendment 1010 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1
As regards detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) the provider qualifies as a provider of interpersonal communication services; (b) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the solicitation of children; (c) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the solicitation of children. [deleted]
2023/07/28
Committee: LIBE
Amendment 1013 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 2
The detection orders concerning the solicitation of children shall apply only to interpersonal communications where one of the users is a child user. [deleted]
2023/07/28
Committee: LIBE
Amendment 1019 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall in accordance with Article 8 of Regulation (EU) 2022/2065 target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary, justifiable and proportionate to effectively address the reasonable suspicion referred to in point (a) thereof.
2023/07/28
Committee: LIBE
Amendment 1024 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 2
To that aim, they shall take into account all relevant parameters, including the technical feasibility, in particular the risk of inaccurately identifying lawful speech as illegal content, as well as the impact of the measures on the rights of the users affected and on the security, integrity and confidentiality of their communications, and require the taking of the least intrusive measures, in accordance with Article 10, from among several equally effective measures. To this end, they shall ensure technologies are able to distinguish between known child abuse material and lawful speech accurately enough that no human intervention is needed.
2023/07/28
Committee: LIBE
Amendment 1028 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3
In particular, they shall ensure that: (a) where that risk is limited to an identifiable part or component of a service, the required measures are only applied in respect of that part or component; (b) where necessary, in particular to limit such negative consequences, effective and proportionate safeguards additional to those listed in Article 10(4), (5) and (6) are provided for; (c) subject to paragraph 9, the period of application remains limited to what is strictly necessary. [deleted]
2023/07/28
Committee: LIBE
Amendment 1043 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3
The period of application of detection warrants concerning the dissemination of known or new child sexual abuse material shall not exceed 24 months and that of detection orders concerning the solicitation of children shall not exceed 6 months.
2023/07/28
Committee: LIBE
Amendment 1046 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3 a (new)
The European Data Protection Board shall also issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the detection of child sexual abuse material in encrypted and non-encrypted environments. Supervisory authorities as referred to in that Regulation shall supervise the application of those guidelines. Prior to the use of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
2023/07/28
Committee: LIBE
Amendment 1047 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3 b (new)
The competent supervisory authorities designated pursuant to Chapter VI, Section 1, of Regulation (EU) 2016/679 shall have the right to challenge a detection warrant within the competence pursuant to Chapter VI, Section 2, of Regulation (EU) 2016/679 before the courts of the Member State of the competent judicial authority that issued the detection warrant.
2023/07/28
Committee: LIBE
Amendment 1056 #
Proposal for a regulation
Article 8 – title
Additional rules regarding detection warrants
2023/07/28
Committee: LIBE
Amendment 1061 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the detection warrants referred to in Article 7 using the template set out in Annex I. Detection warrants shall include:
2023/07/28
Committee: LIBE
Amendment 1066 #
Proposal for a regulation
Article 8 – paragraph 1 – point a a (new)
(aa) information, with respect to each device or user account, detailing the specific purpose and scope of the warrant, including the legal basis for the reasonable suspicion.
2023/07/28
Committee: LIBE
Amendment 1072 #
Proposal for a regulation
Article 8 – paragraph 1 – point e
(e) whether the detection order issued concerns the dissemination of known or new child sexual abuse material or the solicitation of children; [deleted]
2023/07/28
Committee: LIBE
Amendment 1077 #
Proposal for a regulation
Article 8 – paragraph 1 – point g
(g) a sufficiently detailed statement of justifications explaining why the detection warrant is issued and how it is necessary, effective and proportionate;
2023/07/28
Committee: LIBE
Amendment 1102 #
Proposal for a regulation
Article 9 – title
9 Redress, information, reporting and modification of detection warrants
2023/07/28
Committee: LIBE
Amendment 1105 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of hosting services and providers of number-independent interpersonal communications services that have received a detection warrant, as well as users affected by the measures taken to execute it, shall have a right to information and effective redress. That right shall include the right to challenge the detection warrant before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection order.
2023/07/28
Committee: LIBE
Amendment 1130 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of number-independent interpersonal communication services that have received a detection warrant shall execute it by installing and operating secure and privacy-friendly technologies, approved by the Centre, to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
2023/07/28
Committee: LIBE
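
Article 10(1) as amended above describes executing a detection warrant by matching content against "indicators" provided by the EU Centre. The sketch below is an editorial illustration only: it models the indicators as a set of hash digests and uses SHA-256 for simplicity, whereas deployed systems typically rely on perceptual hashing and other techniques not shown here; the indicator values are placeholders.

```python
import hashlib

# Hypothetical indicator list: hash digests of known material, as they might
# be distributed by the EU Centre. The values here are placeholders.
KNOWN_INDICATORS = {
    hashlib.sha256(b"placeholder-known-item-1").hexdigest(),
    hashlib.sha256(b"placeholder-known-item-2").hexdigest(),
}

def matches_known_indicator(content: bytes) -> bool:
    """Return True if the content's digest appears in the indicator set."""
    return hashlib.sha256(content).hexdigest() in KNOWN_INDICATORS

# Only exact copies of the listed items match; everything else passes untouched.
assert matches_known_indicator(b"placeholder-known-item-1")
assert not matches_known_indicator(b"an ordinary holiday photo")
```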
Amendment 1145 #
Proposal for a regulation
Article 10 – paragraph 3 – point a
(a) effective in detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
2023/07/28
Committee: LIBE
Amendment 1147 #
Proposal for a regulation
Article 10 – paragraph 3 – point b
(b) not be able to extract any other information from the relevant communications than the information strictly necessary to detect, using the indicators referred to in paragraph 1, patterns pointing to the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
2023/07/28
Committee: LIBE
Amendment 1149 #
Proposal for a regulation
Article 10 – paragraph 3 – point c
(c) in accordance with the state of the art in the industry and the least intrusive in terms of the impact on the users’ rights to private and family life, including the confidentiality of communication, and to protection of personal data. It shall not weaken or undermine end-to-end encryption and shall not limit providers of information society services from providing their services applying end-to-end encryption;
2023/07/28
Committee: LIBE
Amendment 1158 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) ensure that the interference with the fundamental right to privacy and the other rights laid down in the Charter is limited to what is strictly necessary.
2023/07/28
Committee: LIBE
Amendment 1169 #
Proposal for a regulation
Article 10 – paragraph 4 – point -a (new)
(-a) ensure privacy by design and safety-by-design and by default and, where applicable, the protection of encryption.
2023/07/28
Committee: LIBE
Amendment 1179 #
Proposal for a regulation
Article 10 – paragraph 4 – point c
(c) ensure continuous human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, immediate human intervention;
2023/07/28
Committee: LIBE
Amendment 1183 #
Proposal for a regulation
Article 10 – paragraph 4 – point d
(d) establish and operate an accessible, age-appropriate and user- and child-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
2023/07/28
Committee: LIBE
Amendment 1184 #
Proposal for a regulation
Article 10 – paragraph 4 – point e
(e) inform the Coordinating Authority and competent Data Protection Authority, at the latest one month before the start date specified in the detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3);
2023/07/28
Committee: LIBE
Amendment 1187 #
Proposal for a regulation
Article 10 – paragraph 4 – point e a (new)
(ea) request, in respect of any specific technology used for the purpose set out in this Article, that a prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a prior consultation procedure as referred to in Article 36 of that Regulation have been conducted;
2023/07/28
Committee: LIBE
Amendment 1190 #
Proposal for a regulation
Article 10 – paragraph 4 a (new)
4a. in respect of any specific technology used for the purpose set out in this Article, conduct a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation;
2023/07/28
Committee: LIBE
Amendment 1192 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point a
(a) the fact that it operates technologies to detect known child sexual abuse material to execute the detection warrant, the ways in which it operates those technologies and the impact on the users’ fundamental rights to private and family life, including the confidentiality of users’ communications and the protection of personal data;
2023/07/28
Committee: LIBE
Amendment 1212 #
Proposal for a regulation
Article 11 – paragraph 1
The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having consulted the European Data Protection Board and having conducted a public consultation, may issue guidelines on the application of Articles 7 to 10, having due regard in particular to relevant technological developments and the manners in which the services covered by those provisions are offered and used.
2023/07/28
Committee: LIBE
Amendment 1215 #
Proposal for a regulation
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of number- independent interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of any information indicating potentiallleged online child sexual abuse on its services, it shall promptly report, without delay, that abuse to the competent law enforcement and independent judicial authorities and submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
2023/07/28
Committee: LIBE
Amendment 1229 #
Proposal for a regulation
Article 12 – paragraph 3
3. The provider, the EU Centre, the competent authority or any judicial enforcement bodies shall, without undue delay, notify the individual or entity that have notified the alleged online child sexual abuse of their decision in respect of the information to which the notified content relates, providing information on the possibilities for redress in respect of that decision.
2023/07/28
Committee: LIBE
Amendment 1235 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
2023/07/28
Committee: LIBE
Amendment 1240 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) all content data, including images, videos and text;
2023/07/28
Committee: LIBE
Amendment 1243 #
Proposal for a regulation
Article 13 – paragraph 1 – point c a (new)
(ca) where applicable, an exact uniform resource locator and, where necessary, additional information for the identification of the child sexual abuse material;
2023/07/28
Committee: LIBE
Amendment 1246 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) all available data other than content data related to the potential online child sexual abuse;deleted
2023/07/28
Committee: LIBE
Amendment 1253 #
Proposal for a regulation
Article 13 – paragraph 1 – point f
(f) information concerning the geographic location related to the potential online child sexual abuse, such as the Internet Protocol address;deleted
2023/07/28
Committee: LIBE
Amendment 1265 #
Proposal for a regulation
Article 13 – paragraph 1 – point j a (new)
(ja) information on the tools used by the provider to become aware of the reported online child sexual abuse, including data and aggregate statistics on how technologies used by the provider work;
2023/07/28
Committee: LIBE
Amendment 1268 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States of one or more specific items of material that, after a diligent assessment, the Coordinating Authority or the courts or other independent administrative authorities referred to in Article 36(1) identified as constituting child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1273 #
Proposal for a regulation
Article 14 – paragraph 2
2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium enterprises, including open source providers, the removal order shall allow additional time, proportionate to the size and the resources of the provider.
2023/07/28
Committee: LIBE
Amendment 1286 #
Proposal for a regulation
Article 15 – paragraph 1
1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to an effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the removal order.
2023/07/28
Committee: LIBE
Amendment 1293 #
Proposal for a regulation
Chapter II – Section 5
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1297 #
Proposal for a regulation
Article 16
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1312 #
Proposal for a regulation
Article 17
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1321 #
Proposal for a regulation
Article 18
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1331 #
Proposal for a regulation
Article 19 – paragraph 1
Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocklabelling, or reporting online child sexual abuse in accordance with those requirements.
2023/07/28
Committee: LIBE
Amendment 1333 #
Proposal for a regulation
Article 20 – title
20 Victims’Survivors' right to information and support
2023/07/28
Committee: LIBE
Amendment 1337 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1
PersonsAny survivor, including child survivors and, after obtaining consent of the child, a parent of child survivors or their legal representative, residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where they reside, age-appropriate information regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12 and referral to support services. Persons with disabilities shall have the right to ask and receive such an information in a manner accessible to them.
2023/07/28
Committee: LIBE
Amendment 1342 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1 a (new)
The Coordinating Authority shall ensure that survivors, including child survivors and parents of child survivors, are informed about survivor support services where the survivors can receive age- appropriate and gender-sensitive information and support.
2023/07/28
Committee: LIBE
Amendment 1351 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
(da) information regarding age-appropriate and gender-sensitive survivor support services to provide the child, family and survivors with adequate emotional and psychosocial support as well as practical and legal assistance.
2023/07/28
Committee: LIBE
Amendment 1355 #
Proposal for a regulation
Article 21 – title
VictimSurvivors’ right of assistance and support for removal
2023/07/28
Committee: LIBE
Amendment 1359 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of hostingrelevant information society services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
2023/07/28
Committee: LIBE
Amendment 1367 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask and receive any information relating to such support in a manner accessible to them.
2023/07/28
Committee: LIBE
Amendment 1374 #
Proposal for a regulation
Article 21 a (new)
Article 21a
Right to lodge a complaint with a supervisory authority
1. Without prejudice to any other administrative or judicial remedy, every user shall have the right to lodge a complaint with a supervisory authority, in particular in the Member State of his or her habitual residence, place of work or place of the alleged infringement if the user considers that the processing of personal data relating to him or her infringes this Regulation or Regulation (EU) 2016/679.
2. The supervisory authority with which the complaint has been lodged shall inform the complainant on the progress and the outcome of the complaint including the possibility of a judicial remedy pursuant to Article 21b.
2023/07/28
Committee: LIBE
Amendment 1375 #
Proposal for a regulation
Article 21 b (new)
Article 21b
Right to an effective judicial remedy against a provider of hosting services or a provider of a number-independent interpersonal communications service
1. Without prejudice to any available administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority pursuant to Article 21a, each user shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation or Regulation (EU) 2016/679.
2. Proceedings against a provider of a hosting service or a provider of a number-independent interpersonal communications service shall be brought before the courts of the Member State where the provider has an establishment. Alternatively, such proceedings may be brought before the courts of the Member State where the user has his or her habitual residence.
2023/07/28
Committee: LIBE
Amendment 1377 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Providers of hosting services and providers of number-independent interpersonal communications services shall preserve the content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
2023/07/28
Committee: LIBE
Amendment 1384 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 2
As regards the first subparagraph, point (a), the provider may also preserve the information for the purpose of improving the effectiveness and accuracy of the technologies to detect online child sexual abuse for the execution of a detection order issued to it in accordance with Article 7. However, it shall not store any personal data for that purpose.deleted
2023/07/28
Committee: LIBE
Amendment 1402 #
Proposal for a regulation
Article 25 – paragraph 5
5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available and shall disseminate this information through gender-sensitive awareness-raising campaigns in public places frequented by children, and girls in particular, and shall communicate it to the EU Centre. They shall keep that information updated.
2023/07/28
Committee: LIBE
Amendment 1415 #
Proposal for a regulation
Article 25 – paragraph 7 – point d a (new)
(da) provide knowledge and expertise on appropriate prevention techniques tailored by age and gender against online solicitation of children and the dissemination of child sexual abuse material online.
2023/07/28
Committee: LIBE
Amendment 1417 #
Proposal for a regulation
Article 25 – paragraph 8 a (new)
8a. The EU Centre shall support Member States in designing preventive and gender-sensitive measures, such as awareness-raising campaigns to combat child sexual abuse, guaranteeing comprehensive sexuality and relationships education in all schools, introducing digital skills, literacy and online safety programmes in formal education, and ensuring the full availability of specialized support services tailored by gender and age for child survivors of sexual abuse and children in vulnerable situations.
2023/07/28
Committee: LIBE
Amendment 1418 #
Proposal for a regulation
Article 25 – paragraph 9 a (new)
9a. In its contact with survivors or in any decision affecting survivors, the Coordinating Authority shall operate in an age-appropriate and gender-sensitive way that minimises risks to survivors, especially children, addresses the harm suffered by survivors and meets their needs. It shall operate in a victim- and gender-sensitive manner which prioritises recognising and listening to the survivor, avoids secondary victimisation and retraumatisation, and systematically focuses on their safety, rights, well-being, expressed needs and choices, and ensures they are treated in an empathetic, sensitive and non-judgmental way.
2023/07/28
Committee: LIBE
Amendment 1478 #
Proposal for a regulation
Article 35 – paragraph 4 a (new)
4a. Member States shall ensure that penalties imposed for the infringement of this Regulation do not encourage the over-reporting or the removal of material which does not constitute child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1479 #
Proposal for a regulation
Article 35 a (new)
Article 35a
Compensation
Users and any body, organisation or association mandated to exercise the rights conferred by this Regulation on their behalf shall have the right to seek, in accordance with Union and national law, compensation from providers of relevant information society services for any damage or loss suffered due to an infringement by those providers of their obligations under this Regulation.
2023/07/28
Committee: LIBE
Amendment 1510 #
Proposal for a regulation
Article 38 – paragraph 1 – subparagraph 1
Coordinating Authorities shall share best practice standards and guidance on the detection and removal of child sexual abuse material and may participate in joint investigations, which may be coordinated with the support of the EU Centre, of matters covered by this Regulation, concerning providers of relevant information society services that offer their services in several Member States. Those joint investigations shall also take place on the dark web.
2023/07/28
Committee: LIBE
Amendment 1514 #
Proposal for a regulation
Article 38 – paragraph 2 a (new)
2a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of online child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to remove content and coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
2023/07/28
Committee: LIBE
Amendment 1525 #
Proposal for a regulation
Article 39 – paragraph 3 a (new)
3a. Where the EU Centre receives a report from a hotline, or where a provider that submitted the report to the EU Centre has indicated that the report is based on the information received from a hotline, the EU Centre shall coordinate with the relevant Coordinating Authorities in order to avoid duplicated reporting on the same material that has already been reported to the national law enforcement authorities by the hotlines, and monitor the removal of the child sexual abuse material or cooperate with the relevant hotline to track the status.
2023/07/28
Committee: LIBE
Amendment 1539 #
Proposal for a regulation
Article 42 – paragraph 1
The seat of the EU Centre shall be The Hague, The Netherlandchoice of the location of the seat of the Centre shall be made in accordance with the ordinary legislative procedure, based on the following criteria:
(a) it shall not affect the Centre’s execution of its tasks and powers, the organisation of its governance structure, the operation of its main organisation, or the main financing of its activities;
(b) it shall ensure that the Centre is able to recruit the highly qualified and specialised staff it requires to perform the tasks and exercise the powers provided by this Regulation;
(c) it shall ensure that it can be set up on site upon the entry into force of this Regulation;
(d) it shall ensure appropriate accessibility of the location, the existence of adequate education facilities for the children of staff members, appropriate access to the labour market, social security and medical care for both children and spouses;
(da) it shall ensure a balanced geographical distribution of EU institutions, bodies and agencies across the Union;
(db) it shall ensure its national Child Sexual Abuse framework is of a proven quality and repute, and shall benefit from the experience of national authorities;
(dc) it shall enable adequate training opportunities for combating child sexual abuse activities;
(dd) it shall enable close cooperation with EU institutions, bodies and agencies but it shall be independent of any of the aforementioned;
(de) it shall ensure sustainability and digital security and connectivity with regard to physical and IT infrastructure and working conditions.
2023/07/28
Committee: LIBE
Amendment 1553 #
Proposal for a regulation
Article 43 – paragraph 1 – point 2
(2) facilitate the detection process referred to in Section 2 of Chapter II, by: (a) providing the opinions on intended detection orders referred to in Article 7(3), first subparagraph, point (d); (b) maintaining and operating the databases of indicators referred to in Article 44; (c) giving providers of hosting services and providers of interpersonal communications services that received a detection order access to the relevant databases of indicators in accordance with Article 46; (d) making technologies available to providers for the execution of detection orders issued to them, in accordance with Article 50(1);deleted
2023/07/28
Committee: LIBE
Amendment 1572 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point a
(a) collecting, recording, analysing and providing gender and age specific information, providing analysis based on anonymised and non-personal data gathering, including gender and age disaggregated data, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51;
2023/07/28
Committee: LIBE
Amendment 1575 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b
(b) supporting the development and dissemination of research and expertise on those matters and on assistance to victimssurvivors, taking into account the gender dimension, including by serving as a hub of expertise to support evidence-based policy;
2023/07/28
Committee: LIBE
Amendment 1577 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) providing technical expertise and promoting the exchange of best practices among Member States on raising awareness for the prevention of child sexual abuse online in formal and non- formal education. Such efforts shall be age-appropriate and gender-sensitive;
2023/07/28
Committee: LIBE
Amendment 1582 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b b (new)
(bb) exchanging best practices among Coordinating Authorities regarding the available tools to reduce the risk of children becoming victims of sexual abuse and to provide specialized assistance to survivors, in an age-appropriate and gender-sensitive way.
2023/07/28
Committee: LIBE
Amendment 1585 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b c (new)
(bc) referring survivors to appropriate child protection services;
2023/07/28
Committee: LIBE
Amendment 1587 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c a (new)
(ca) in its engagement with survivors or in any decision affecting survivors, the EU Centre shall operate in a way that minimises risks to survivors, especially children, addresses the harm suffered by survivors and meets their needs in an age-appropriate, gender- and victim-sensitive manner.
2023/07/28
Committee: LIBE
Amendment 1588 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c b (new)
(cb) create and oversee an "EU hashing list of known child sexual abuse material" and modify the content of that list, independently, autonomously and free of political, government or industry influence or interference;
2023/07/28
Committee: LIBE
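Illustrative sketch (editorial note, not part of the amendment text): the "EU hashing list" envisaged above relies on the general technique of matching content against digests of material already identified as child sexual abuse material. The minimal Python sketch below shows that principle only; the function names and the use of a plain SHA-256 digest are assumptions made for illustration, and deployed systems typically use perceptual hashing that tolerates re-encoding rather than exact cryptographic hashes.

import hashlib

def sha256_of(content: bytes) -> str:
    # Hexadecimal SHA-256 digest of an item of content.
    return hashlib.sha256(content).hexdigest()

def matches_known_material(content: bytes, eu_hashing_list: set[str]) -> bool:
    # True if the content's digest appears in the (hypothetical) hashing list.
    return sha256_of(content) in eu_hashing_list

# Hypothetical usage by a provider checking publicly accessible uploads:
# known_hashes = load_hashes_from_eu_centre()   # assumed helper, not a real API
# if matches_known_material(upload, known_hashes):
#     flag_for_removal(upload)                  # assumed helper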
Amendment 1589 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c c (new)
(cc) develop, in accordance with the implementing act referred to in Article 43a, the European Centralised Helpline for Abuse of Teenagers (eCHAT), interconnecting the national hotlines' helplines via effective interoperability and allowing children to reach out 24/7, anonymously, in their own language and free of charge, via a recognisable central helpline;
2023/07/28
Committee: LIBE
Amendment 1590 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c d (new)
(cd) dispose of the resources needed to develop, where possible, open-source hashing technology tools for small and medium-sized relevant information society services to prevent the dissemination of known child sexual abuse material in publicly accessible content.
2023/07/28
Committee: LIBE
Amendment 1591 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c e (new)
(ce) coordinate the sharing and filtering of Suspicious Activity Reports on alleged "known child sexual abuse material", operating independently, autonomously, free of political, government or industry influence or interference and in full respect of fundamental rights, including privacy and data protection. [By 1 year after entry into force] the Commission shall adopt a delegated act laying down requirements for a Suspicious Activity Reports format, as referred to in this paragraph, and the differentiation between actionable and non-actionable Suspicious Activity Reports. This delegated act shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption or be interpreted in that way.
2023/07/28
Committee: LIBE
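Illustrative sketch (editorial note, not part of the amendment text): the format of Suspicious Activity Reports and the split between actionable and non-actionable reports are left to a future delegated act, so the structure below is purely a hypothetical illustration of the kind of fields and filtering step the amendment describes; every name in it is an assumption.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class SuspiciousActivityReport:
    report_id: str          # provider-assigned identifier (hypothetical field)
    service_name: str       # service on which the material was flagged
    content_hash: str       # digest of the flagged item, no content data
    reported_at: datetime   # time of detection or reporting
    actionable: bool        # the actionable / non-actionable differentiation

def split_actionable(reports: list[SuspiciousActivityReport]):
    # Separate actionable reports from non-actionable ones.
    actionable = [r for r in reports if r.actionable]
    non_actionable = [r for r in reports if not r.actionable]
    return actionable, non_actionable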
Amendment 1592 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c f (new)
(cf) scan public servers and public communications channels for known child sexual abuse material, with proven technology, solely for the purposes of amending the EU Hashing List and flagging the content for removal to the service provider of the specific public server or public communications channel, without prejudice to Art. -3. The European Data Protection Board shall issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the purpose of scanning.
2023/07/28
Committee: LIBE
Amendment 1597 #
Proposal for a regulation
Article 43 a (new)
Article 43a
Implementing act for the interconnection of helplines
1. The national helpline referred to in Article 43 shall be interconnected via the European Centralised Helpline for Abuse of Teenagers (eCHAT) to be developed and operated by the EU Centre by ... [two years after the date of entry into force of this Regulation]
2. The Commission shall be empowered to adopt, by means of implementing acts, technical specifications and procedures necessary to provide for the interconnection of national hotlines' online chat systems via eCHAT in accordance with Article 43 with regard to:
(a) the technical data necessary for the eCHAT system to perform its functions and the method of storage, use and protection of that technical data;
(b) the common criteria according to which national helplines shall be available through the system of interconnection of helplines;
(c) the technical details on how helplines shall be made available;
(d) the technical conditions of availability of services provided by the system of interconnection of helplines.
Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 5 of Regulation (EU) 182/2011.
3. When adopting the implementing acts referred to in paragraph 2, the Commission shall take into account proven technology and existing practices.
2023/07/28
Committee: LIBE
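Illustrative sketch (editorial note, not part of the amendment text): the implementing acts above would define how national helplines are interconnected behind the central eCHAT entry point. One conceivable, deliberately simplified shape of such routing is sketched below; the country codes, URLs and function are invented placeholders, not anything prescribed by the Regulation.

# Hypothetical routing of a chat request to the relevant national helpline.
HELPLINE_ENDPOINTS = {
    "NL": "https://example.org/helpline/nl",   # placeholder endpoints only
    "FR": "https://example.org/helpline/fr",
    "DE": "https://example.org/helpline/de",
}

def route_chat(country_code: str,
               fallback: str = "https://example.org/helpline/eu") -> str:
    # Return the national endpoint for the user's country, or a central fallback.
    return HELPLINE_ENDPOINTS.get(country_code.upper(), fallback)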
Amendment 1603 #
Proposal for a regulation
Article 44 – paragraph 1 – point b
(b) indicators to detect the dissemination of child sexual abuse material not previously detected and identified as constituting child sexual abuse material in accordance with Article 36(1);deleted
2023/07/28
Committee: LIBE
Amendment 1606 #
Proposal for a regulation
Article 44 – paragraph 1 – point c
(c) indicators to detect the solicitation of children.deleted
2023/07/28
Committee: LIBE
Amendment 1704 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 3
Before including specific technologies on those lists, the EU Centre shall request the opinion of its Technology Committee, the Experts Consultative Forum, and of the European Data Protection Board. The Technology Committee and the European Data Protection Board shall deliver their respective opinions within eight weeks. That period may be extended by a further six weeks where necessary, taking into account the complexity of the subject matter. The Technology Committee and the European Data Protection Board shall inform the EU Centre of any such extension within one month of receipt of the request for consultation, together with the reasons for the delay. Where the EU Centre substantially deviates from those opinions, it shall inform the Technology Committee or the European Data Protection Board and the Commission thereof, specifying the points at which it deviated and the main reasons for the deviation.
2023/07/28
Committee: LIBE
Amendment 1716 #
Proposal for a regulation
Article 50 – paragraph 5
5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. Communication campaigns shall be easily understandable and accessible to all children, their families and educators in formal and non-formal education in the Union, aiming to improve digital literacy and ensure a safe digital environment for children. Communication campaigns shall take into account the gender dimension of the crime.
2023/07/28
Committee: LIBE
Amendment 1742 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 1
Europol and the EU Centre shall provide each other with the fullest possible access to relevant information and information systems, where necessary for the performance of their respective tasks and in accordance with the acts of Union law regulating such access.deleted
2023/07/28
Committee: LIBE
Amendment 1745 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 2
Without prejudice to the responsibilities of the Executive Director, the EU Centre shall maximise efficiency by sharing administrative functions with Europol, including functions relating to personnel management, information technology (IT) and budget implementation.deleted
2023/07/28
Committee: LIBE
Amendment 1753 #
Proposal for a regulation
Article 53 – paragraph 3
3. The terms of cooperation and working arrangements shall be laid down in a publicly accessible memorandum of understanding.
2023/07/28
Committee: LIBE
Amendment 1762 #
Proposal for a regulation
Article 55 – paragraph 1 – introductory part
The administrative and management structure of the EU Centre shall be gender-balanced and comprise:
2023/07/28
Committee: LIBE
Amendment 1764 #
Proposal for a regulation
Article 55 – paragraph 1 – point d a (new)
(da) a Fundamental Rights Officer, who shall exercise the tasks set out in Article 66b;
2023/07/28
Committee: LIBE
Amendment 1765 #
Proposal for a regulation
Article 55 – paragraph 1 – point d b (new)
(db) an Expert's Consultative Forum, which shall exercise the tasks set out in Article 66a;
2023/07/28
Committee: LIBE
Amendment 1767 #
Proposal for a regulation
Article 56 – paragraph 1
1. The Management Board shall be gender-balanced and composed of one representative from each Member State and two representatives of the Commission, all as members with voting rights.
2023/07/28
Committee: LIBE
Amendment 1775 #
Proposal for a regulation
Article 56 – paragraph 4
4. Members of the Management Board and their alternates shall be appointed in the light of their knowledge in the field of combating child sexual abuse, taking into account relevant managerial, administrative and budgetary skills. Member States shall appoint a representative of their Coordinating Authority, within four months of [date of entry into force of this Regulation]. All parties represented in the Management Board shall make efforts to limit turnover of their representatives, in order to ensure continuity of its work. All parties shall aim to achieve a balanced representationensure that gender balance between men and women is achieved on the Management Board with at least 40% of candidates of each sex.
2023/07/28
Committee: LIBE
Amendment 1779 #
Proposal for a regulation
Article 57 – paragraph 1 – point f
(f) appoint the members of the Technology Committee, the Expert's Consultative Forum and of any other advisory group it may establish;
2023/07/28
Committee: LIBE
Amendment 1780 #
Proposal for a regulation
Article 57 – paragraph 1 – point f a (new)
(fa) appoint a Data Protection Officer;
2023/07/28
Committee: LIBE
Amendment 1781 #
Proposal for a regulation
Article 57 – paragraph 1 – point f b (new)
(fb) appoint a Fundamental Rights Officer;
2023/07/28
Committee: LIBE
Amendment 1786 #
Proposal for a regulation
Article 61 – paragraph 1 – subparagraph 1
The Executive Board shall be gender-balanced and composed of the Chairperson and the Deputy Chairperson of the Management Board, two other members appointed by the Management Board from among its members with the right to vote and two representatives of the Commission to the Management Board. The Chairperson of the Management Board shall also be the Chairperson of the Executive Board. The composition of the Executive Board shall take into consideration gender balance, with at least 40% of each sex.
2023/07/28
Committee: LIBE
Amendment 1806 #
Proposal for a regulation
Article 66 a (new)
Article 66a
Establishment and tasks of the Expert's Consultative Forum
1. The EU Centre shall establish a Consultative Forum to assist it by providing it with independent advice on survivor-related matters. The Consultative Forum shall act upon request of the Management Board or the Executive Director.
2. The Consultative Forum shall consist of a maximum of fifteen members. Members of the Consultative Forum shall, in an equal manner, be appointed from child survivors and parents of child survivors, as well as representatives of organizations acting in the public interest, including:
(a) organizations representing or promoting the rights of the LGBTQIA+ community, specifically minors;
(b) organizations representing or promoting children's rights;
(c) organizations representing or promoting child survivors' rights;
(d) organizations representing or promoting digital rights.
They shall be appointed by the Management Board following the publication of a call for expression of interest in the Official Journal of the European Union.
3. The mandate of members of the Consultative Forum shall be of four years. Those mandates shall be renewable once.
4. The Consultative Forum shall:
(a) provide the Management Board and the Executive Director with advice on matters related to survivors;
(b) provide the Management Board, the Executive Director and the Technology Committee with advice on preventive measures for relevant information society services;
(c) contribute to the EU Centre communication strategy referred to in Article 50(5);
(d) provide its opinion on the proportionality of technologies used to detect known child sexual abuse;
(e) maintain an open dialogue with the Management Board and the Executive Director on all matters related to survivors, particularly on the protection of survivors’ rights and digital rights.
2023/07/28
Committee: LIBE
Amendment 1807 #
Proposal for a regulation
Chapter IV – Section 5 – Part 3 a (new)
3a Part 3 a (new): Fundamental Rights Protection
Article 66b
Fundamental rights officer
1. A fundamental rights officer shall be appointed by the management board on the basis of a list of three candidates, after consultation with the Expert's Consultative Forum. The fundamental rights officer shall have the necessary qualifications, expert knowledge and professional experience in the field of fundamental rights.
2. The fundamental rights officer shall perform the following tasks:
(a) contributing to the Centre's fundamental rights strategy and the corresponding action plan, including by issuing recommendations for improving them;
(b) monitoring the Centre's compliance with fundamental rights, including by conducting investigations into any of its activities;
(c) promoting the Centre's respect of fundamental rights;
(d) advising the Centre where he or she deems it necessary or where requested on any activity of the Centre without delaying those activities;
(e) providing opinions on working arrangements;
(f) providing the secretariat of the consultative forum;
(g) informing the management board and executive director about possible violations of fundamental rights during activities of the Centre;
(h) performing any other tasks, where provided for by this Regulation.
3. The Management Board shall lay down special rules applicable to the fundamental rights officer in order to guarantee that the fundamental rights officer and his or her staff are independent in the performance of their duties. The fundamental rights officer shall report directly to the Management Board and shall cooperate with the Technology Committee. The management board shall ensure that action is taken with regard to recommendations of the fundamental rights officer. In addition, the fundamental rights officer shall publish annual reports on his or her activities and on the extent to which the activities of the Centre respect fundamental rights. Those reports shall include information on the complaints mechanism and the implementation of the fundamental rights strategy.
4. The Centre shall ensure that the fundamental rights officer is able to act autonomously and is able to be independent in the conduct of his or her duties. The fundamental rights officer shall have sufficient and adequate human and financial resources at his or her disposal necessary for the fulfilment of his or her tasks. The fundamental rights officer shall select his or her staff, and that staff shall only report to him or her.
5. The fundamental rights officer shall be assisted by a deputy fundamental rights officer. The deputy fundamental rights officer shall be appointed by the management board from a list of at least three candidates presented by the fundamental rights officer. The deputy fundamental rights officer shall have the necessary qualifications and experience in the field of fundamental rights and shall be independent in the conduct of his or her duties. If the fundamental rights officer is absent or indisposed, the deputy fundamental rights officer shall assume the fundamental rights officer's duties and responsibilities.
6. The fundamental rights officer shall have access to all information concerning respect for fundamental rights in all the activities of the Centre.
Article 66c
Complaints mechanism
1.
The Centre shall, in cooperation with the fundamental rights officer, take the necessary measures to set up and further develop an independent and effective complaints mechanism in accordance with this Article to monitor and ensure respect for fundamental rights in all the activities of the Centre. 2. Any person who is directly affected by the actions or failure to act on the part of staff involved in a joint operation, pilot project, or an operational activity of the Centre, and who considers himself or herself to have been the subject of a breach of his or her fundamental rights due to those actions or that failure to act, or any party representing such a person, may submit a complaint in writing to the Centre. 3. The fundamental rights officer shall be responsible for handling complaints received by the Centre in accordance with the right to good administration. For that purpose, the fundamental rights officer shall review the admissibility of a complaint, register admissible complaints, forward all registered complaints to the executive director and forward complaints concerning members of the teams to the relevant authority or body competent for fundamental rights for further action in accordance with their mandate. The fundamental rights officer shall also register and ensure the follow-up by the Centre or that authority or body. 4. In accordance with the right to good administration, if a complaint is admissible, complainants shall be informed that the complaint has been registered, that an assessment has been initiated and that a response may be expected as soon as it becomes available. If a complaint is forwarded to national authorities or bodies, the complainant shall be provided with their contact details. If a complaint is declared inadmissible, the complainant shall be informed of the reasons and, if possible, provided with further options for addressing their concerns. The Centre shall provide for an appropriate procedure in cases where a complaint is declared inadmissible or unfounded. Any decision shall be in written form and reasoned. The fundamental rights officer shall reassess the complaint if the complainant submits new evidence in situations where the complaint has been declared inadmissible or unfounded. 5. In the case of a registered complaint concerning a staff member of the Centre, the fundamental rights officer shall recommend appropriate follow-up, including disciplinary measures, to the executive director and, where appropriate, referral for the initiation of civil or criminal justice proceedings in accordance with this Regulation and national law. The executive director shall ensure the appropriate follow-up and shall report back to the fundamental rights officer within a determined timeframe and, if necessary, at regular intervals thereafter, as to the findings, the implementation of disciplinary measures, and follow-up by the Centre in response to a complaint. If a complaint is related to data protection issues, the executive director shall consult the data protection officer of the Centre before taking a decision on the complaint. The fundamental rights officer and the data protection officer shall establish, in writing, a memorandum of understanding specifying their division of tasks and cooperation as regards complaints received. 6. 
The fundamental rights officer shall include information on the complaints mechanism in his or her annual report, as referred to in Article 66a, including specific references to the Centre's findings and the follow-up to complaints. 7. The fundamental rights officer shall, in accordance with paragraphs 1 to 9 and after consulting the experts council, draw up a standardised complaint form requiring detailed and specific information concerning the alleged breach of fundamental rights. The fundamental rights officer shall also draw up any further detailed rules as necessary. The fundamental rights officer shall submit that form and such further detailed rules to the executive director and to the management board. The Centre shall ensure that information about the possibility and procedure for making a complaint is readily available, including for vulnerable persons. The standardised complaint form shall be made available on the Centre's website and in hardcopy during all activities of the Centre in languages that third-country nationals understand or are reasonably believed to understand. The standardised complaint form shall be easily accessible, including on mobile devices. The Centre shall ensure that further guidance and assistance on the complaints procedure is provided to complainants. Complaints shall be considered by the fundamental rights officer even when they have not been submitted in the standardised complaint form. 8. Any personal data contained in a complaint shall be handled and processed by the Centre, including the fundamental rights officer, in accordance with Regulation (EU) 2018/1725. Where a complainant submits a complaint, that complainant shall be understood to consent to the processing of his or her personal data by the Centre and the fundamental rights officer within the meaning of point (d) of Article 5(1) of Regulation (EU) 2018/1725. In order to safeguard the interests of the complainants, complaints shall be dealt with confidentially by the fundamental rights officer in accordance with national and Union law unless the complainant explicitly waives his or her right to confidentiality. When complainants waive their right to confidentiality, it shall be understood that they consent to the fundamental rights officer or the Centre disclosing their identity to the competent authorities or bodies in relation to the matter under complaint, where necessary.
2023/07/28
Committee: LIBE
Amendment 1877 #
Proposal for a regulation
Article 84 – paragraph 1
1. Each provider of relevant information society services shall draw up an annual report on its activities under this Regulation. That report shall compile the information referred to in Article 83(1). The providers shall, by 31 January of every year subsequent to the year to which the report relates, make the report available to the public in a machine-readable format and communicate it to the Coordinating Authority of establishment, the Commission and the EU Centre.
2023/07/28
Committee: LIBE
Amendment 1879 #
Proposal for a regulation
Article 84 – paragraph 1 a (new)
1a. The annual report shall also include the following information:
(a) the number and subject matter of detection orders and removal orders to act against alleged online child sexual abuse and the number of notifications received in accordance with Article 32 and the effects given to those orders;
(b) the number of notifications and requests received pursuant to Articles 8a and 35a and an overview of their follow-up;
(c) information on the effectiveness of the different technologies used and on the false positive and false negative rates of those technologies, as well as statistics on appeals and the effect they have on the users of its services, and information on the effectiveness of the measures and obligations under Articles 3, 4, 5 and 7;
(d) information on the tools used by the provider to become aware of the reported online child sexual abuse, including data and aggregate statistics on how technologies used by the provider work.
2023/07/28
Committee: LIBE
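Worked example (editorial note, not part of the amendment text): the false positive and false negative rates required under point (c) follow the standard confusion-matrix definitions, FP / (FP + TN) and FN / (FN + TP). The figures in the sketch below are invented solely to show the arithmetic.

def false_positive_rate(false_positives: int, true_negatives: int) -> float:
    # Share of non-abusive items wrongly flagged: FP / (FP + TN).
    return false_positives / (false_positives + true_negatives)

def false_negative_rate(false_negatives: int, true_positives: int) -> float:
    # Share of abusive items missed: FN / (FN + TP).
    return false_negatives / (false_negatives + true_positives)

# Made-up figures: 120 wrong flags among 1,000,120 benign items is about 0.012 %.
print(false_positive_rate(120, 1_000_000))   # ≈ 0.00012
print(false_negative_rate(30, 970))          # 0.03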
Amendment 1883 #
Proposal for a regulation
Article 86 – paragraph 2
2. The power to adopt delegated acts referred to in Articles 3, 8, 13, 14, 17, 47 and 84 shall be conferred on the Commission for an indeterminate period of time from [date of adoption of the Regulation] period of 5 years from [date of adoption of the Regulation]. The Commission shall draw up a report in respect of the delegation of power not later than 9 months before the end of the five-year period. The delegation of power shall be tacitly extended for periods of an identical duration, unless the European Parliament or the Council opposes such extension not later than 3 months before the end of each period.
2023/07/28
Committee: LIBE
Amendment 1885 #
Proposal for a regulation
Article 89 – paragraph 3
This Regulation shall be binding in its entirety and directly applicable in all Member States. As from August 2024, if there is no entry into force of the proposed regulation, the regime in place shall be the one of the interim derogation, until such adoption is envisaged but no later than January 2025.
2023/07/28
Committee: LIBE
Amendment 1888 #
Proposal for a regulation
Annex I – title
DETECTION ORDERWARRANT ISSUED IN ACCORDANCE WITH REGULATION (EU) …/… LAYING DOWN RULES TO PREVENT AND COMBAT CHILD SEXUAL ABUSE (‘THE REGULATION’)
2023/07/28
Committee: LIBE
Amendment 1889 #
Proposal for a regulation
Annex I – Section 1 – paragraph 2 – introductory part
Name of the competent judicial authority or the independent administrative authority having issued the detection orderwarrant:
2023/07/28
Committee: LIBE
Amendment 1890 #
Proposal for a regulation
Annex I – Section 4 – paragraph 2 – point 2
2. The dissemination of new child sexual abuse material as defined in Article 2, letter (n), of the Regulationdeleted
2023/07/28
Committee: LIBE
Amendment 1893 #
Proposal for a regulation
Annex I – Section 4 – paragraph 2 – point 3
3. The solicitation of children as defined in Article 2, letter (o), of the Regulationdeleted
2023/07/28
Committee: LIBE
Amendment 1895 #
Proposal for a regulation
Annex II – title
TEMPLATE FOR INFORMATION ABOUT THE IMPOSSIBILITY TO EXECUTE THE DETECTION ORDERWARRANT referred to in Article 8(3) of Regulation (EU) .../… [laying down rules to prevent and combat child sexual abuse]
2023/07/28
Committee: LIBE
Amendment 1898 #
Proposal for a regulation
Annex III – Section 2 – point 2 – point 2
2. New child sexual abuse material, as defined in Article 2, letter (n), of the Regulationdeleted
2023/07/28
Committee: LIBE
Amendment 1900 #
Proposal for a regulation
Annex III – Section 2 – point 2 – point 3
3. Solicitation of children, as defined in Article 2, letter (o), of the Regulationdeleted
2023/07/28
Committee: LIBE
Amendment 1902 #
Proposal for a regulation
Annex III – Section 2 – point 3 – introductory part
3) Content data related to the reported potential online child sexual abuse, including images, and videos and texts, as applicable:
2023/07/28
Committee: LIBE
Amendment 1903 #
Proposal for a regulation
Annex III – Section 2 – point 4
4) Other available data related to the reported potential online child sexual abuse, including metadata related to media files (date, time, time zone): (Text – attach data as necessary)deleted
2023/07/28
Committee: LIBE
Amendment 1907 #
Proposal for a regulation
Annex VII
BLOCKING ORDER ISSUED IN ACCORDANCE WITH REGULATION (EU) …/… LAYING DOWN RULES TO PREVENT AND COMBAT CHILD SEXUAL ABUSE (‘THE REGULATION’) 1 Name of the Coordinating Authority having requested the issuance of the blocking order: (Text) Name of the competent judicial authority or the independent administrative authority having issued the blocking order: (Text) Reference of the blocking order: (Text) 2 Name of the provider and, where applicable, of its legal representative: (Text) Contact point: (Text) 3 The provider is to take the necessary measures to prevent users in the Union from having access to the known child sexual abuse material indicated by the following URLs: (Text) The blocking order applies to the following service provided by the provider in the Union: (Text) When executing the blocking order, the provider is to respect the following limits and/or to provide for the following safeguards, as referred to in Article 16(5) of the Regulation: (Text) 4 The reasons for issuing the blocking order are as follows: (Sufficiently detailed statement of reasons for issuing the blocking order) The blocking order applies from … (date) to ……. (date) The following reporting requirements apply, in accordance with Article 18(6) of the Regulation: (Text) 5 Contact details of the Coordinating Authority having requested the issuance of the order for feedback on the execution of the blocking order or further clarification, including the communications referred to in Article 17(5) of the Regulation: (Text) 6 Competent court before which the blocking order can be challenged, in accordance with Article 18(1) of the Regulation: (Text) Time periods for challenging the blocking order (days/months starting from): (Text) References or links to provisions of national law regarding redress: (Text) Where relevant, additional information regarding redress: (Text) A lack of compliance with this blocking order may result in penalties pursuant to Article 35 of the Regulation. 7 Date of issuance of the blocking order: (Text) Time stamp: (Text) Electronic signature of the competent judicial authority or independent administrative authority having issued the blocking order: (Text)deleted
2023/07/28
Committee: LIBE
Amendment 1909 #
Proposal for a regulation
Annex VIII
referred to in Article 17(5) of Regulation (EU) .../… [laying down rules to prevent and combat child sexual abuse] 1 Name of the provider and, where applicable, of its legal representative: (Text) Point of contact: (Text) Contact details of the provider and, where applicable, of its legal representative: (Text) File reference of the addressee (Text) 2 Name of the Coordinating Authority having requested the issuance of the blocking order: (Text) Competent judicial authority or independent administrative authority having issued the blocking order (Text) Reference of the blocking order (Text) Date and time of receipt of the blocking order, including time zone: (Text) 3 The provider cannot execute the blocking order within the mandatory time period for the following reasons (tick the relevant box(es)): 1. The blocking order contains one or more manifest errors 2. The blocking order does not contain sufficient information Specify the manifest error(s) and/or the further information or clarification necessary, as applicable: (Text) 4 Date and time, including time zone: (Text) Signature: (Text)deleted
2023/07/28
Committee: LIBE