
95 Amendments of Birgit SIPPEL related to 2022/0155(COD)

Amendment 337 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
2023/07/28
Committee: LIBE
Amendment 358 #
Proposal for a regulation
Recital 20 a (new)
(20a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore prohibit or weaken end-to-end encryption, or be interpreted in that way.
2023/07/28
Committee: LIBE
Amendment 360 #
Proposal for a regulation
Recital 20 c (new)
(20c) The act of breaking encryption refers to defeating or bypassing the encryption protocol used to secure a communication. Any access by a third party that was not meant to access, read or edit the content of a communication that was supposed to be private and secure should be considered as undermining encryption.
2023/07/28
Committee: LIBE
Amendment 364 #
Proposal for a regulation
Recital 21
(21) Furthermore, as part of those limits and safeguards, detection warrants should only be issued by a judicial authority and only with the purpose to detect known online child sexual abuse material related to a specific device or user account, where there is a reasonable suspicion that such content is stored on that device or in that user account. One of the main elements to be taken into account in this regard is the existence of evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 368 #
Proposal for a regulation
Recital 22
(22) However, the existence of evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse should in itself be insufficient to justify the issuance of a detection warrant, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection warrants can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighted, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
2023/07/28
Committee: LIBE
Amendment 502 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of number-independent interpersonal communication services to detect and report online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 508 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point c
(c) obligations on providers of hosting services to remove or disable access to child sexual abuse material on their services;
2023/07/28
Committee: LIBE
Amendment 511 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d
(d) obligations on providers of internet access services to disable access to child sexual abuse material;deleted
2023/07/28
Committee: LIBE
Amendment 524 #
Proposal for a regulation
Article 1 – paragraph 3 – point d
(d) Regulation (EU) 2016/679, Directive 2016/680, Regulation (EU) 2018/1725, and, subject to paragraph 4 of this Article, Directive 2002/58/EC.
2023/07/28
Committee: LIBE
Amendment 529 #
Proposal for a regulation
Article 1 – paragraph 3 – point d a (new)
(da) Regulation (EU) …/… [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)];
2023/07/28
Committee: LIBE
Amendment 532 #
Proposal for a regulation
Article 1 – paragraph 3 a (new)
3a. This Regulation shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
2023/07/28
Committee: LIBE
Amendment 534 #
Proposal for a regulation
Article 1 – paragraph 3 b (new)
3b. This Regulation shall not undermine the prohibition of general monitoring under Union law or introduce general data retention obligations, or be interpreted in that way.
2023/07/28
Committee: LIBE
Amendment 539 #
Proposal for a regulation
Article 1 – paragraph 4
4. This Regulation limits the exercise of the rights and obligations provided for in Article 5(1) and (3) and Article 6(1) of Directive 2002/58/EC with the sole objective of enabling a provider of hosting services, a provider of number-independent interpersonal communications services or a provider of an artificial intelligence system to use specific technologies for the processing of personal data to the extent strictly necessary to detect and report online child sexual abuse and remove child sexual abuse material on their services, following a detection warrant issued in accordance with Section 2 of Chapter 1 of this Regulation.
2023/07/28
Committee: LIBE
Amendment 542 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
4a. This Regulation does not apply to audio communications.
2023/07/28
Committee: LIBE
Amendment 563 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iii
(iii) a software applications store;deleted
2023/07/28
Committee: LIBE
Amendment 568 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv a (new)
(iva) an artificial intelligence system;
2023/07/28
Committee: LIBE
Amendment 577 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 17 years;deleted
2023/07/28
Committee: LIBE
Amendment 599 #
Proposal for a regulation
Article 2 – paragraph 1 – point s
(s) ‘content data’ means videos and images in a digital format;
2023/07/28
Committee: LIBE
Amendment 605 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(wa) ‘hotline’ means an organisation officially recognised by a Member State, other than the reporting channels provided by law enforcement authorities, for receiving anonymous complaints from victims and the public about alleged child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 618 #
Proposal for a regulation
Article 3 – paragraph 1 a (new)
1a. Without prejudice to Regulation (EU) 2022/2065, when conducting the risk assessment, providers of hosting services and providers of interpersonal communications services shall respect fundamental rights and avoid any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the rights to human dignity, respect for private and family life, the protection of personal data, freedom of expression and information, including the freedom and pluralism of the media, the prohibition of discrimination, the rights of the child and consumer protection, as enshrined in Articles 1, 7, 8, 11, 21, 24 and 38 of the Charter respectively.
2023/07/28
Committee: LIBE
Amendment 636 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
– functionalities enabling age verification;deleted
2023/07/28
Committee: LIBE
Amendment 645 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily recognisable, accessible, age-appropriate and child- and user-friendly, including anonymous reporting channels;
2023/07/28
Committee: LIBE
Amendment 649 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
– systems and mechanisms that provide child- and user-friendly resources to ensure that children can seek help swiftly, including information on how to contact national child protection organisations or national law enforcement.
2023/07/28
Committee: LIBE
Amendment 714 #
Proposal for a regulation
Article 3 – paragraph 5
5. The risk assessment shall include an assessment of any potential remaining risk that, after taking the mitigation measures pursuant to Article 4, the service is used for the purpose of online child sexual abuse.deleted
2023/07/28
Committee: LIBE
Amendment 718 #
Proposal for a regulation
Article 3 – paragraph 6
6. The Commission, in cooperation with Coordinating Authorities, the European Data Protection Board, the Fundamental Rights Agency and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
2023/07/28
Committee: LIBE
Amendment 724 #
Proposal for a regulation
Article 4 – title
4 Safety-by-design and risk mitigation
2023/07/28
Committee: LIBE
Amendment 763 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of relevant information society services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].
2023/07/28
Committee: LIBE
Amendment 767 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) reinforcing awareness-raising measures and adapting their online interface for increased user information, including child-appropriate information targeted to the risk identified;
2023/07/28
Committee: LIBE
Amendment 772 #
Proposal for a regulation
Article 4 – paragraph 1 – point c b (new)
(cb) including clearly visible and identifiable information on the minimum age for using the service;
2023/07/28
Committee: LIBE
Amendment 819 #
Proposal for a regulation
Article 4 – paragraph 5
5. The Commission, in cooperation with Coordinating Authorities, the EU Centre, the European Data Protection Board and the Fundamental Rights Agency, and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used. The European Commission, along with the European Data Protection Board and the Fundamental Rights Agency, shall issue guidelines on how providers may implement age verification and age assessment measures, in particular based on selective disclosure of attributes, with full respect for the Charter of Fundamental Rights and Regulation (EU) 2016/679.
2023/07/28
Committee: LIBE
Amendment 823 #
Proposal for a regulation
Article 4 – paragraph 5 a (new)
5a. Prior to the deployment of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
2023/07/28
Committee: LIBE
Amendment 835 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) the process and the results of the risk assessment conducted or updated pursuant to Article 3, including the assessment of any potential remaining risk referred to in Article 3(5);
2023/07/28
Committee: LIBE
Amendment 848 #
Proposal for a regulation
Article 5 – paragraph 6
6. Providers shall, upon request, transmit the report to the providers of software application stores, insofar as necessary for the assessment referred to in Article 6(2). Where necessary, they may remove confidential information from the reports.deleted
2023/07/28
Committee: LIBE
Amendment 1056 #
Proposal for a regulation
Article 8 – title
Additional rules regarding detection warrants
2023/07/28
Committee: LIBE
Amendment 1061 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the detection warrants referred to in Article 7 using the template set out in Annex I. Detection warrants shall include:
2023/07/28
Committee: LIBE
Amendment 1066 #
Proposal for a regulation
Article 8 – paragraph 1 – point a a (new)
(aa) information, with respect to each device or user account, detailing the specific purpose and scope of the warrant, including the legal basis for the reasonable suspicion.
2023/07/28
Committee: LIBE
Amendment 1072 #
Proposal for a regulation
Article 8 – paragraph 1 – point e
(e) whether the detection order issued concerns the dissemination of known or new child sexual abuse material or the solicitation of children;deleted
2023/07/28
Committee: LIBE
Amendment 1077 #
Proposal for a regulation
Article 8 – paragraph 1 – point g
(g) a sufficiently detailed statement of justifications explaining why the detection warrant is issued and how it is necessary, effective and proportionate;
2023/07/28
Committee: LIBE
Amendment 1105 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of hosting services and providers of number-independent interpersonal communications services that have received a detection warrant, as well as users affected by the measures taken to execute it, shall have a right to information and effective redress. That right shall include the right to challenge the detection warrant before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection order.
2023/07/28
Committee: LIBE
Amendment 1145 #
Proposal for a regulation
Article 10 – paragraph 3 – point a
(a) effective in detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
2023/07/28
Committee: LIBE
Amendment 1147 #
Proposal for a regulation
Article 10 – paragraph 3 – point b
(b) not be able to extract any other information from the relevant communications than the information strictly necessary to detect, using the indicators referred to in paragraph 1, patterns pointing to the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
2023/07/28
Committee: LIBE
Amendment 1149 #
Proposal for a regulation
Article 10 – paragraph 3 – point c
(c) in accordance with the state of the art in the industry and the least intrusive in terms of the impact on the users’ rights to private and family life, including the confidentiality of communication, and to protection of personal data. It shall not weaken or undermine end-to-end encryption and shall not limit providers of information society services from providing their services applying end-to-end encryption;
2023/07/28
Committee: LIBE
Amendment 1158 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) ensure that the interference with the fundamental right to privacy and the other rights laid down in the Charter is limited to what is strictly necessary.
2023/07/28
Committee: LIBE
Amendment 1169 #
Proposal for a regulation
Article 10 – paragraph 4 – point -a (new)
(-a) ensure privacy by design and safety-by-design and by default and, where applicable, the protection of encryption.
2023/07/28
Committee: LIBE
Amendment 1183 #
Proposal for a regulation
Article 10 – paragraph 4 – point d
(d) establish and operate an accessible, age-appropriate and user- and child-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
2023/07/28
Committee: LIBE
Amendment 1184 #
Proposal for a regulation
Article 10 – paragraph 4 – point e
(e) inform the Coordinating Authority and competent Data Protection Authority, at the latest one month before the start date specified in the detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3);
2023/07/28
Committee: LIBE
Amendment 1187 #
Proposal for a regulation
Article 10 – paragraph 4 – point e a (new)
(ea) request that, in respect of any specific technology used for the purpose set out in this Article, a prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a prior consultation procedure as referred to in Article 36 of that Regulation have been conducted;
2023/07/28
Committee: LIBE
Amendment 1190 #
Proposal for a regulation
Article 10 – paragraph 4 a (new)
4a. The provider shall, in respect of any specific technology used for the purpose set out in this Article, conduct a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation.
2023/07/28
Committee: LIBE
Amendment 1192 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point a
(a) the fact that it operates technologies to detect known child sexual abuse material to execute the detection warrant, the ways in which it operates those technologies and the impact on the users’ fundamental rights to private and family life, including the confidentiality of users’ communications and the protection of personal data;
2023/07/28
Committee: LIBE
Amendment 1215 #
Proposal for a regulation
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of number-independent interpersonal communications services becomes aware, in any manner other than through a removal order issued in accordance with this Regulation, of any information indicating alleged online child sexual abuse on its services, it shall report, without delay, that abuse to the competent law enforcement and independent judicial authorities and submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
2023/07/28
Committee: LIBE
Amendment 1243 #
Proposal for a regulation
Article 13 – paragraph 1 – point c a (new)
(ca) where applicable, an exact uniform resource locator and, where necessary, additional information for the identification of the child sexual abuse material;
2023/07/28
Committee: LIBE
Amendment 1246 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) all available data other than content data related to the potential online child sexual abuse;deleted
2023/07/28
Committee: LIBE
Amendment 1253 #
Proposal for a regulation
Article 13 – paragraph 1 – point f
(f) information concerning the geographic location related to the potential online child sexual abuse, such as the Internet Protocol address;deleted
2023/07/28
Committee: LIBE
Amendment 1273 #
Proposal for a regulation
Article 14 – paragraph 2
2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium enterprises, including open source providers, the removal order shall allow additional time, proportionate to the size and the resources of the provider.
2023/07/28
Committee: LIBE
Amendment 1286 #
Proposal for a regulation
Article 15 – paragraph 1
1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to an effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the removal order.
2023/07/28
Committee: LIBE
Amendment 1293 #
Proposal for a regulation
Chapter II – Section 5
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1297 #
Proposal for a regulation
Article 16
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1312 #
Proposal for a regulation
Article 17
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1321 #
Proposal for a regulation
Article 18
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1359 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of relevant information society services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
2023/07/28
Committee: LIBE
Amendment 1367 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask and receive any information relating to such support in a manner accessible to them.
2023/07/28
Committee: LIBE
Amendment 1374 #
Proposal for a regulation
Article 21 a (new)
Article 21a
Right to lodge a complaint with a supervisory authority
1. Without prejudice to any other administrative or judicial remedy, every user shall have the right to lodge a complaint with a supervisory authority, in particular in the Member State of his or her habitual residence, place of work or place of the alleged infringement, if the user considers that the processing of personal data relating to him or her infringes this Regulation or Regulation (EU) 2016/679.
2. The supervisory authority with which the complaint has been lodged shall inform the complainant on the progress and the outcome of the complaint, including the possibility of a judicial remedy pursuant to Article 21b.
2023/07/28
Committee: LIBE
Amendment 1375 #
Proposal for a regulation
Article 21 b (new)
Article 21b
Right to an effective judicial remedy against a provider of hosting services or a provider of a number-independent interpersonal communications service
1. Without prejudice to any available administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority pursuant to Article 21a, each user shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation or Regulation (EU) 2016/679.
2. Proceedings against a provider of hosting services or a provider of a number-independent interpersonal communications service shall be brought before the courts of the Member State where the provider has an establishment. Alternatively, such proceedings may be brought before the courts of the Member State where the user has his or her habitual residence.
2023/07/28
Committee: LIBE
Amendment 1377 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Providers of hosting services and providers of number-independent interpersonal communications services shall preserve the content data and other data processed in connection with the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
2023/07/28
Committee: LIBE
Amendment 1384 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 2
As regards the first subparagraph, point (a), the provider may also preserve the information for the purpose of improving the effectiveness and accuracy of the technologies to detect online child sexual abuse for the execution of a detection order issued to it in accordance with Article 7. However, it shall not store any personal data for that purpose.deleted
2023/07/28
Committee: LIBE
Amendment 1478 #
Proposal for a regulation
Article 35 – paragraph 4 a (new)
4a. Member States shall ensure that penalties imposed for the infringement of this Regulation do not encourage over-reporting or the removal of material which does not constitute child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1479 #
Proposal for a regulation
Article 35 a (new)
Article 35a
Compensation
Users, and any body, organisation or association mandated to exercise the rights conferred by this Regulation on their behalf, shall have the right to seek, in accordance with Union and national law, compensation from providers of relevant information society services for any damage or loss suffered due to an infringement by those providers of their obligations under this Regulation.
2023/07/28
Committee: LIBE
Amendment 1514 #
Proposal for a regulation
Article 38 – paragraph 2 a (new)
2a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of online child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to remove content and coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
2023/07/28
Committee: LIBE
Amendment 1525 #
Proposal for a regulation
Article 39 – paragraph 3 a (new)
3a. Where the EU Centre receives a report from a hotline, or where a provider that submitted the report to the EU Centre has indicated that the report is based on the information received from a hotline, the EU Centre shall coordinate with the relevant Coordinating Authorities in order to avoid duplicated reporting on the same material that has already been reported to the national law enforcement authorities by the hotlines, and monitor the removal of the child sexual abuse material or cooperate with the relevant hotline to track the status.
2023/07/28
Committee: LIBE
Amendment 1577 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) providing technical expertise and promoting the exchange of best practices among Member States on raising awareness for the prevention of child sexual abuse online in formal and non-formal education. Such efforts shall be age-appropriate and gender-sensitive;
2023/07/28
Committee: LIBE
Amendment 1582 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b b (new)
(bb) exchanging best practices among Coordinating Authorities regarding the available tools to reduce the risk of children becoming victims of sexual abuse and to provide specialized assistance to survivors, in an age-appropriate and gender-sensitive way.
2023/07/28
Committee: LIBE
Amendment 1603 #
Proposal for a regulation
Article 44 – paragraph 1 – point b
(b) indicators to detect the dissemination of child sexual abuse material not previously detected and identified as constituting child sexual abuse material in accordance with Article 36(1);deleted
2023/07/28
Committee: LIBE
Amendment 1606 #
Proposal for a regulation
Article 44 – paragraph 1 – point c
(c) indicators to detect the solicitation of children.deleted
2023/07/28
Committee: LIBE
Amendment 1716 #
Proposal for a regulation
Article 50 – paragraph 5
5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. Communication campaigns shall be easily understandable and accessible to all children, their families and educators in formal and non-formal education in the Union, aiming to improve digital literacy and ensure a safe digital environment for children. Communication campaigns shall take into account the gender dimension of the crime.
2023/07/28
Committee: LIBE
Amendment 1742 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 1
Europol and the EU Centre shall provide each other with the fullest possible access to relevant information and information systems, where necessary for the performance of their respective tasks and in accordance with the acts of Union law regulating such access.deleted
2023/07/28
Committee: LIBE
Amendment 1745 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 2
Without prejudice to the responsibilities of the Executive Director, the EU Centre shall maximise efficiency by sharing administrative functions with Europol, including functions relating to personnel management, information technology (IT) and budget implementation.deleted
2023/07/28
Committee: LIBE
Amendment 1753 #
Proposal for a regulation
Article 53 – paragraph 3
3. The terms of cooperation and working arrangements shall be laid down in a publicly accessible memorandum of understanding.
2023/07/28
Committee: LIBE
Amendment 1762 #
Proposal for a regulation
Article 55 – paragraph 1 – introductory part
The administrative and management structure of the EU Centre shall be gender- balanced and comprise:
2023/07/28
Committee: LIBE
Amendment 1764 #
Proposal for a regulation
Article 55 – paragraph 1 – point d a (new)
(da) a Fundamental Rights Officer, who shall exercise the tasks set out in Article 66b;
2023/07/28
Committee: LIBE
Amendment 1765 #
Proposal for a regulation
Article 55 – paragraph 1 – point d b (new)
(db) an Expert's Consultative Forum, which shall exercise the tasks set out in Article 66a;
2023/07/28
Committee: LIBE
Amendment 1767 #
Proposal for a regulation
Article 56 – paragraph 1
1. The Management Board shall be gender-balanced and composed of one representative from each Member State and two representatives of the Commission, all as members with voting rights.
2023/07/28
Committee: LIBE
Amendment 1780 #
Proposal for a regulation
Article 57 – paragraph 1 – point f a (new)
(fa) appoint a Data Protection Officer;
2023/07/28
Committee: LIBE
Amendment 1781 #
Proposal for a regulation
Article 57 – paragraph 1 – point f b (new)
(fb) appoint a Fundamental Rights Officer;
2023/07/28
Committee: LIBE
Amendment 1806 #
Proposal for a regulation
Article 66 a (new)
Article 66a
Establishment and tasks of the Expert's Consultative Forum
1. The EU Centre shall establish a Consultative Forum to assist it by providing it with independent advice on survivor-related matters. The Consultative Forum shall act upon request of the Management Board or the Executive Director.
2. The Consultative Forum shall consist of a maximum of fifteen members. Members of the Consultative Forum shall, in an equal manner, be appointed from child survivors and parents of child survivors, as well as representatives of organisations acting in the public interest, including:
(a) organisations representing or promoting the rights of the LGBTQIA+ community, specifically minors;
(b) organisations representing or promoting children's rights;
(c) organisations representing or promoting child survivors' rights;
(d) organisations representing or promoting digital rights.
They shall be appointed by the Management Board following the publication of a call for expression of interest in the Official Journal of the European Union.
3. The mandate of members of the Consultative Forum shall be four years. Those mandates shall be renewable once.
4. The Consultative Forum shall:
(a) provide the Management Board and the Executive Director with advice on matters related to survivors;
(b) provide the Management Board, the Executive Director and the Technology Committee with advice on preventive measures for relevant information society services;
(c) contribute to the EU Centre communication strategy referred to in Article 50(5);
(d) provide its opinion on the proportionality of technologies used to detect known child sexual abuse;
(e) maintain an open dialogue with the Management Board and the Executive Director on all matters related to survivors, particularly on the protection of survivors’ rights and digital rights.
2023/07/28
Committee: LIBE
Amendment 1807 #
Proposal for a regulation
Chapter IV – Section 5 – Part 3 a (new)
3a. Part 3a (new): Fundamental Rights Protection
Article 66b
Fundamental rights officer
1. A fundamental rights officer shall be appointed by the management board on the basis of a list of three candidates, after consultation with the Expert's Consultative Forum. The fundamental rights officer shall have the necessary qualifications, expert knowledge and professional experience in the field of fundamental rights.
2. The fundamental rights officer shall perform the following tasks:
(a) contributing to the Centre's fundamental rights strategy and the corresponding action plan, including by issuing recommendations for improving them;
(b) monitoring the Centre's compliance with fundamental rights, including by conducting investigations into any of its activities;
(c) promoting the Centre's respect of fundamental rights;
(d) advising the Centre where he or she deems it necessary or where requested on any activity of the Centre without delaying those activities;
(e) providing opinions on working arrangements;
(f) providing the secretariat of the consultative forum;
(g) informing the management board and executive director about possible violations of fundamental rights during activities of the Centre;
(h) performing any other tasks, where provided for by this Regulation.
3. The Management Board shall lay down special rules applicable to the fundamental rights officer in order to guarantee that the fundamental rights officer and his or her staff are independent in the performance of their duties. The fundamental rights officer shall report directly to the Management Board and shall cooperate with the Technology Committee. The management board shall ensure that action is taken with regard to recommendations of the fundamental rights officer. In addition, the fundamental rights officer shall publish annual reports on his or her activities and on the extent to which the activities of the Centre respect fundamental rights. Those reports shall include information on the complaints mechanism and the implementation of the fundamental rights strategy.
4. The Centre shall ensure that the fundamental rights officer is able to act autonomously and is able to be independent in the conduct of his or her duties. The fundamental rights officer shall have sufficient and adequate human and financial resources at his or her disposal necessary for the fulfilment of his or her tasks. The fundamental rights officer shall select his or her staff, and that staff shall only report to him or her.
5. The fundamental rights officer shall be assisted by a deputy fundamental rights officer. The deputy fundamental rights officer shall be appointed by the management board from a list of at least three candidates presented by the fundamental rights officer. The deputy fundamental rights officer shall have the necessary qualifications and experience in the field of fundamental rights and shall be independent in the conduct of his or her duties. If the fundamental rights officer is absent or indisposed, the deputy fundamental rights officer shall assume the fundamental rights officer's duties and responsibilities.
6. The fundamental rights officer shall have access to all information concerning respect for fundamental rights in all the activities of the Centre.
Article 66c
Complaints mechanism
1. The Centre shall, in cooperation with the fundamental rights officer, take the necessary measures to set up and further develop an independent and effective complaints mechanism in accordance with this Article to monitor and ensure respect for fundamental rights in all the activities of the Centre.
2. Any person who is directly affected by the actions or failure to act on the part of staff involved in a joint operation, pilot project, or an operational activity of the Centre, and who considers himself or herself to have been the subject of a breach of his or her fundamental rights due to those actions or that failure to act, or any party representing such a person, may submit a complaint in writing to the Centre.
3. The fundamental rights officer shall be responsible for handling complaints received by the Centre in accordance with the right to good administration. For that purpose, the fundamental rights officer shall review the admissibility of a complaint, register admissible complaints, forward all registered complaints to the executive director and forward complaints concerning members of the teams to the relevant authority or body competent for fundamental rights for further action in accordance with their mandate. The fundamental rights officer shall also register and ensure the follow-up by the Centre or that authority or body.
4. In accordance with the right to good administration, if a complaint is admissible, complainants shall be informed that the complaint has been registered, that an assessment has been initiated and that a response may be expected as soon as it becomes available. If a complaint is forwarded to national authorities or bodies, the complainant shall be provided with their contact details. If a complaint is declared inadmissible, the complainant shall be informed of the reasons and, if possible, provided with further options for addressing their concerns. The Centre shall provide for an appropriate procedure in cases where a complaint is declared inadmissible or unfounded. Any decision shall be in written form and reasoned. The fundamental rights officer shall reassess the complaint if the complainant submits new evidence in situations where the complaint has been declared inadmissible or unfounded.
5. In the case of a registered complaint concerning a staff member of the Centre, the fundamental rights officer shall recommend appropriate follow-up, including disciplinary measures, to the executive director and, where appropriate, referral for the initiation of civil or criminal justice proceedings in accordance with this Regulation and national law. The executive director shall ensure the appropriate follow-up and shall report back to the fundamental rights officer within a determined timeframe and, if necessary, at regular intervals thereafter, as to the findings, the implementation of disciplinary measures, and follow-up by the Centre in response to a complaint. If a complaint is related to data protection issues, the executive director shall consult the data protection officer of the Centre before taking a decision on the complaint. The fundamental rights officer and the data protection officer shall establish, in writing, a memorandum of understanding specifying their division of tasks and cooperation as regards complaints received.
6. The fundamental rights officer shall include information on the complaints mechanism in his or her annual report, as referred to in Article 66a, including specific references to the Centre's findings and the follow-up to complaints.
7. The fundamental rights officer shall, in accordance with paragraphs 1 to 9 and after consulting the experts council, draw up a standardised complaint form requiring detailed and specific information concerning the alleged breach of fundamental rights. The fundamental rights officer shall also draw up any further detailed rules as necessary. The fundamental rights officer shall submit that form and such further detailed rules to the executive director and to the management board. The Centre shall ensure that information about the possibility and procedure for making a complaint is readily available, including for vulnerable persons. The standardised complaint form shall be made available on the Centre's website and in hardcopy during all activities of the Centre in languages that third-country nationals understand or are reasonably believed to understand. The standardised complaint form shall be easily accessible, including on mobile devices. The Centre shall ensure that further guidance and assistance on the complaints procedure is provided to complainants. Complaints shall be considered by the fundamental rights officer even when they have not been submitted in the standardised complaint form.
8. Any personal data contained in a complaint shall be handled and processed by the Centre, including the fundamental rights officer, in accordance with Regulation (EU) 2018/1725. Where a complainant submits a complaint, that complainant shall be understood to consent to the processing of his or her personal data by the Centre and the fundamental rights officer within the meaning of point (d) of Article 5(1) of Regulation (EU) 2018/1725. In order to safeguard the interests of the complainants, complaints shall be dealt with confidentially by the fundamental rights officer in accordance with national and Union law unless the complainant explicitly waives his or her right to confidentiality. When complainants waive their right to confidentiality, it shall be understood that they consent to the fundamental rights officer or the Centre disclosing their identity to the competent authorities or bodies in relation to the matter under complaint, where necessary.
2023/07/28
Committee: LIBE
Amendment 1888 #
Proposal for a regulation
Annex I – title
DETECTION WARRANT ISSUED IN ACCORDANCE WITH REGULATION (EU) …/… LAYING DOWN RULES TO PREVENT AND COMBAT CHILD SEXUAL ABUSE (‘THE REGULATION’)
2023/07/28
Committee: LIBE
Amendment 1889 #
Proposal for a regulation
Annex I – Section 1 – paragraph 2 – introductory part
Name of the competent judicial authority or the independent administrative authority having issued the detection warrant:
2023/07/28
Committee: LIBE
Amendment 1890 #
Proposal for a regulation
Annex I – Section 4 – paragraph 2 – point 2
2. The dissemination of new child sexual abuse material as defined in Article 2, letter (n), of the Regulationdeleted
2023/07/28
Committee: LIBE
Amendment 1893 #
Proposal for a regulation
Annex I – Section 4 – paragraph 2 – point 3
3. The solicitation of children as defined in Article 2, letter (o), of the Regulationdeleted
2023/07/28
Committee: LIBE
Amendment 1895 #
Proposal for a regulation
Annex II – title
TEMPLATE FOR INFORMATION ABOUT THE IMPOSSIBILITY TO EXECUTE THE DETECTION WARRANT referred to in Article 8(3) of Regulation (EU) .../… [laying down rules to prevent and combat child sexual abuse]
2023/07/28
Committee: LIBE
Amendment 1898 #
Proposal for a regulation
Annex III – Section 2 – point 2 – point 2
2. New child sexual abuse material, as defined in Article 2, letter (n), of the Regulationdeleted
2023/07/28
Committee: LIBE
Amendment 1902 #
Proposal for a regulation
Annex III – Section 2 – point 3 – introductory part
3) Content data related to the reported potential online child sexual abuse, including images and videos, as applicable:
2023/07/28
Committee: LIBE
Amendment 1903 #
Proposal for a regulation
Annex III – Section 2 – point 4
4) Other available data related to the reported potential online child sexual abuse, including metadata related to media files (date, time, time zone): (Text – attach data as necessary)deleted
2023/07/28
Committee: LIBE
Amendment 1907 #
Proposal for a regulation
Annex VII
BLOCKING ORDER ISSUED IN ACCORDANCE WITH REGULATION (EU) …/… LAYING DOWN RULES TO PREVENT AND COMBAT CHILD SEXUAL ABUSE (‘THE REGULATION’) 1 Name of the Coordinating Authority having requested the issuance of the blocking order: (Text) Name of the competent judicial authority or the independent administrative authority having issued the blocking order: (Text) Reference of the blocking order: (Text) 2 Name of the provider and, where applicable, of its legal representative: (Text) Contact point: (Text) 3 The provider is to take the necessary measures to prevent users in the Union from having access to the known child sexual abuse material indicated by the following URLs: (Text) The blocking order applies to the following service provided by the provider in the Union: (Text) When executing the blocking order, the provider is to respect the following limits and/or to provide for the following safeguards, as referred to in Article 16(5) of the Regulation: (Text) 4 The reasons for issuing the blocking order are as follows: (Sufficiently detailed statement of reasons for issuing the blocking order) The blocking order applies from … (date) to ……. (date) The following reporting requirements apply, in accordance with Article 18(6) of the Regulation: (Text) 5 Contact details of the Coordinating Authority having requested the issuance of the order for feedback on the execution of the blocking order or further clarification, including the communications referred to in Article 17(5) of the Regulation: (Text) 6 Competent court before which the blocking order can be challenged, in accordance with Article 18(1) of the Regulation: (Text) Time periods for challenging the blocking order (days/months starting from): (Text) References or links to provisions of national law regarding redress: (Text) Where relevant, additional information regarding redress: (Text) A lack of compliance with this blocking order may result in penalties pursuant to Article 35 of the Regulation. 7 Date of issuance of the blocking order: (Text) Time stamp: (Text) Electronic signature of the competent judicial authority or independent administrative authority having issued the blocking order: (Text)deleted
2023/07/28
Committee: LIBE
Amendment 1909 #
Proposal for a regulation
Annex VIII
referred to in Article 17(5) of Regulation (EU) .../… [laying down rules to prevent and combat child sexual abuse] 1 Name of the provider and, where applicable, of its legal representative: (Text) Point of contact: (Text) Contact details of the provider and, where applicable, of its legal representative: (Text) File reference of the addressee (Text) 2 Name of the Coordinating Authority having requested the issuance of the blocking order: (Text) Competent judicial authority or independent administrative authority having issued the blocking order (Text) Reference of the blocking order (Text) Date and time of receipt of the blocking order, including time zone: (Text) 3 The provider cannot execute the blocking order within the mandatory time period for the following reasons (tick the relevant box(es)): 1. The blocking order contains one or more manifest errors 2. The blocking order does not contain sufficient information Specify the manifest error(s) and/or the further information or clarification necessary, as applicable: (Text) 4 Date and time, including time zone: (Text) Signature: (Text)deleted
2023/07/28
Committee: LIBE