
Activities of Patrick BREYER related to 2022/0155(COD)

Shadow reports (1)

REPORT on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
2023/11/16
Committee: LIBE
Dossiers: 2022/0155(COD)
Documents: PDF(1 MB) DOC(555 KB)
Authors: Javier ZARZALEJOS (MEP ID 197606)

Amendments (413)

Amendment 281 #
Proposal for a regulation
Title 1
Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL laying down rules to address online child sexual abuse (Text with EEA relevance)
2023/07/28
Committee: LIBE
Amendment 290 #
Proposal for a regulation
Recital 2
(2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and to help combat such abuse. The measures taken should be targeted, effective, evidence-based, proportionate, and subject to constant review, so as to avoid any undue negative consequences for the fight against crime and for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid directly or indirectly imposing any excessive burdens on the providers of the services.
2023/07/28
Committee: LIBE
Amendment 298 #
Proposal for a regulation
Recital 3
(3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, may have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.
2023/07/28
Committee: LIBE
Amendment 303 #
Proposal for a regulation
Recital 4
(4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is demonstrably and durably effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in a technology-neutral and future-proof manner, so as not to hamper the fight against crime.
2023/07/28
Committee: LIBE
Amendment 310 #
Proposal for a regulation
Recital 5
(5) In order to achieve the objectives of this Regulation, it should cover providers of services that are misused to a significant extent for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services could include publicly available number-independent interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting are also at risk of misuse, they should also be covered by this Regulation. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate manner.
2023/07/28
Committee: LIBE
Amendment 313 #
Proposal for a regulation
Recital 6
(6) Online child sexual abuse can also involve the misuse of information society services offered in the Union by providers established in third countries. In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to all providers, irrespective of their place of establishment or residence, that offer services in the Union, as evidenced by a substantial connection to the Union.
2023/07/28
Committee: LIBE
Amendment 314 #
Proposal for a regulation
Recital 7
(7) This Regulation should be without prejudice to the rules resulting from other Union acts, in particular Directive 2011/93 of the European Parliament and of the Council 38, Directive 2000/31/EC of the European Parliament and of the Council 39 and Regulation (EU) 2022/2065 of the European Parliament and of the Council 40, Directive 2010/13/EU of the European Parliament and of the Council 41, Regulation (EU) 2016/679 of the European Parliament and of the Council 42, and Directive 2002/58/EC of the European Parliament and of the Council 43. (This amendment applies throughout the text. Adopting it will necessitate corresponding changes throughout.) _________________ 38 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1). 39 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1). 40 Regulation (EU) 2022/2065 of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC (OJ L ….). 41 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media service (OJ L 95, 15.4.2010, p. 1). 42 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (OJ L 119, 4.5.2016, p. 1). 43 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (‘Directive on privacy and electronic communications’) (OJ L 201, 31.7.2002, p. 37).
2023/07/28
Committee: LIBE
Amendment 316 #
Proposal for a regulation
Recital 9
(9) Article 15(1) of Directive 2002/58/EC allows Member States to adopt legislative measures to restrict the scope of the rights and obligations provided for in certain specific provisions of that Directive relating to the confidentiality of communications when such restriction constitutes a necessary, appropriate and proportionate measure within a democratic society, inter alia, to prevent, investigate, detect and prosecute criminal offences, provided certain conditions are met, including compliance with the Charter, which, inter alia, requires the specific measures to be provided for by law and genuinely achieve objectives of general interest. Applying the requirements of that provision by analogy, this Regulation should limit the exercise of the rights and obligations provided for in Articles 5(1), (3) and 6(1) of Directive 2002/58/EC, insofar as strictly necessary in line with Article 52 of the Charter, to execute detection orders issued in accordance with this Regulation with a view to prevent and combat online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 319 #
Proposal for a regulation
Recital 11
(11) A substantial connection to the Union should be considered to exist where the relevant information society services has an establishment in the Union or, in its absence, on the basis of the existence of a significant number, in relation to population size, of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of a software application in the relevant national software application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1), point (c), of Regulation (EU) 1215/2012 of the European Parliament and of the Council 44. Mere technical accessibility of a website from the Union cannot, alone, be considered as establishing a substantial connection to the Union. _________________ 44 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
2023/07/28
Committee: LIBE
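
Read technically, the factors listed in Recital 11 above amount to a checklist that can be evaluated against observable facts about a service. The sketch below is illustrative only and is not part of the amendment or the Regulation; the field names and the 1% user-share threshold are assumptions chosen for the example, since the text only speaks of a "significant number, in relation to population size".

```python
from dataclasses import dataclass, field

@dataclass
class ServiceFacts:
    """Observed facts about how a service is offered (hypothetical fields)."""
    has_eu_establishment: bool = False
    users_per_member_state: dict[str, int] = field(default_factory=dict)
    population_per_member_state: dict[str, int] = field(default_factory=dict)
    uses_member_state_language_or_currency: bool = False
    uses_national_tld: bool = False
    listed_in_national_app_store: bool = False
    local_advertising_or_support: bool = False

def substantial_connection(facts: ServiceFacts, user_share_threshold: float = 0.01) -> bool:
    """Rough checklist following the factors named in Recital 11."""
    if facts.has_eu_establishment:
        return True
    # Significant number of users relative to the population of a Member State.
    for ms, users in facts.users_per_member_state.items():
        population = facts.population_per_member_state.get(ms, 0)
        if population and users / population >= user_share_threshold:
            return True
    # Targeting of activities towards one or more Member States.
    if (facts.uses_member_state_language_or_currency
            or facts.uses_national_tld
            or facts.listed_in_national_app_store
            or facts.local_advertising_or_support):
        return True
    # Mere technical accessibility of a website alone does not establish a connection.
    return False
```

Any real assessment would of course weigh these factors case by case rather than apply a fixed rule.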
Amendment 324 #
Proposal for a regulation
Recital 14
(14) With a view to minimising the risk that their services are misused for the dissemination of known or new child sexual abuse material or the solicitation of children, providers of hosting services and providers of publicly available number-independent interpersonal communications services should assess the existence of a recurring systemic risk for each of the services that they offer in the Union. To guide their risk assessment, a non-exhaustive list of elements to be taken into account should be provided. To allow for a full consideration of the specific characteristics of the services they offer, providers should be allowed to take account of additional elements where relevant. As risks evolve over time, in function of developments such as those related to technology and the manners in which the services in question are offered and used, it is appropriate to ensure that the risk assessment, as well as the effectiveness and proportionality of mitigation measures, are updated regularly and when needed for particular reasons.
2023/07/28
Committee: LIBE
Amendment 330 #
Proposal for a regulation
Recital 15
(15) Some of those providers of relevant information society services in scope of this Regulation may also be subject to an obligation to conduct a risk assessment under Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] with respect to information that they store and disseminate to the public, which should form the basis for the risk assessment under this instrument.
2023/07/28
Committee: LIBE
Amendment 334 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable specific measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] should address the risk identified in the specific risk assessment.
2023/07/28
Committee: LIBE
Amendment 343 #
Proposal for a regulation
Recital 17
(17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory specific measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risks, exposure and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect online child sexual abuse. Specific measures could include providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, such as mechanisms for users to block or mute other users, mechanisms that ask for confirmation before displaying certain content, and tools that prompt or warn users.
2023/07/28
Committee: LIBE
Amendment 345 #
(17a) While age verification tools may be one possible method of mitigating risk, many currently-known age verification methods create a risk of systemic violations of privacy and data protection. This includes, inter alia, the mass profiling of the users, the biometric analysis of the user’s face and/or voice, or the deployment of digital identification/certification systems, none of which currently respects individuals’ fundamental rights sufficiently to justify its large-scale or mandatory deployment. Implementation of any of these measures by the providers of communication services would necessarily add another layer of interference with the rights and freedoms of the users, or unduly restrict access to services to people who appear younger or older than their actual age or people who do not have the necessary identification documents. As such, methods to verify or assess the age of users should not be mandatory and should, if used, be approached with caution and allow for alternatives, to ensure the protection of rights to privacy and data protection of all internet users in line with the GDPR, and to ensure that it remains possible for law-abiding internet users to remain anonymous.
2023/07/28
Committee: LIBE
Amendment 346 #
Proposal for a regulation
Recital 17 b (new)
(17b) Relying on providers for risk mitigation measures comes with inherent risks, as business models, technologies and crimes evolve continuously. As a result, clear targets, oversight, review and adaptation, led by national supervisory authorities are needed, to avoid measures becoming redundant, disproportionate, ineffective, counterproductive and outdated.
2023/07/28
Committee: LIBE
Amendment 347 #
Proposal for a regulation
Recital 18
(18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on specific measures. Therefore, providers of hosting services and providers of publicly available number-independent interpersonal communications services should, when designing and implementing the specific measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which specific measures should reasonably be taken in a given situation, account should also be taken of the ongoing effectiveness of the measures, the financial and technological capabilities and the size of the provider concerned. When selecting appropriate specific measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such as those based on industry best practices, including as established through self-regulatory cooperation, and those contained in guidelines from the Commission quantifying the expected impact of the available measures. Objective data on ongoing effectiveness must be provided, in order for any measure to be recognised as best practice.
2023/07/28
Committee: LIBE
Amendment 350 #
Proposal for a regulation
Recital 19
(19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take certain reasonable measures to assess and mitigate that risk. The providers should make that assessment in a diligent manner, making efforts that are reasonable under the given circumstances, having regard inter alia to the nature and extent of that risk as well as their financial and technological capabilities and size, and cooperating with the providers of the services offered through the software application where possible.deleted
2023/07/28
Committee: LIBE
Amendment 355 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders targeting suspects. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of targets, limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services.
2023/07/28
Committee: LIBE
Amendment 363 #
Proposal for a regulation
Recital 21
(21) Furthermore, as parts of those limits and safeguards, detection orders should only be issued after a diligent and objective assessment leading to the finding of a significant risk of the specific service concerned being misused for a given type of online child sexual abuse covered by this Regulation. One of the elements to be taken into account in this regard is the likelihood that the service is used to an appreciable extent, that is, beyond isolated and relatively rare instances, for such abuse. The criteria should require a reasonable suspicion of the service being used for the purpose of online child sexual abuse by one or more suspects.
2023/07/28
Committee: LIBE
Amendment 369 #
Proposal for a regulation
Recital 22
(22) However, the finding of such evidence should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority having objectively and diligently assessed, identified and weighted, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the specific results anticipated by the measure, the likelihood and seriousness of any potential negative consequences for other parties affected. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
2023/07/28
Committee: LIBE
Amendment 372 #
Proposal for a regulation
Recital 23
(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted, has quantifiable targets, is limited in time and is specified so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively and demonstrably mitigate the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available interpersonal communications service, or to specific user or users. Safeguards additional to the ones already expressly specified in this Regulation should in addition be specified. Independent auditing, in particular of the achievement of the anticipated results, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order should be prerequisites of such orders being made. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
2023/07/28
Committee: LIBE
Amendment 376 #
Proposal for a regulation
Recital 24
(24) The competent judicial authority or the competent independent administrative authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should have the data necessary to be in a position to take a well-informed decision on requests for the issuance of detection orders. That is of particular importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach, especially in connection to detection orders concerning the solicitation of children. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU Centre’) and, where so provided in this Regulation, the competent data protection authority designated under Regulation (EU) 2016/679 to provide their views on the measures in question, as per Article 35 and 36 of that Regulation. They should do so without undue delay, having regard to the important public policy objective at stake and the need to act without undue delay to protect children. Furthermore, data protection authorities should do their utmost to avoid extending the time period set out in Regulation (EU) 2016/679 for providing their opinions in response to a prior consultation. Furthermore, they should provide their opinion in a timely manner including in situations where the European Data Protection Board has already issued guidelines regarding the technologies that a provider envisages deploying and operating to execute a detection order addressed to it under this Regulation.
2023/07/28
Committee: LIBE
Amendment 377 #
Proposal for a regulation
Recital 25
(25) Where new services are concerned, that is, services not previously offered in the Union, the evidence available on the potential misuse of the service in the last 12 months is normally non-existent. Taking this into account, and to ensure the effectiveness of this Regulation, the Coordinating Authority should be able to draw on evidence stemming from comparable services when assessing whether to request the issuance of a detection order in respect of such a new service. A service should be considered comparable where it provides a functional equivalent to the service in question, having regard to all relevant facts and circumstances, in particular its main characteristics and functionalities, the manner in which it is offered and used, the user base, the applicable terms and conditions and risk mitigation measures, as well as the overall remaining risk profile.deleted
2023/07/28
Committee: LIBE
Amendment 380 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures are not undermined. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting, circumventing or weakening end-to-end encryption.
2023/07/28
Committee: LIBE
Amendment 388 #
Proposal for a regulation
Recital 26 a (new)
(26a) Encryption is important to ensure the enjoyment of all human rights offline and online. Moreover, encryption technologies contribute in a fundamental way both to the respect for private life and confidentiality of communications, as well as to innovation and the growth of the digital economy, which relies on the high level of trust and confidence that such technologies provide. In the context of interpersonal communications, end-to-end encryption (‘E2EE’) is a crucial tool for ensuring the confidentiality of electronic communications, as it provides strong technical safeguards against access to the content of the communications by anyone other than the sender and the recipient(s), including by the provider. It should be noted that while E2EE is one of the most commonly used security measures in the context of electronic communications, other technical solutions (e.g., the use of other cryptographic schemes) might be or become equally important to secure and protect the confidentiality of digital communications. Thus, their use should not be prevented, circumvented or weakened either.
2023/07/28
Committee: LIBE
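
Recital 26a describes the defining technical property of end-to-end encryption: only the sender and the intended recipient hold keys capable of opening a message, so the provider relaying it cannot read it. The following minimal sketch illustrates that property with the PyNaCl library; the library choice and variable names are assumptions made for the example, not something referenced by the amendment.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()   # sender's key pair
bob_key = PrivateKey.generate()     # recipient's key pair
eve_key = PrivateKey.generate()     # anyone else, e.g. the relaying provider

# The sender encrypts for the recipient's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"hello, this stays between us")

# The recipient, holding the matching private key, can decrypt...
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"hello, this stays between us"

# ...but a party holding neither private key cannot.
try:
    Box(eve_key, alice_key.public_key).decrypt(ciphertext)
except Exception:
    print("a third party cannot read the message")
```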
Amendment 393 #
Proposal for a regulation
Recital 26 b (new)
(26b) The principle of data protection by design and by default laid down in Article 25 of Regulation (EU) 2016/679 applies to the technologies regulated by the Proposal by virtue of law.
2023/07/28
Committee: LIBE
Amendment 395 #
Proposal for a regulation
Recital 27
(27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to providers detection technologies that they may choose to use, on a free-of-charge basis, for the sole purpose of executing the detection orders addressed to them. The European Data Protection Board should be consulted on the acceptability or otherwise of those technologies and the ways in which they should be best deployed, if at all, in compliance with applicable rules of Union law on the protection of personal data. The authoritative position of the European Data Protection Board should be fully taken into account by the EU Centre when compiling the lists of available technologies and also by the Commission when preparing guidelines regarding the application of the detection obligations. The providers may operate the technologies made available by the EU Centre or by others or technologies that they developed themselves, as long as they meet the requirements of this Regulation and other applicable EU law, such as Regulation 2016/679. These technologies should be independently audited as regards their performance and reliability, and the benchmarks used as well as the results of the independent audit shall be made public.
2023/07/28
Committee: LIBE
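
Recital 27, as amended, concerns detection technologies made available by the EU Centre and their independent auditing. For known material, the core mechanism is matching uploaded content against a list of indicators. The sketch below illustrates only that lookup step and uses a plain SHA-256 digest for simplicity; real deployments rely on perceptual hashes so that re-encoded copies still match, and all names here are hypothetical.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Cryptographic digest used here as a stand-in for a perceptual hash."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical indicator list of known material, as an EU Centre might supply.
known_indicators: set[str] = {
    sha256_hex(b"<placeholder for a known item>"),
}

def matches_known_indicator(uploaded: bytes) -> bool:
    """Return True if the upload matches an entry in the indicator list."""
    return sha256_hex(uploaded) in known_indicators

# Example: an unrelated upload does not match.
print(matches_known_indicator(b"holiday photo"))  # False
```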
Amendment 398 #
Proposal for a regulation
Recital 27 a (new)
(27a) Since the consultation of the EDPB by the EU Center is a new task not foreseen under either Regulation 2016/679, Regulation 2018/1725 or Directive 2016/680, the EDPB budget and staffing should be adapted accordingly. The situation of national authorities, who too will be regularly consulted by service providers, should also reflect their increased responsibilities.
2023/07/28
Committee: LIBE
Amendment 403 #
Proposal for a regulation
Recital 28
(28) With a view to constantly assess the performance of the detection technologies and ensure that they are sufficiently accurate, as well as to identify false positives and false negatives and to avoid erroneous reporting to the EU Centre, law enforcement should ensure adequate human oversight and, where necessary, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular assessment of the rates of false negatives and positives generated by the technologies, based on an analysis of anonymised representative data samples. In particular where the detection of the solicitation of children in publicly available interpersonal communications is concerned, service providers should ensure regular, specific and detailed human oversight and human verification of conversations identified by the technologies as involving potential solicitation of children forwarded by providers.
2023/07/28
Committee: LIBE
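
Recital 28, as amended, requires regular assessment of false positive and false negative rates on anonymised, human-verified samples. A minimal sketch of that bookkeeping, with hypothetical field names, could look as follows.

```python
from dataclasses import dataclass

@dataclass
class SampleItem:
    flagged: bool    # what the detection technology reported
    is_abuse: bool   # ground truth from human verification of the anonymised sample

def error_rates(sample: list[SampleItem]) -> tuple[float, float]:
    """Return (false_positive_rate, false_negative_rate) over the sample."""
    negatives = [s for s in sample if not s.is_abuse]
    positives = [s for s in sample if s.is_abuse]
    fp_rate = sum(s.flagged for s in negatives) / len(negatives) if negatives else 0.0
    fn_rate = sum(not s.flagged for s in positives) / len(positives) if positives else 0.0
    return fp_rate, fn_rate

# Example: one wrong flag among 9 lawful items, one missed item among 3 abusive ones.
sample = [SampleItem(True, True)] * 2 + [SampleItem(False, True)] + \
         [SampleItem(True, False)] + [SampleItem(False, False)] * 8
print(error_rates(sample))  # roughly (0.11, 0.33)
```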
Amendment 410 #
Proposal for a regulation
Recital 29
(29) Providers of hosting services and providers of publicly available interpersonal communications services are uniquely positioned to detect potential online child sexual abuse involving their services. The information providers may obtain when offering their services is often indispensable to effectively investigate and prosecute child sexual abuse offences. Therefore, they should be required to report on potential online child sexual abuse on their services, whenever they become aware of it, that is, when there are reasonable grounds to believe that a particular activity may constitute online child sexual abuse. Where such reasonable grounds exist, doubts about the potential victim’s age should not prevent those providers from submitting reports. In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. However, nothing in this Regulation should be interpreted as providing for a legal basis for the processing of personal data for the sole purpose of detecting online child sexual abuse, or activities conducted on a voluntary basis where a detection order has not been issued. Those providers should report a minimum of information, as specified in this Regulation, for competent law enforcement authorities to be able to assess whether to initiate an investigation, where relevant, and should ensure that the reports are as complete as possible before submitting them.
2023/07/28
Committee: LIBE
Amendment 414 #
Proposal for a regulation
Recital 30
(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences. Users should be notified in any case whenever a report concerning them is submitted by the EU Center to a competent national authority, in order to be able to exercise their right of redress.
2023/07/28
Committee: LIBE
Amendment 419 #
Proposal for a regulation
Recital 32
(32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited. As every country in the world has ratified either the UN Convention on the Rights of the Child or its optional Protocol on Child Pornography, it should always be possible to have those providers remove or disable access to the material. Where problems arise in relation to specific jurisdictions, all possible diplomatic pressure should be brought to bear by the Commission and Member States to remedy the situation.
2023/07/28
Committee: LIBE
Amendment 420 #
Proposal for a regulation
Recital 33
(33) In the interest of consistency, efficiency and effectiveness and to minimise the risk of circumvention, such blocking orders should be based on the list of uniform resource locators, leading to specific items of verified child sexual abuse, compiled and provided centrally by the EU Centre on the basis of diligently verified submissions by the relevant authorities of the Member States. In order to avoid the taking of unjustified or disproportionate measures, especially those that would unduly affect the fundamental rights at stake, notably, in addition to the rights of the children, the users’ freedom of expression and information and the providers’ freedom to conduct a business, appropriate limits and safeguards should be provided for. In particular, it should be ensured that the burdens imposed on the providers of internet access services concerned are not unreasonable, that the need for and proportionality of the blocking orders is diligently assessed also after their issuance and that both the providers and the users affected have effective means of judicial as well as non-judicial redress.deleted
2023/07/28
Committee: LIBE
Amendment 424 #
Proposal for a regulation
Recital 35
(35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims or their approved formal representative should therefore have the right to obtain, upon request, from the EU Centre yet via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported or has been removed by providers of hosting services or providers of publicly available number-independent interpersonal communications services in accordance with this Regulation.
2023/07/28
Committee: LIBE
Amendment 426 #
Proposal for a regulation
Recital 35 a (new)
(35a) As pointed out in the Commission Strategy 1a, children themselves need to have the knowledge and tools that could help them not to be confronted with the abuse when possible and they need to be informed that certain behaviours are not acceptable. The Commission-funded network of Safer Internet Centres raises awareness on online safety and provides information, resources and assistance via helplines and hotlines on a wide range of digital safety topics including grooming and sexting. The One in Five campaign by the Council of Europe and Europol’s “#SayNo” initiative are further examples of how this can be done. When abuse occurs, children need to feel secure and empowered to speak up, react and report, even when the abuse comes from within their circle of trust, as it is often the case. They also need to have access to safe, accessible and age-appropriate channels to report the abuse without fear. As stated in the Recommendation of the UN Committee on the Rights of the Child 1b, state parties should ensure that digital literacy is taught in schools, as part of basic education curricula, from the preschool level and throughout all school years, and that such pedagogies are assessed on the basis of their results. Curricula should include the knowledge and skills to safely handle a wide range of digital tools and resources, including those relating to content, creation, collaboration, participation, socialization and civic engagement. _________________ 1a COM(2020) 607 final, Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions EU strategy for a more effective fight against child sexual abuse 1b CRC/C/GC/25, General comment No. 25 (2021) on children’s rights in relation to the digital environment from UN Committee on the Rights of the Child
2023/07/28
Committee: LIBE
Amendment 427 #
Proposal for a regulation
Recital 36
(36) In order to prevent children falling victim to online abuse, providers for which there is evidence that their service is routinely or systematically used for the purpose of online child sexual abuse should provide reasonable assistance, by putting in place alert and alarm mechanisms in a prominent way on their platforms. The alert mechanism could consist of, for example, linking potential victims to the local services such as helplines, victims’ rights and support organisations or hotlines. They should ensure adequate follow-up, when a report or alert is made, in the language chosen by the user for using their service. Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to receive adequate psycho-social, child friendly and gender-sensitive support and to be assisted by the EU Centre and its relevant partners, such as child helplines or other psycho-social support mechanisms in this regard, via the Coordinating Authorities. Member States should establish and improve the functioning of child helplines and hotlines, including through funding and capacity building, in line with article 96 of Directive (EU) 2018/1972. Victim identification is key not only for tracking down online child sexual abuse but also to prevent victimisation, and to stop further spread of damaging material and to ensure that victims can benefit from available assistance. Such victim identification, however, requires a high degree of specialisation and adequate resources. Therefore the European Cybercrime Centre’s efforts in victim identification should be complemented at national level.
2023/07/28
Committee: LIBE
Amendment 437 #
Proposal for a regulation
Recital 49
(49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders or removal orders that it issued, are effectively complied with, each Coordinating Authority should be able to carry out relevant searches.
2023/07/28
Committee: LIBE
Amendment 442 #
Proposal for a regulation
Recital 50
(50) With a view to ensuring that providers of hosting services are aware of the misuse made of their services and to require them to take expeditious action to remove or disable access, Coordinating Authorities of establishment should be able to notify those providers of the presence of known child sexual abuse material on their services and order removal or disabling of access thereof. Such notifying activities should be clearly distinguished from the Coordinating Authorities’ powers under this Regulation to request the issuance of removal orders, which impose on the provider concerned a binding legal obligation to remove or disable access to the material in question within a set time period.
2023/07/28
Committee: LIBE
Amendment 445 #
Proposal for a regulation
Recital 55
(55) It is essential for the proper functioning of the system of mandatory detection and blocking of online child sexual abuse set up by this Regulation that the EU Centre receives, via the Coordinating Authorities, material identified as constituting child sexual abuse material or transcripts of conversations identified as constituting the solicitation of children, such as may have been found for example during criminal investigations, so that that material or conversations can serve as an accurate and reliable basis for the EU Centre to generate indicators of such abuses. In order to achieve that result, the identification should be made after a diligent assessment, conducted in the context of a procedure that guarantees a fair and objective outcome, either by the Coordinating Authorities themselves or by a court or another independent administrative authority than the Coordinating Authority. Whilst the swift assessment, identification and submission of such material is important also in other contexts, it is crucial in connection to new child sexual abuse material and the solicitation of children reported under this Regulation, considering that this material can lead to the identification of ongoing or imminent abuse and the rescuing of victims. Therefore, specific time limits should be set in connection to such reporting.
2023/07/28
Committee: LIBE
Amendment 447 #
Proposal for a regulation
Recital 55 a (new)
(55a) All communications containing illegal material should be encrypted to state of the art standards, all access by staff to such content should be limited to what is necessary and thoroughly logged. All such logs should be stored for a minimum of ten years.
2023/07/28
Committee: LIBE
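
Recital 55a (new) combines two handling rules: material is stored encrypted to state-of-the-art standards, and every staff access is logged, with logs retained for at least ten years. The sketch below illustrates both steps using the 'cryptography' package; the function names and log format are assumptions made for the example, not a prescribed implementation.

```python
import json
import time
from cryptography.fernet import Fernet  # pip install cryptography

storage_key = Fernet.generate_key()  # in practice held in an HSM or key vault
vault = Fernet(storage_key)

def store_item(item_id: str, content: bytes, store: dict[str, bytes]) -> None:
    """Keep the item encrypted at rest."""
    store[item_id] = vault.encrypt(content)

def access_item(item_id: str, staff_id: str, reason: str,
                store: dict[str, bytes], log_path: str) -> bytes:
    """Decrypt an item for authorised staff and record who accessed it, when and why.
    The resulting log file would need to be retained for at least ten years."""
    entry = {"ts": time.time(), "item": item_id, "staff": staff_id, "reason": reason}
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
    return vault.decrypt(store[item_id])
```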
Amendment 449 #
Proposal for a regulation
Recital 56
(56) With a view to ensuring that the indicators generated by the EU Centre for the purpose of detection are as complete as possible, the submission of relevant material and transcripts should be done proactively by the Coordinating Authorities. However, the EU Centre should also be allowed to bring certain material or conversations to the attention of the Coordinating Authorities for those purposes.
2023/07/28
Committee: LIBE
Amendment 453 #
Proposal for a regulation
Recital 58
(58) In particular, in order to facilitate the cooperation needed for the proper functioning of the mechanisms set up by this Regulation, the EU Centre should establish and maintain the necessary secure information-sharing systems. When establishing and maintaining such systems, the EU Centre should cooperate with the European Union Agency for Law Enforcement Cooperation (‘Europol’) and national authorities to build on existing systems and best practices, where relevant.
2023/07/28
Committee: LIBE
Amendment 454 #
Proposal for a regulation
Recital 59
(59) To support the implementation of this Regulation and contribute to the achievement of its objectives, the EU Centre should serve as a central facilitator, carrying out a range of specific tasks. The performance of those tasks requires strong guarantees of independence, in particular from law enforcement authorities, including Europol, as well as a governance structure ensuring the effective, efficient and coherent performance of its different tasks, and legal personality to be able to interact effectively with all relevant stakeholders. Therefore, it should be established as a decentralised Union agency.
2023/07/28
Committee: LIBE
Amendment 457 #
Proposal for a regulation
Recital 60
(60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services. However, for that same reason, the EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of hosting services, the provision of assistance to Coordinating Authorities, as well as the generation and sharing of knowledge and expertise related to online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 466 #
Proposal for a regulation
Recital 65
(65) In order to avoid erroneous reporting of online child sexual abuse under this Regulation and to allow law enforcement authorities to focus on their core investigatory tasks without receiving an overwhelming quantity of false positives, reports should pass through the EU Centre. The EU Centre should thoroughly assess those reports in order to identify those that are manifestly unfounded, that is, where it is evident, including after substantive legal and factual analysis, that the reported activities do not constitute online child sexual abuse. Where the report is manifestly unfounded, the EU Centre should provide feedback to the reporting provider of hosting services or provider of publicly available interpersonal communications services in order to allow for improvements in the technologies and processes used and for other appropriate steps, such as reinstating material wrongly removed. As every report could be an important means to investigate and prosecute the child sexual abuse offences concerned and to rescue the victim of the abuse, reports should be processed as quickly as possible.
2023/07/28
Committee: LIBE
Amendment 469 #
Proposal for a regulation
Recital 66
(66) With a view to contributing to the effective application of this Regulation and the protection of victims’ rights, the EU Centre should be able, upon request, to support victims and to assist Competent Authorities by conducting searches of hosting services for the dissemination of known child sexual abuse material that is publicly accessible, using the corresponding indicators. Where it identifies such material after having conducted such a search, the EU Centre should also be able to send the provider of the hosting service concerned a notice of this manifestly illegal content.
2023/07/28
Committee: LIBE
Amendment 473 #
Proposal for a regulation
Recital 68
(68) Processing and storing certain personal data is necessary for the performance of the EU Centre’s tasks under this Regulation. In order to ensure that such personal data is adequately protected, the EU Centre should only process and store personal data if strictly necessary for the purposes detailed in this Regulation. It should do so in a secure manner, use state of the art encryption, and limit storage to what is strictly necessary for the performance of the relevant tasks. It should ensure adequate protection of its infrastructure and implement facilities access control, storage control, user control, control of data entry, data access control, communication control, input control, transport control, personnel profiles procedures, incident and recovery procedures, and ensure the reliability and integrity of its databases.
2023/07/28
Committee: LIBE
Amendment 480 #
Proposal for a regulation
Recital 72
(72) Considering the need for the EU Centre to cooperate intensively with Europol, the EU Centre’s headquarters should be located alongside Europol’s, which is located in The Hague, the Netherlands. The highly sensitive nature of the reports shared with Europol by the EU Centre and the technical requirements, such as on secure data connections, both benefit from a shared location between the EU Centre and Europol. It would also allow the EU Centre, while being an independent entity, to rely on the support services of Europol, notably those regarding human resources management, information technology (IT), including cybersecurity, the building and communications. Sharing such support services is more cost-efficient and ensure a more professional service than duplicating them by creating them anew.deleted
2023/07/28
Committee: LIBE
Amendment 490 #
Proposal for a regulation
Recital 75
(75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and other relevant national authorities of the Member State that designated the Coordinating Authority in question in gathering that information.
2023/07/28
Committee: LIBE
Amendment 492 #
Proposal for a regulation
Recital 77
(77) The evaluation should be based on the criteria of efficiency, necessity, effectiveness, proportionality, relevance, coherence and Union added value. It should assess the functioning of the different operational and technical measures provided for by this Regulation, including the effectiveness of measures to enhance the detection, reporting and removal of online child sexual abuse, the effectiveness of safeguard mechanisms as well as the impacts on potentially affected fundamental rights, the freedom to conduct a business, the right to private life and the protection of personal data. The Commission should also assess the impact on potentially affected interests of third parties.
2023/07/28
Committee: LIBE
Amendment 494 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse, in order to contribute to the proper functioning of the internal market and to create a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter are effectively protected.
2023/07/28
Committee: LIBE
Amendment 500 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of publicly available number-independent interpersonal communication services to detect and report online child sexual abuse in specific cases;
2023/07/28
Committee: LIBE
Amendment 506 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point c
(c) obligations on providers of hosting services to remove or disable access to child sexual abuse material on their services;
2023/07/28
Committee: LIBE
Amendment 512 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d
(d) obligations on providers of internet access services to disable access to child sexual abuse material;deleted
2023/07/28
Committee: LIBE
Amendment 519 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point e a (new)
(ea) Obligations on providers of online games.
2023/07/28
Committee: LIBE
Amendment 520 #
Proposal for a regulation
Article 1 – paragraph 2 a (new)
2a. This Regulation shall only apply to services normally provided for remuneration.
2023/07/28
Committee: LIBE
Amendment 521 #
Proposal for a regulation
Article 1 – paragraph 2 b (new)
2b. This Regulation does not apply to audio communications.
2023/07/28
Committee: LIBE
Amendment 522 #
Proposal for a regulation
Article 1 – paragraph 3 – point b
(b) Directive 2000/31/EC and Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC]2022/2065 (Digital Services Act);
2023/07/28
Committee: LIBE
Amendment 534 #
Proposal for a regulation
Article 1 – paragraph 3 b (new)
3b. This Regulation shall not undermine the prohibition of general monitoring under Union law or introduce general data retention obligations, or be interpreted in that way.
2023/07/28
Committee: LIBE
Amendment 544 #
Proposal for a regulation
Article 2 – paragraph 1 – point a
(a) ‘hosting service’ means an information society hosting service as defined in Article 23, point (fg), third indent, of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC]2022/2065 (Digital Services Act);
2023/07/28
Committee: LIBE
Amendment 547 #
Proposal for a regulation
Article 2 – paragraph 1 – point b
(b) ‘number-independent interpersonal communications service’ means a publicly available service as defined in Article 2, point 57, of Directive (EU) 2018/1972, including services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service;
2023/07/28
Committee: LIBE
Amendment 548 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(ba) ‘number-independent interpersonal communications service within games’ means any service defined in Article 2, point 7 of Directive (EU) 2018/1972 which is part of a game;
2023/07/28
Committee: LIBE
Amendment 550 #
Proposal for a regulation
Article 2 – paragraph 1 – point d
(d) ‘software application store’ means a service as defined in Article 2, point 12, of Regulation (EU) …/… [on contestable and fair markets in the digital sector (Digital Markets Act)];deleted
2023/07/28
Committee: LIBE
Amendment 552 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
(e) ‘internet access service’ means a service as defined in Article 2(2), point 2, of Regulation (EU) 2015/2120 of the European Parliament and of the Council49; _________________ 49 Regulation (EU) 2015/2120 of the European Parliament and of the Council of 25 November 2015 laying down measures concerning open internet access and amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks and services and Regulation (EU) No 531/2012 on roaming on public mobile communications networks within the Union (OJ L 310, 26.11.2015, p. 1– 18).deleted
2023/07/28
Committee: LIBE
Amendment 558 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point ii
(ii) an publicly available number- independent interpersonal communications service;
2023/07/28
Committee: LIBE
Amendment 565 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iii
(iii) a software applications store;deleted
2023/07/28
Committee: LIBE
Amendment 566 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iii a (new)
(iiia) online games;
2023/07/28
Committee: LIBE
Amendment 571 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous reports from the public about alleged child sexual abuse material and online child sexual exploitation, which is officially recognised by the Member State of establishment as expressed in Directive 2011/93/EU and its articles of association mention the mission of combatting child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 572 #
Proposal for a regulation
Article 2 – paragraph 1 – point h b (new)
(hb) ‘help-line’ means an organisation providing services for children in need as recognised by the Member State of establishment in line with Directive 2011/93/EU;
2023/07/28
Committee: LIBE
Amendment 580 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 17 yearschild;
2023/07/28
Committee: LIBE
Amendment 611 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess, for each such service that they offer, the riskpublicly available number- independent interpersonal communications services that are exposed to a substantial amount of online child sexual abuse shall identify, analyse and assess, for each such service that they offer, risks stemming from the design, functioning, including algorithmic recommender systems, ofr use of the service for the purpose of online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 615 #
Proposal for a regulation
Article 3 – paragraph 1 – subparagraph 1 (new)
A hosting service provider or publicly available number-independent interpersonal communication service is exposed to online child sexual abuse where:
2023/07/28
Committee: LIBE
Amendment 616 #
Proposal for a regulation
Article 3 – paragraph 1 – point a (new)
(a) the coordinating authority of the Member State of its main establishment or where its legal representative resides or is established has taken a decision, on the basis of objective factors, such as the provider having received two or more final removal orders in the previous 12 months, finding that the provider is exposed to online child sexual abuse, and notified the decision to the provider; or
2023/07/28
Committee: LIBE
Amendment 617 #
Proposal for a regulation
Article 3 – paragraph 1 – point b (new)
(b) the provider submitted two or more reports of potential online child sexual abuse in the previous 12 months in accordance with Article 12.
2023/07/28
Committee: LIBE
Amendment 623 #
Proposal for a regulation
Article 3 – paragraph 2 – point a a (new)
(aa) any actual or foreseeable negative effects for the exercise of fundamental rights;
2023/07/28
Committee: LIBE
Amendment 626 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability and effectiveness of functionalities to address the risk referred to in paragraph 1, including through the following:
2023/07/28
Committee: LIBE
Amendment 639 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
– functionalities enabling age verificationthe effective protection of children online, in line with children’s increasing need for autonomy and increasing rights to access to information and freedom of expression as they grow;
2023/07/28
Committee: LIBE
Amendment 656 #
Proposal for a regulation
Article 3 – paragraph 2 – point c
(c) the manner in which users use the service and the impact thereof on that risk;deleted
2023/07/28
Committee: LIBE
Amendment 659 #
Proposal for a regulation
Article 3 – paragraph 2 – point d
(d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, the design of their recommender systems and any other relevant algorithmic systems and the impact thereof on that risk;
2023/07/28
Committee: LIBE
Amendment 669 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point ii
(ii) where the service is used or likely to be used by children, the different age groups or likely age groups of the child users and the riskelative scale, frequency and nature of previously identified instances of use of its services for the purpose of solicitation of children in relation to those age groups;
2023/07/28
Committee: LIBE
Amendment 675 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 1
– enabling users to search for other users, including through search engines external to the service, and, in particular, for adult users to search for child users;
2023/07/28
Committee: LIBE
Amendment 681 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 2
– enabling users to establishinitiate unsolicited contact with other users directly, in particular through private communications;
2023/07/28
Committee: LIBE
Amendment 684 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 3
– enabling users to share unsolicited images or videos with other users, in particular through private communications.
2023/07/28
Committee: LIBE
Amendment 692 #
Proposal for a regulation
Article 3 – paragraph 2 – subparagraph 1 (new)
Risk assessment obligations shall never entail a general monitoring obligation, an obligation to seek knowledge about the content of private communications, nor an obligation for providers to seek knowledge of illegal content.
2023/07/28
Committee: LIBE
Amendment 697 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 1
The provider may request the EU Centre to perform an analysis of representative, anonymized data samples to identify potential online child sexual abuse,methodology for risk assessment in order to support the risk assessment.
2023/07/28
Committee: LIBE
Amendment 700 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 2
The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise, provided the request is. The Centre may reject the request where it is not reasonably necessary to support the risk assessment.
2023/07/28
Committee: LIBE
Amendment 702 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 3
The Commission shall be empowered to adopt delegated acts in accordance with Article 86 in order to supplement this Regulation with the necessary detailed rules on the determination and charging of those costs and the application of the exemption for micro, small and medium- sized enterprises.
2023/07/28
Committee: LIBE
Amendment 709 #
Proposal for a regulation
Article 3 – paragraph 4 – subparagraph 2 – point a
(a) for a service which is subject to a detection order issued in accordance with Article 7, the provider shall update the risk assessment at the latest two months beforeafter the expiry of the period of application of the detection order;
2023/07/28
Committee: LIBE
Amendment 710 #
Proposal for a regulation
Article 3 – paragraph 4 – subparagraph 2 – point b
(b) the Coordinating Authority of establishment may require the provider to update the risk assessment at a reasonable earlier date than the date referred to in the second subparagraph, where there is evidence indicating a possible substantial change in the risk that the service is used for the purpose of online child sexual abuse.deleted
2023/07/28
Committee: LIBE
Amendment 716 #
Proposal for a regulation
Article 3 – paragraph 5
5. The risk assessment shall include an assessment of any potential remainingreasonably foreseeable remaining systemic and serious risk that, after taking the mitigation measures pursuant to Article 4, the service is used for the purpose of online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 719 #
Proposal for a regulation
Article 3 – paragraph 6
6. The Commission, in cooperation with Coordinating Authorities, and the EU Centre, after having consulted the European Data Protection Board and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used..
2023/07/28
Committee: LIBE
Amendment 725 #
Proposal for a regulation
Article 4 – title
Risk mitigationSpecific measures
2023/07/28
Committee: LIBE
Amendment 730 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services sthall take reasonable mitigationt are exposed to substantial amount of child sexual abuse material shall take proportionate and effective specific measures, tailored to the serious systemic risk identified pursuant to Article 3, to minimise that risk. . The decision as to the choice of specific measures shall remain with the hosting service provider. Such measures shall include some or all of the following:
2023/07/28
Committee: LIBE
Amendment 738 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision- making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditionsin order to expeditiously remove or disable access to child sexual abuse material;
2023/07/28
Committee: LIBE
Amendment 743 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) providing easily accessible and user-friendly mechanisms for users to report or flag to the provider alleged online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 745 #
Proposal for a regulation
Article 4 – paragraph 1 – point a b (new)
(ab) providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety , and that are set to the most private and secure levels by default;
2023/07/28
Committee: LIBE
Amendment 748 #
Proposal for a regulation
Article 4 – paragraph 1 – point a c (new)
(ac) ask for user confirmation before allowing an unknown user to communicate and before displaying their communications;
2023/07/28
Committee: LIBE
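The specific measure proposed in Amendment 748 above amounts to a simple gating rule: messages from a sender the recipient has not yet approved are withheld until the recipient confirms the contact. The following Python sketch is a purely illustrative reading of that flow; the class, the method names and the in-memory storage are hypothetical and are not taken from the proposal.

    from collections import defaultdict

    class ContactGate:
        """Holds messages from unknown senders until the recipient approves the contact."""

        def __init__(self):
            self.approved = defaultdict(set)   # recipient -> senders the recipient has confirmed
            self.pending = defaultdict(list)   # recipient -> [(sender, message)] awaiting confirmation

        def deliver(self, sender: str, recipient: str, message: str) -> bool:
            """Deliver immediately if the sender is approved; otherwise queue the message."""
            if sender in self.approved[recipient]:
                return True                    # displayed to the recipient right away
            self.pending[recipient].append((sender, message))
            return False                       # withheld until the recipient confirms

        def confirm(self, recipient: str, sender: str) -> list[str]:
            """Recipient accepts the contact; release any withheld messages from that sender."""
            self.approved[recipient].add(sender)
            released = [m for s, m in self.pending[recipient] if s == sender]
            self.pending[recipient] = [(s, m) for s, m in self.pending[recipient] if s != sender]
            return released

A real service would persist these decisions and surface the pending request in its interface; the point of the sketch is only that nothing from an unknown sender is displayed before an explicit confirmation.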
Amendment 749 #
Proposal for a regulation
Article 4 – paragraph 1 – point a d (new)
(ad) optionally or by default ask for user confirmation and offer guidance before displaying or sharing certain content such as nudity where the provider ensures that no indication of the process and the content leaves the user’s device and the user is reassured of this;
2023/07/28
Committee: LIBE
Amendment 750 #
Proposal for a regulation
Article 4 – paragraph 1 – point a e (new)
(ae) providing tools in a prominent way on their platform that allow users to seek help from their local help-line;
2023/07/28
Committee: LIBE
Amendment 751 #
Proposal for a regulation
Article 4 – paragraph 1 – point a f (new)
(af) informing and reminding users and non-users, such as parents, at point of need on what constitutes online child sexual abuse and what is typical offender behaviour; offering advice on safe behaviour and the consequences of illegal behaviour in a visible, easy to find and easy to understand way;
2023/07/28
Committee: LIBE
Amendment 752 #
Proposal for a regulation
Article 4 – paragraph 1 – point a g (new)
(ag) informing users and non-users about external resources and services in the user’s region on preventing child sexual abuse, counselling by helplines, victim support and educational resources by hotlines and child protection organisation;
2023/07/28
Committee: LIBE
Amendment 753 #
Proposal for a regulation
Article 4 – paragraph 1 – point a h (new)
(ah) human moderation of publicly accessible chats, based on random checks, and human moderation of publicly accessible, specific channels at high risk of online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 754 #
Proposal for a regulation
Article 4 – paragraph 1 – point a i (new)
(ai) providing readily accessible mechanisms for users to block or mute other users;
2023/07/28
Committee: LIBE
Amendment 755 #
Proposal for a regulation
Article 4 – paragraph 1 – point a j (new)
(aj) displaying warnings and advice to users at risk of offending or victimisation where the provider ensures that no indication of the process and the content leaves the user's device and the user is reassured of this;
2023/07/28
Committee: LIBE
Amendment 756 #
Proposal for a regulation
Article 4 – paragraph 1 – point a k (new)
(ak) informing parents on the nature of the service and the functionalities offered as well as on how to report or flag to the provider alleged online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 757 #
Proposal for a regulation
Article 4 – paragraph 1 – point a l (new)
(al) any other mechanisms to increase the awareness of online child sexual abuse on its services;
2023/07/28
Committee: LIBE
Amendment 762 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of number-independent interpersonal communication services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] .
2023/07/28
Committee: LIBE
Amendment 781 #
Proposal for a regulation
Article 4 – paragraph 2 – introductory part
2. The mitigationspecific measures shall bemeet all of the following requirements:
2023/07/28
Committee: LIBE
Amendment 782 #
Proposal for a regulation
Article 4 – paragraph 2 – point a
(a) effectivthey shall be effective and proportionate in mitigating the identified serious risk;
2023/07/28
Committee: LIBE
Amendment 787 #
Proposal for a regulation
Article 4 – paragraph 2 – point b
(b) they shall be targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk as well as the provider’s financial and technologic, any impact on the functionality of the service as well as the provider’s financial strength, and technical and operational capabilities and, the number of users, and the amount of content they provide;
2023/07/28
Committee: LIBE
Amendment 794 #
Proposal for a regulation
Article 4 – paragraph 2 – point c
(c) they shall be applied in a diligent and non- discriminatory manner, having due regard, in all circumstances, to the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected;
2023/07/28
Committee: LIBE
Amendment 797 #
Proposal for a regulation
Article 4 – paragraph 2 – point d
(d) they shall be introduced, reviewed, discontinued or expanded, as appropriate, each time the risk assessment is conducted or updated pursuant to Article 3(4), within three months from the date referred to therein.
2023/07/28
Committee: LIBE
Amendment 798 #
Proposal for a regulation
Article 4 – paragraph 2 – point d a (new)
(da) they shall respect the principles of data protection by design and by default, as well as of data minimisation.
2023/07/28
Committee: LIBE
Amendment 800 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures.deleted
2023/07/28
Committee: LIBE
Amendment 811 #
Proposal for a regulation
Article 4 – paragraph 3 a (new)
3a. Any requirement to take specific measures shall be without prejudice to Article 8 of Regulation (EU) 2022/2065 [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] and shall entail neither a general obligation for hosting services providers to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity. Any requirement to take specific measures shall not include an obligation to use ex-ante control measures based on automated tools or upload-filtering of information, to interfere with the secrecy of communications or to restrict the possibility to use a service anonymously.
2023/07/28
Committee: LIBE
Amendment 813 #
Proposal for a regulation
Article 4 – paragraph 3 b (new)
3b. Nothing in this regulation shall be construed as prohibiting, restricting, circumventing or undermining the provision or the use of encrypted services.
2023/07/28
Committee: LIBE
Amendment 815 #
Proposal for a regulation
Article 4 – paragraph 4
4. PWhere appropriate, providers of hosting services and providers of number- independent interpersonal communications services shall clearly describe in their terms and conditionsof service the mitigation measures that they have taken. That description shall not include information that mayis likely to reduce the effectiveness of the mitigation measures.
2023/07/28
Committee: LIBE
Amendment 821 #
Proposal for a regulation
Article 4 – paragraph 5
5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 42, having due regard in particular to relevant technological developments and in the manners in which the services covered by those provisions are offered and used.
2023/07/28
Committee: LIBE
Amendment 827 #
Proposal for a regulation
Article 4 a (new)
Article 4a
Specific measures for platforms primarily used for the dissemination of pornographic content
Where an online platform is primarily used for the dissemination of user generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure
a. user-friendly reporting mechanisms to report alleged child sexual abuse material;
b. adequate professional human content moderation to rapidly process notices of alleged child sexual abuse material;
c. automatic mechanisms and interface design elements to inform users about external resources in the user’s region on preventing child sexual abuse, counselling by specialist helplines, victim support and educational resources by hotlines and child protection organisations;
d. automatic detection of searches for child sexual abuse material, warning and advice alerts displayed to users doing such searches, and flagging of the search and the user for human moderation;
2023/07/28
Committee: LIBE
Amendment 828 #
Proposal for a regulation
Article 4 b (new)
Article 4b
Specific measures for number-independent interpersonal communications service within games
Providers of online games that operate a number-independent interpersonal communications service within their games, and which are exposed to a substantial amount of online child sexual abuse, shall take all of the following specific measures in addition to the requirements referred to in Article 4:
1. prevent users from initiating unsolicited contact with other users;
2. facilitate user-friendly reporting of alleged child sexual abuse material;
3. provide technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety and that are set to the most private and secure levels by default;
4. provide tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line.
2023/07/28
Committee: LIBE
Amendment 830 #
Proposal for a regulation
Article 5 – title
Risk reporting and oversight
2023/07/28
Committee: LIBE
Amendment 831 #
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number-independent interpersonal communications services to which Article 3 applies shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
2023/07/28
Committee: LIBE
Amendment 837 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) the process and the results of the risk assessment conducted or updated pursuant to Article 3, including the assessment of any potential remainingreasonably foreseeable remaining systemic serious risk referred to in Article 3(5);
2023/07/28
Committee: LIBE
Amendment 839 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) any mitigationspecific measures taken pursuant to Articles 4, 4a and 4b.
2023/07/28
Committee: LIBE
Amendment 840 #
Proposal for a regulation
Article 5 – paragraph 2
2. Within three months after receiving the report, the Coordinating Authority of establishment shall assess it and determine, on that basis and taking into account any other relevant information available to it, whether the risk assessment has been carried out or updated and the mitigation measurespecific measures and implementation plans have been taken in accordance with the requirements of Articles 3 and 4.
2023/07/28
Committee: LIBE
Amendment 842 #
Proposal for a regulation
Article 5 – paragraph 3 – subparagraph 1
Where necessary for that assessment, that Coordinating Authority may require further information from the provider, within a reasonable time period set by that Coordinating Authority. That time period shall not be longer than two weeks.to be provided without undue delay,
2023/07/28
Committee: LIBE
Amendment 843 #
Proposal for a regulation
Article 5 – paragraph 3 – subparagraph 2
The time period referred to in the first subparagraph shall be suspended until that additional information is providdeleted.
2023/07/28
Committee: LIBE
Amendment 844 #
Proposal for a regulation
Article 5 – paragraph 4
4. Without prejudice to Articles 7 and 27 to 29, where the requirements of Articles 3 and 4 have not been met, that Coordinating Authority shall require the provider to re-conduct ormake specific updates to the risk assessment or to introduce, review, discontinue or expand, as applicable, the mitigation measurestake the necessary measures so as to ensure that Articles 3 and 4 are complied with, within a reasonable time period set by that Coordinating Authority. That time period shall not be longer than one month. The provider may choose the type of specific measures to take.
2023/07/28
Committee: LIBE
Amendment 846 #
Proposal for a regulation
Article 5 – paragraph 4 a (new)
4a. The provider may, at any time, request the competent Coordinating authority to review and, where appropriate, amend or revoke a decision as referred to in paragraph 4. The authority shall, within three months of receipt of the request, adopt a reasoned decision on the request based on objective factors and notify the provider of that decision.
2023/07/28
Committee: LIBE
Amendment 849 #
Proposal for a regulation
Article 5 – paragraph 6
6. Providers shall, upon request, transmit the report to the providers of software application stores, insofar as necessary for the assessment referred to in Article 6(2). Where necessary, they may remove confidential information from the reports.deleted
2023/07/28
Committee: LIBE
Amendment 852 #
Proposal for a regulation
Article 6
Obligations for software application stores
1. Providers of software application stores shall:
(a) make reasonable efforts to assess, where possible together with the providers of software applications, whether each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children;
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).
2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3.
3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures.
4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
deleted
2023/07/28
Committee: LIBE
Amendment 889 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of publicly available number-independent interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific servicematerial in images or videos contained in the uploads or communications of one or more specific users of that service, where there is reasonable suspicion of child sexual abuse offences committed by these users.
2023/07/28
Committee: LIBE
Amendment 897 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
1a. Interpersonal communications to which end to end encryption is, has been or will be applied, shall not be subject to the measures specified in Article 10.
2023/07/28
Committee: LIBE
Amendment 902 #
Proposal for a regulation
Article 7 – paragraph 2 – subparagraph 1
The Coordinating Authority of establishment shall, before requesting the issuance of a detection order, carry out the investigations and assessments necessary to determine whether the conditions of paragraph 4 have been met.
2023/07/28
Committee: LIBE
Amendment 908 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – introductory part
Where the Coordinating Authority of establishment takes the preliminary view that the conditions of paragraph 4 have been met, it shall:
2023/07/28
Committee: LIBE
Amendment 911 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point a
(a) establish a draft request for the issuance of a detection order, specifying the factual and legal grounds upon which the request is based, the main elements of the content of the detection order it intends to request and the reasons for requesting it;
2023/07/28
Committee: LIBE
Amendment 916 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point c
(c) afford the provider an opportunity to comment on the draft request, within a reasonable time period set by that Coordinating Authority;deleted
2023/07/28
Committee: LIBE
Amendment 924 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – introductory part
Where, having regard to the comments of the provider and the opinion of the EU Centre, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall re-submit the draft request, adjusted where appropriate, to the provider. In that case,quest the judicial validation of the detection order from the competent judicial authority responsible for the issuing of such orders pursuant to paragraph 4. Upon receipt of judicial validation of the order, the Coordinating Authority shall submit the order, adjusted where appropriate, to the provider. Prior to requesting the judicial validation of the detection order, the Coordinating Authority shall request the provider shallto do all of the following, within a reasonable time period set by that Coordinating Authority:
2023/07/28
Committee: LIBE
Amendment 929 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point a
(a) draft an implementation plan setting out the specific person or persons the authority intends to investigate, the measures it envisages taking to execute the intended detection order, including detailed information regarding the envisaged technologies and safeguards;
2023/07/28
Committee: LIBE
Amendment 933 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point b
(b) where the draft implementation plan concerns an intended detection order concerning the solicitation of childrenreasonable suspicion that the dissemination of child sexual abuse material is conducted by one or more specific users , and where other than the renewal of a previously issued detection order without any substantive changes, conduct a data protection impact assessment and a prior consultation procedure as referred to in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to the measures set out in the implementation plan;
2023/07/28
Committee: LIBE
Amendment 939 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point c
(c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary, in view of the outcome of the data protection impact assessment and in order to take intoutmost account of the opinion of the data protection authority provided in response to the prior consultation;
2023/07/28
Committee: LIBE
Amendment 942 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point d
(d) submit to that Coordinating Authority the implementation plan, where applicable attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted in viewto take full account of the outcome of the data protection impact assessment and of that opinion.
2023/07/28
Committee: LIBE
Amendment 947 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 3
Where, having regard to the implementation plan of the provider and having utmost regard to the opinion of the data protection authority, that Coordinating Authority continues to be ofis the view that the conditions of paragraph 4 have met, it shall submit the request for the validation and issuance of the detection, adjusted where appropriate, to the competent judicial authority or independent administrative authoritye. It shall attach the implementation plan of the provider and the opinions of the EU Centre and the data protection authority to that request.
2023/07/28
Committee: LIBE
Amendment 954 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – introductory part
TBased on a reasoned justification, the Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority or independent administrative authority shall issue the detection order where it considers that the following conditions are met:
2023/07/28
Committee: LIBE
Amendment 958 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a
(a) there is evidence of a significant riskamounting to a reasonable suspicion of the service being used for the purpose of online child sexual abusedisseminating or receiving child sexual abuse material by one or more users, within the meaning of paragraphs 5, 6 and 7, as applicable;
2023/07/28
Committee: LIBE
Amendment 965 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b
(b) the reasons for issuing the detection order is necessary and proportionate and outweighs negative consequences for the rights and legitimate interests of all parties affected, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties.
2023/07/28
Committee: LIBE
Amendment 974 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2
When assessing whether the conditions of the first subparagraph have been met, account shall be taken of all relevant facts and circumstances of the case at hand, in particular: (a) the risk assessment conducted or updated and any mitigation measures taken by the provider pursuant to Articles 3 and 4, including any mitigation measures introduced, reviewed, discontinued or expanded pursuant to Article 5(4) where applicable; (b) any additional information obtained pursuant to paragraph 2 or any other relevant information available to it, in particular regarding the use, design and operation of the service, regarding the provider’s financial and technological capabilities and size and regarding the potential consequences of the measures to be taken to execute the detection order for all other parties affected; (c) the views and the implementation plan of the provider submitted in accordance with paragraph 3; (d) the opinions of the EU Centre and of the data protection authority submitted in accordance with paragraph 3.deleted
2023/07/28
Committee: LIBE
Amendment 988 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 3
As regards the second subparagraph, point (d), where that Coordinating Authority substantially deviates from the opinion of the EU Centre, it shall inform the EU Centre and the Commission thereof, specifying the points at which it deviated and the main reasons for the deviation.deleted
2023/07/28
Committee: LIBE
Amendment 994 #
Proposal for a regulation
Article 7 – paragraph 5
5. As regards detection orders concerning the dissemination of known child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) it is likely, despite any mitigation measures that the provider may have taken or will take, that the service is used, to an appreciable extent for the dissemination of known child sexual abuse material; (b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent for the dissemination of known child sexual abuse material.deleted
2023/07/28
Committee: LIBE
Amendment 1003 #
Proposal for a regulation
Article 7 – paragraph 6
6. As regards detection orders concerning the dissemination of new child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the dissemination of new child sexual abuse material; (b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the dissemination of new child sexual abuse material; (c) for services other than those enabling the live transmission of pornographic performances as defined in Article 2, point (e), of Directive 2011/93/EU: (1) a detection order concerning the dissemination of known child sexual abuse material has been issued in respect of the service; (2) the provider submitted a significant number of reports concerning known child sexual abuse material, detected through the measures taken to execute the detection order referred to in point (1), pursuant to Article 12.deleted
2023/07/28
Committee: LIBE
Amendment 1007 #
Proposal for a regulation
Article 7 – paragraph 7
7. As regards detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) the provider qualifies as a provider of interpersonal communication services; (b) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the solicitation of children; (c) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the solicitation of children. The detection orders concerning the solicitation of children shall apply only to interpersonal communications where one of the users is a child user.deleted
2023/07/28
Committee: LIBE
Amendment 1020 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the judicial validation and issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary to effectively address the significant risk referred to in point (a) thereofand proportionate to obtain the information required to effectively investigate the case, and collect the information required to assess the existence of a criminal offence.
2023/07/28
Committee: LIBE
Amendment 1027 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 2
To that aimend, they shall take into account all relevant parameters, including the availability of sufficiently reliable detection technologies in that they limit to the maximum extent possible the rate of errors regarding the detection and their suitability and effectiveness for achieving the objectives of this Regulation, including their likelihood to inaccurately detect lawful speech as illegal content, as well as the impact of the measures on the rights of the users affected, and require the taking of the least intrusive measures, in accordance with Article 10, from among several equally effective measures. Methods used to detect child sexual abuse material shall be able to distinguish between lawful and unlawful content without the need for any independent human assessment.
2023/07/28
Committee: LIBE
Amendment 1032 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 – point a
(a) where that riske suspicion is limited to an identifiabl discrete part or component of a service, the required measures are only applied in respect of the uploads and communications of the suspects via that part or component;
2023/07/28
Committee: LIBE
Amendment 1033 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 – point b
(b) where necessary, in particular to limit such negative consequences, effective and proportionate safeguards additional to those listed in Article 10(4), (5) and (65) are provided for;
2023/07/28
Committee: LIBE
Amendment 1038 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 1
The competent judicial authority or independent administrative authority shall specify in the detection order the period during which it applies, indicating the start date and the end date.
2023/07/28
Committee: LIBE
Amendment 1039 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 2
The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the detection order. It shall not be earlier than three months from the date at which the provider received the detection order and not be later than 12 months from that date.
2023/07/28
Committee: LIBE
Amendment 1042 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3
The period of application of detection orders concerning the dissemination of known or new child sexual abuse material shall not exceed 24 months and that of detection orders concerning the solicitation of children shall not exceed 12 monthsshall be proportionate, taking all relevant factors into account.
2023/07/28
Committee: LIBE
Amendment 1062 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the detection orders referred to in Article 7 using the template set out in Annex I. Detection orders shall include:
2023/07/28
Committee: LIBE
Amendment 1064 #
Proposal for a regulation
Article 8 – paragraph 1 – point a
(a) information regarding the specific measures to be taken to execute the detection order, including the specific person or specific persons the detection must concern, the temporal scope, indicators to be used and the safeguards to be provided for, including the reporting requirements set pursuant to Article 9(3) and, where applicable, any additional safeguards as referred to in Article 7(8);
2023/07/28
Committee: LIBE
Amendment 1067 #
Proposal for a regulation
Article 8 – paragraph 1 – point b
(b) identification details of the competent judicial authority or the independent administrative authority issuing the detection order and authentication of the detection order by that judicial or independent administrative authority;
2023/07/28
Committee: LIBE
Amendment 1073 #
Proposal for a regulation
Article 8 – paragraph 1 – point e
(e) whether the detection order issued concerns the dissemination of known or new child sexual abuse material or the solicitation of children;deleted
2023/07/28
Committee: LIBE
Amendment 1079 #
Proposal for a regulation
Article 8 – paragraph 1 – point g
(g) a sufficiently detailed statement of reasons explaining why the detection order is issued;
2023/07/28
Committee: LIBE
Amendment 1080 #
Proposal for a regulation
Article 8 – paragraph 1 – point h
(h) the factual and legal grounds justifying the issuing of the order, and a reference to this Regulation as the legal basis for the detection order;
2023/07/28
Committee: LIBE
Amendment 1082 #
Proposal for a regulation
Article 8 – paragraph 1 – point i
(i) the date, time stamp and electronic signature of the judicial or independent administrative authority issuing the detection order;
2023/07/28
Committee: LIBE
Amendment 1087 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1
The competent judicial authority or independent administrative authority issuing the detection order shall address it to the main establishment of the provider or, where applicable, to its legal representative designated in accordance with Article 24.
2023/07/28
Committee: LIBE
Amendment 1090 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 2
The detection order shall be securely transmitted to the provider’s point of contact referred to in Article 23(1), to the Coordinating Authority of establishment and to the EU Centre, through the system established in accordance with Article 39(2).
2023/07/28
Committee: LIBE
Amendment 1093 #
Proposal for a regulation
Article 8 – paragraph 3
3. If the provider cannot execute the detection order because it contains manifest errorserrors, it appears unnecessary or disproportionate, or does not contain sufficient information for its execution, the provider shall, without undue delay, request the necessary corrections or clarifications to the Coordinating Authority of establishment, using the template set out in Annex II.
2023/07/28
Committee: LIBE
Amendment 1097 #
Proposal for a regulation
Article 8 a (new)
Article 8a
Preservation of data in the context of detection orders
1. Detection orders may require the expedited preservation by the provider, insofar as the data is under their control, of one or more of the following data concerning the specific users against whom the detection order is directed, including new data generated after issuance of the order, as part of a planned or current criminal investigation;
a) Traffic data:
i) Pseudonyms, screen names or other identifiers used by the subject(s) of the investigation;
ii) Network identifiers, such as IP addresses, port numbers, or MAC addresses used by, or associated with, the subject(s) of the investigation;
iii) Any other traffic data, including metadata, of any activity linked to subject(s) of the investigation;
b) Content data:
i) Copies of any pictures or videos uploaded, downloaded or otherwise communicated by the subject(s) of the investigation;
2. Access to the data shall be made available to law enforcement authorities on the basis of the national law of the country of establishment of the provider.
3. The provider shall inform all users concerned of the order, unless the issuing authority instructs it, on the basis of a reasoned opinion, not to do so.
2023/07/28
Committee: LIBE
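As a rough illustration of the data categories that the proposed Article 8a would allow a preservation order to cover, the sketch below groups them into simple record types. All class and field names are hypothetical assumptions for illustration; the proposal prescribes no particular format.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class TrafficData:
        """Point 1(a): identifiers and metadata tied to the subject of the order."""
        pseudonyms: list[str] = field(default_factory=list)           # a) i) pseudonyms, screen names
        network_identifiers: list[str] = field(default_factory=list)  # a) ii) IP addresses, ports, MAC addresses
        activity_metadata: list[dict] = field(default_factory=list)   # a) iii) other traffic data and metadata

    @dataclass
    class ContentData:
        """Point 1(b): copies of pictures or videos communicated by the subject."""
        media_copies: list[bytes] = field(default_factory=list)

    @dataclass
    class PreservationRecord:
        """One expedited-preservation entry kept under the provider's control."""
        order_reference: str
        subject_identifier: str
        preserved_at: datetime
        traffic: TrafficData = field(default_factory=TrafficData)
        content: ContentData = field(default_factory=ContentData)

Under paragraph 2 such records would be handed over to law enforcement only on the basis of the national law of the provider's country of establishment, so the sketch deliberately models storage rather than access.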
Amendment 1107 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of hosting services and providers of number-independent interpersonal communications services that have received a detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the detection order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection order.
2023/07/28
Committee: LIBE
Amendment 1110 #
Proposal for a regulation
Article 9 – paragraph 2 – subparagraph 1
When the detection order becomes final, the competent judicial authority or independent administrative authority that issued the detection order shall, without undue delay, transmit a copy thereof to the Coordinating Authority of establishment. The Coordinating Authority of establishment shall then, without undue delay, transmit a copy thereof to all other Coordinating Authorities through the system established in accordance with Article 39(2).
2023/07/28
Committee: LIBE
Amendment 1114 #
Proposal for a regulation
Article 9 – paragraph 3 – subparagraph 1
Where the period of application of the detection order exceeds 12 months, or six months in the case of a detection order concerning the solicitation of children, the Coordinating Authority of establishment shall require the provider to report to it on the execution of the detection order at least once, halfway through the period of application.
2023/07/28
Committee: LIBE
Amendment 1117 #
Proposal for a regulation
Article 9 – paragraph 3 – subparagraph 2
Those reports shall include a detailed description of the measures taken to execute the detection order, including the safeguards provided, and information on the functioning in practice of those measures, in particular on their effectiveness in detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, and on the consequences of those measures for the rights and legitimate interests of all parties affected.
2023/07/28
Committee: LIBE
Amendment 1121 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 1
In respect of the detection orders that the competent judicial authority or independent administrative authority issued at its request, the Coordinating Authority of establishment shall, where necessary and in any event following reception of the reports referred to in paragraph 3, assess whether any substantial changes to the grounds for issuing the detection orders occurred and, in particular, whether the conditions of Article 7(4) continue to be met. In that regard, it shall take account of additional mitigation measures that the provider may take to address the significant riskreasonable suspicion or evidence identified at the time of the issuance of the detection order.
2023/07/28
Committee: LIBE
Amendment 1123 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 2
That Coordinating Authority shall request to the competent judicial authority or independent administrative authority that issued the detection order the modification or revocation of such order, where necessary in the light of the outcome of that assessment. The provisions of this Section shall apply to such requests, mutatis mutandis.
2023/07/28
Committee: LIBE
Amendment 1132 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communication services that have received a detection order shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators providedaccording to article 7 shall execute it to collect evidence on one or more specific user’s child sexual abuse offences, using, if necessary, specific technologies approved for this purpose by the EU Centre in accordance with Article 46.
2023/07/28
Committee: LIBE
Amendment 1135 #
Proposal for a regulation
Article 10 – paragraph 2
2. The provider shall be entitled to acquire, install and operate, free of charge, technologies specified in the orders and made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the detection order. The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made availabletechnologies relied on, regardless of whether provided by the EU service or procured or developed by the provider itself, shall be audited independently as per their performance, and the results of these audits as well as the benchmarks used to measure the performance shall be made publicly-available. Relying on technologies provided by the EU Centrer shall not affecexempt the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to provider from the obligation to conduct a prior data protection impact assessment, as referred to in Article 35 of Regulation (EU) 2016/679, and a prior consultation procedure, as referred to in Article 36 of that Regulation. The prior consultation shall include access of the supervisory as a result of the use of the technologiesuthority to the algorithm and the databases the content is matched against.
2023/07/28
Committee: LIBE
Amendment 1142 #
Proposal for a regulation
Article 10 – paragraph 3 – introductory part
3. The technologies specified in the detection orders shall be:
2023/07/28
Committee: LIBE
Amendment 1143 #
Proposal for a regulation
Article 10 – paragraph 3 – point a
(a) be effective in detcollecting evidence on the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
2023/07/28
Committee: LIBE
Amendment 1146 #
Proposal for a regulation
Article 10 – paragraph 3 – point b
(b) not be able to extract nor deduce the substance of the content of the communications or any other information, from the relevant communications other than the information strictly necessary to detect, using the indicators referred to in paragraph 1, patterns pointing to the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
2023/07/28
Committee: LIBE
Amendment 1150 #
Proposal for a regulation
Article 10 – paragraph 3 – point c
(c) be in accordance with the technological state of the art in the industry and the least intrusive in terms of the impact on the users’ rights to private and family life, including the confidentiality of communication, and to protection of personal data;
2023/07/28
Committee: LIBE
Amendment 1151 #
Proposal for a regulation
Article 10 – paragraph 3 – point d
(d) be sufficiently reliable, in that they limit to the maximum extent possible the rate of errors regarding the detection.where content is wrongly identified as known child sexual abuse material (“false positives”) to at most 1 in 50 billion, and where such occasional errors occur, their consequences are rectified without delay;
2023/07/28
Committee: LIBE
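To put the 1-in-50-billion error ceiling proposed in Amendment 1151 above into perspective, the short calculation below shows the expected number of wrongly flagged items at an assumed scanning volume; the daily volume is a hypothetical illustration, not a figure from the proposal.

    # The error ceiling comes from the amendment; the daily volume is an assumption for illustration only.
    MAX_FALSE_POSITIVE_RATE = 1 / 50_000_000_000      # at most 1 false positive per 50 billion comparisons
    assumed_checks_per_day = 1_000_000_000            # hypothetical: one billion items checked daily

    expected_false_positives_per_day = assumed_checks_per_day * MAX_FALSE_POSITIVE_RATE
    print(f"{expected_false_positives_per_day:.3f}")  # 0.020 -> roughly one false flag every 50 days at this volume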
Amendment 1154 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) for searching known child sexual abuse material, create a unique, non- reconvertible digital signature (ʻhashʼ) of electronically communicated pictures or videos for the sole purpose of immediately comparing that hash with a database containing hashes of material previously reliably identified as child sexual abuse and exploitation material as provided by the EU Centre pursuant to Article 44(1);
2023/07/28
Committee: LIBE
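Amendment 1154 above describes a hash-and-compare step: derive a non-reconvertible signature from a picture or video and check it against a reference database of previously verified material. The sketch below illustrates that flow with a plain SHA-256 digest; deployed systems typically use perceptual hashes (for example PhotoDNA) that also match re-encoded copies, which this example does not implement, and the function names and the reference set are hypothetical.

    import hashlib

    # Hypothetical stand-in for the database of hashes of previously verified
    # material that the EU Centre would provide pursuant to Article 44(1).
    KNOWN_HASHES: set[str] = set()

    def digest(media_bytes: bytes) -> str:
        """Derive a non-reconvertible signature of the file contents.

        SHA-256 matches only bit-identical copies and cannot be reversed to
        reconstruct the image, which is what 'non-reconvertible' requires.
        """
        return hashlib.sha256(media_bytes).hexdigest()

    def matches_known_material(media_bytes: bytes) -> bool:
        """Compare the signature against the reference set; nothing else is derived from the content."""
        return digest(media_bytes) in KNOWN_HASHES

On a non-match, point (db) proposed just below would require the processed data to be erased immediately, so no digest or copy should be retained beyond the comparison itself.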
Amendment 1164 #
Proposal for a regulation
Article 10 – paragraph 3 – point d b (new)
(db) ensure the processing is limited to what is strictly necessary for the purpose of detection, reporting and removal of child sexual abuse material and, unless child sexual abuse material has been detected and confirmed as such, the data is erased immediately;
2023/07/28
Committee: LIBE
Amendment 1165 #
Proposal for a regulation
Article 10 – paragraph 3 – point d c (new)
(dc) ensure the processing does not interfere with, weaken, or circumvent the security of encrypted communications, and only applies to unencrypted communications;
2023/07/28
Committee: LIBE
Amendment 1168 #
Proposal for a regulation
Article 10 – paragraph 4 – introductory part
4. The providerissuing authority shall:
2023/07/28
Committee: LIBE
Amendment 1172 #
Proposal for a regulation
Article 10 – paragraph 4 – point a
(a) take all the necessary measures to ensure that the technologies specified in detection orders and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly necessary to execute the detection orders addressed to themthey issue;
2023/07/28
Committee: LIBE
Amendment 1175 #
Proposal for a regulation
Article 10 – paragraph 4 – point b
(b) establish effectiveinclude in detection orders specific internal procedures for providers to prevent and, where necessary, detect and remedy any misuse of the technologies, indicators and personal data and other data referred to in point (a), including unauthorised access to, and unauthorised transfers of, such personal data and other data;
2023/07/28
Committee: LIBE
Amendment 1178 #
Proposal for a regulation
Article 10 – paragraph 4 – point c
(c) include in detection orders specific obligations on providers ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, human intervention;
2023/07/28
Committee: LIBE
Amendment 1181 #
Proposal for a regulation
Article 10 – paragraph 4 – point d
(d) establish and operate an accessible, agechild-appropriate and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of itsproviders’ obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
2023/07/28
Committee: LIBE
Amendment 1196 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point a
(a) the fact that it operates technologies to detect online child sexual abuse to execute the detection order, the ways in which it operates those technologies and the impact on the confidentiality of users’ communications;
2023/07/28
Committee: LIBE
Amendment 1197 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point b
(b) the fact that it is required to report potential online child sexual abuse to the EU Centre in accordance with Article 12;deleted
2023/07/28
Committee: LIBE
Amendment 1201 #
Proposal for a regulation
Article 10 – paragraph 6
6. Where a provider detects potential online child sexual abuse through the measures taken to execute the detection order, it shall inform the users concerned without undue delay, after Europol or the national law enforcement authority of a Member State that received the report pursuant to Article 48 has confirmed that the information to the users would not interfere with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.deleted
2023/07/28
Committee: LIBE
Amendment 1207 #
Proposal for a regulation
Article 11 – title
GuidelinAdditional rules regarding detection obligations
2023/07/28
Committee: LIBE
Amendment 1211 #
Proposal for a regulation
Article 11 – paragraph 1
The Commission, in cooperation with the Coordinating Authorities, and the EU Centre, after having consulted the European Data Protection Board and after having conducted a public consultation, may issue guidelinedelegated acts on the application of Articles 7 to 10, having due regard in particular to relevant technological developments and the manners in which the services covered by those provisions are offered and used.
2023/07/28
Committee: LIBE
Amendment 1213 #
Proposal for a regulation
Chapter II – Section 3 – title
3 Reporting and removal obligations
2023/07/28
Committee: LIBE
Amendment 1214 #
Reporting and removal obligations
2023/07/28
Committee: LIBE
Amendment 1216 #
Proposal for a regulation
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of publicly available number-independent interpersonal communications services becomes awarehas actual knowledge of alleged online child sexual abuse on its services in any manner other than through a removal order issued in accordance with this Regulation of any information indicating potential online child sexual abuse on its services, it shall promptly submit a report thereon to the EU Centre, it shall promptly submit, using state of the art encryption, a report to the EU Centre and the relevant competent national authority in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
2023/07/28
Committee: LIBE
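One way a provider might satisfy the "state of the art encryption" requirement in the amended Article 12(1) is to encrypt the Annex III report to the recipient's public key before transmission. The sketch below uses a libsodium sealed box via PyNaCl; the key handling, report fields and recipient are illustrative assumptions, not part of the proposal.

# Illustrative only: encrypt the report to the recipient's public key
# with a libsodium sealed box (PyNaCl). A throwaway key pair is generated
# so the example runs end to end; in practice the EU Centre would publish its key.

import json
from nacl.public import PrivateKey, SealedBox

centre_private = PrivateKey.generate()
centre_public = centre_private.public_key

report = {"service": "example.invalid", "category": "known_material", "content_id": "123"}
ciphertext = SealedBox(centre_public).encrypt(json.dumps(report).encode())

# Only the holder of the private key can open the report.
plaintext = SealedBox(centre_private).decrypt(ciphertext)
assert json.loads(plaintext) == report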
Amendment 1219 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a. Where a provider of hosting services has actual knowledge of online child sexual abuse material on its services and of its unlawful nature it shall expeditiously remove or disable access to it in all Member States.
2023/07/28
Committee: LIBE
Amendment 1220 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 1
Where the provider submits a report pursuant to paragraph 1, it shall inform the user concerned, providingrequest authorisation from the EU Centre to notify the user concerned, which shall reply without undue delay, at maximum within two days. In case of authorisation, the provider shall notify the user without undue delay. The notification shall include information on the main content of the report, on the manner in which the provider has become aware of the potentialalleged child sexual abuse concerned, on the authority the report has been transferred to, on the follow-up given to the report insofar as such information is available to the provider and on the user’s possibilities of redress, including on the right to submit complaints to the Coordinating Authority in accordance with Article 34.
2023/07/28
Committee: LIBE
Amendment 1222 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 2
The provider shall inform the user concerned without undue delay, either after having received a communication from the EU Centre indicating that it considers the report to be manifestly unfounded as referred to in Article 48(2), or after the expiry of a time period of three months from the date of the report without having received a communication from the EU Centre indicating that the information is not to be provided as referred to in Article 48(6), point (a), whichever occurs first.deleted
2023/07/28
Committee: LIBE
Amendment 1225 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 3
Where within the three months’ time period referred to in the second subparagraph the provider receives such a communication from the EU Centre indicating that the information is not to be provided, it shall inform the user concerned, without undue delay, after the expiry of the time period set out in that communication.deleted
2023/07/28
Committee: LIBE
Amendment 1227 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. The EU Centre shall coordinate with the relevant competent authority the requests it receives for the exercise of individuals’ rights of access, rectification and deletion in relation to personal data processed pursuant to this Regulation.
2023/07/28
Committee: LIBE
Amendment 1228 #
Proposal for a regulation
Article 12 – paragraph 3
3. The provider shall establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to flag to the provider potential online child sexual abuse on the service.deleted
2023/07/28
Committee: LIBE
Amendment 1234 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
2023/07/28
Committee: LIBE
Amendment 1242 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) all content data, including images, videos and text being reported;
2023/07/28
Committee: LIBE
Amendment 1249 #
Proposal for a regulation
Article 13 – paragraph 1 – point d a (new)
(da) a list of all traffic data and metadata retained in relation to the potential online child sexual abuse, which could be made available to law enforcement authorities, together with information concerning default retention periods.
2023/07/28
Committee: LIBE
Amendment 1252 #
Proposal for a regulation
Article 13 – paragraph 1 – point e
(e) whether the potential online child sexual abuse to their knowledge concerns the dissemination of known or new child sexual abuse material or the solicitation of children;
2023/07/28
Committee: LIBE
Amendment 1255 #
Proposal for a regulation
Article 13 – paragraph 1 – point f
(f) information concerning the apparent geographic location related to the potential online child sexual abuse, such as the Internet Protocol address;
2023/07/28
Committee: LIBE
Amendment 1257 #
Proposal for a regulation
Article 13 – paragraph 1 – point g
(g) a list of available information concernindicating the identity of any user involved in the potential online child sexual abuse together with default retention periods;
2023/07/28
Committee: LIBE
Amendment 1261 #
Proposal for a regulation
Article 13 – paragraph 1 – point i
(i) where the potentialalleged online child sexual abuse concerns the dissemination of known or new child sexual abuse material, whether the provider has removed or disabled access to the material;
2023/07/28
Committee: LIBE
Amendment 1263 #
Proposal for a regulation
Article 13 – paragraph 1 – point i a (new)
(ia) information on the specific technology that enabled the provider to become aware of the relevant abusive content, in case the provider became aware of the potential child sexual abuse following measures taken to execute a detection order issued in accordance with Article 7 of the Proposal.
2023/07/28
Committee: LIBE
Amendment 1267 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States of one or more specific items of material that, after a diligent assessment, the Coordinating Authority or the courts or other independent administrative authorities referred to in Article 36(1) identified as constituting illegal child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1275 #
Proposal for a regulation
Article 14 – paragraph 3 – introductory part
3. The competent judicial authority or the independent administrative authority shall issue a removal order using the template set out in Annex IV. Removal orders shall include:
2023/07/28
Committee: LIBE
Amendment 1276 #
Proposal for a regulation
Article 14 – paragraph 3 – point a
(a) identification details of the judicial or independent administrative authority issuing the removal order and authentication of the removal order by that authority;
2023/07/28
Committee: LIBE
Amendment 1277 #
Proposal for a regulation
Article 14 – paragraph 3 – point c
(c) the specific service for which the removal order is issudeleted;
2023/07/28
Committee: LIBE
Amendment 1279 #
Proposal for a regulation
Article 14 – paragraph 3 – point h
(h) the date, time stamp and electronic signature of the judicial or independent administrative authority issuing the removal order;
2023/07/28
Committee: LIBE
Amendment 1280 #
Proposal for a regulation
Article 14 – paragraph 4 – subparagraph 1
The judicial authority or the independent administrative authority issuing the removal order shall address it to the main establishment of the provider or, where applicable, to its legal representative designated in accordance with Article 24.
2023/07/28
Committee: LIBE
Amendment 1284 #
Proposal for a regulation
Article 14 – paragraph 8 a (new)
8a. Where Europol or a national authority becomes aware of the presence of child sexual abuse material on a hosting service, it shall notify the Coordinating Authority of its exact uniform resource locator, and the Coordinating Authority shall request a removal order where the conditions of paragraph 1 are met.
2023/07/28
Committee: LIBE
Amendment 1285 #
Proposal for a regulation
Article 15 – paragraph 1
1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to an effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the removal order.
2023/07/28
Committee: LIBE
Amendment 1287 #
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1
When the removal order becomes final, the competent judicial authority or independent administrative authority that issued the removal order shall, without undue delay, transmit a copy thereof to the Coordinating Authority of establishment. The Coordinating Authority of establishment shall then, without undue delay, transmit a copy thereof to all other Coordinating Authorities through the system established in accordance with Article 39(2).
2023/07/28
Committee: LIBE
Amendment 1290 #
Proposal for a regulation
Article 15 – paragraph 3 – point b
(b) the reasons for the removal or disabling, providing a copy of the removal order upon the user’s request;
2023/07/28
Committee: LIBE
Amendment 1291 #
4. The Coordinating Authority of establishment may request, when requesting the judicial authority or independent administrative authority issuing the removal order, and after having consulted with relevant public authorities, that the provider is not to disclose any information regarding the removal of or disabling of access to the child sexual abuse material, where and to the extent necessary to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences. In such a case: (a) the judicial authority or independent administrative authority issuing the removal order shall set the time period not longer than necessary and not exceeding six weeks, during which the provider is not to disclose such information; (b) the obligations set out in paragraph 3 shall not apply during that time period; (c) that judicial authority or independent administrative authority shall inform the provider of its decision, specifying the applicable time period. That judicial authority or independent administrative authority may decide to extend the time period referred to in the second subparagraph, point (a), by a further time period of maximum six weeks, where and to the extent the non- disclosure continues to be necessary. In that case, that judicial authority or independent administrative authority shall inform the provider of its decision, specifying the applicable time period. Article 14(3) shall apply to that decision.deleted
2023/07/28
Committee: LIBE
Amendment 1295 #
Proposal for a regulation
Chapter II – Section 5
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1325 #
Proposal for a regulation
Article 19
Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with those requirements.Article 19 deleted Liability of providers
2023/07/28
Committee: LIBE
Amendment 1334 #
Proposal for a regulation
Article 20 – title
Victims’ right to information and support
2023/07/28
Committee: LIBE
Amendment 1336 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1
Persons residingVictims of child sexual abuse material hosted or disseminated in the Union or their representatives and persons in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where they areside, or a Coordinating Authority of their choosing, easily understandable and accessible information regarding any known instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12. The right shall cover both occasional and periodic information. Persons with disabilities shall have the right to ask for and receive such information in a manner accessible to them. The information in question shall be given in the language indicated by that person.
2023/07/28
Committee: LIBE
Amendment 1344 #
Proposal for a regulation
Article 20 – paragraph 1 a (new)
1a. Victims of child sexual abuse or their representatives and persons living in the Union shall have the right to receive, upon their request, from the Coordinating Authority information regarding victim’s rights, support and assistance. The information shall be age-appropriate, accessible and gender-sensitive and shall include at a minimum: (a) the type of support they can obtain and from whom, including, where relevant, basic information about access to medical support, any specialist support, including psychological or social support, and alternative accommodation; (b) the procedures for making complaints with regard to a criminal offence and their role in connection with such procedures; (c) how and under what conditions they can obtain protection, including protection measures; (d) how and under what conditions they can access legal advice, legal aid and any other sort of advice; (e) how and under what conditions they can access compensation; (f) how and under what conditions they are entitled to interpretation and translation.
2023/07/28
Committee: LIBE
Amendment 1345 #
Proposal for a regulation
Article 20 – paragraph 1 b (new)
1b. In case a victim or victim representative indicates the preference for a periodic request, the Coordinating Authority shall submit, without delay, the information referred to in paragraph 3 proactively to the requester after the first submitted reply, in any new instances of reports referred to in paragraph 1 on a weekly basis. Victims or victim representatives may terminate the periodic request at any time by notifying the Coordinating Authority in question.
2023/07/28
Committee: LIBE
Amendment 1346 #
Proposal for a regulation
Article 20 – paragraph 2 – point b
(b) where applicable, the individual or entity that is to receive the information on behalfformally assisting or representing the person that is to receive the information on behalf of the person making the request, with verifiable proof of approval of the person making the request;
2023/07/28
Committee: LIBE
Amendment 1347 #
Proposal for a regulation
Article 20 – paragraph 2 – point c
(c) sufficient elements to demonstrverify thate the identitychild sexual abuse material in question matches with of the person making the request.;
2023/07/28
Committee: LIBE
Amendment 1348 #
Proposal for a regulation
Article 20 – paragraph 2 – point c a (new)
(ca) an indication if the request is occasional or covers a certain time period.
2023/07/28
Committee: LIBE
Amendment 1349 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
(d) whether the provider reported having removed or disabled access to the material, in accordance with Article 13(1), point (i)., and in that case, all related information;
2023/07/28
Committee: LIBE
Amendment 1352 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
(da) if there were appeals to such removal, and in that case, all related information
2023/07/28
Committee: LIBE
Amendment 1353 #
Proposal for a regulation
Article 20 – paragraph 3 – point d b (new)
(db) relevant age-appropriate, accessible and gender-sensitive information on victim support and assistance in the victim’s region.
2023/07/28
Committee: LIBE
Amendment 1356 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.deleted
2023/07/28
Committee: LIBE
Amendment 1362 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
PVictims of child sexual abuse material hosted or disseminated in the Union or their representatives or persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting or the Coordinating Authority of their choosing, age appropriate and gender-sensitive information on support for removal, including support from civil society organisations, hotlines and from the EU Centre when they seek to have a provider of hosting services or publicly available number-independent interpersonal communications services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask and receive any information relating to such support in a manner accessible to them.
2023/07/28
Committee: LIBE
Amendment 1369 #
Proposal for a regulation
Article 21 – paragraph 3
3. The requests referred to in paragraphs 1 and 2 shall indicate the relevant item or items of child sexual abuse material and any other relevant information.
2023/07/28
Committee: LIBE
Amendment 1372 #
Proposal for a regulation
Article 21 – paragraph 4 – point d
(d) where necessary, informing the Coordinating Authority of establishment of the presence of that item or those items on the provider's service, with a view to the issuance of a removal order pursuant to Article 14. and the obligations under Article 21;
2023/07/28
Committee: LIBE
Amendment 1373 #
Proposal for a regulation
Article 21 – paragraph 4 – point d a (new)
(da) information regarding victim’s rights, assistance and support pursuant to Article 21.
2023/07/28
Committee: LIBE
Amendment 1376 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Providers of hosting services and providers of number-independent interpersonal communications services shall preserve the content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
2023/07/28
Committee: LIBE
Amendment 1382 #
(e) responding to requests issued by competent law enforcement authorities and judicial authorities in accordance with the applicable law, with a view to providing them with the necessary information for the prevention, detection, investigation or prosecution of child sexual abuse offences, insofar as the content data and other data relate to a report that the provider has submitted to the EU Centre pursuant to Article 12. All such requests shall be logged.
2023/07/28
Committee: LIBE
Amendment 1386 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 2
As regards the first subparagraph, point (a), the provider may also preserve the information for the purpose of improving the effectiveness and accuracy of thewho uses its own detection technologies to detect online child sexual abuse for the execution of a detection order issued to it in accordance with Article 7. However, it shall not store any personal data may also preserve the information for the purpose of improving the effectiveness and accuracy of these technologies, if the personal data preserved this way is fully anonymised. No personal data shall be retained for that purpose.
2023/07/28
Committee: LIBE
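The "fully anonymised" condition in the amended Article 22(1), second subparagraph, could in practice mean stripping every user-identifying field from a detection record before it is retained for accuracy measurement. The sketch below is illustrative only; the field names are hypothetical and genuine anonymisation would also require safeguards against re-identification.

# Sketch of the "fully anonymised" retention idea: keep only non-personal
# detection metrics and drop every user-identifying field. Field names are
# hypothetical; real records would differ.

PERSONAL_FIELDS = {"user_id", "ip_address", "account_name", "message_text"}

def anonymise_detection_record(record: dict) -> dict:
    """Return only non-personal fields useful for measuring detector accuracy."""
    return {k: v for k, v in record.items() if k not in PERSONAL_FIELDS}

raw = {
    "user_id": "u-42", "ip_address": "203.0.113.7", "message_text": "...",
    "classifier_score": 0.97, "human_review_outcome": "false_positive",
}
print(anonymise_detection_record(raw))
# {'classifier_score': 0.97, 'human_review_outcome': 'false_positive'}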
Amendment 1387 #
Proposal for a regulation
Article 22 – paragraph 2 – subparagraph 1
Providers shall securely preserve the information referred to in paragraph 1 for no longer than necessary for the applicable purpose and, in any event, no longer than 12 months from the date of the reporting or of the removal or disabling of access, whichever occurs first.
2023/07/28
Committee: LIBE
Amendment 1390 #
Proposal for a regulation
Article 22 – paragraph 2 – subparagraph 3
Providers shall ensure that the information referred to in paragraph 1 is preserved in a secure manner and that the preservation is subject to appropriate technical and organisational safeguards. Those safeguards shall ensure, in particular, that the information can be accessed and processed only for the purpose for which it is preserved, that a high level of security is achieved, all access to the data is logged, and that the information is deleted upon the expiry of the applicable time periods for preservation. Providers shall regularly review those safeguards and adjust them where necessary.
2023/07/28
Committee: LIBE
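The safeguards in Article 22(2) as amended (purpose-bound access, logging of all access, deletion on expiry of the at most 12-month retention period) could be approximated by a small preservation store along the following lines; class and field names are illustrative and not taken from the Regulation.

# Toy sketch of the Article 22(2) safeguards: purpose-bound access,
# an access log, and deletion once the retention period expires.

from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # "no longer than 12 months"

class PreservationStore:
    def __init__(self):
        self._items = {}       # item_id -> (payload, purpose, stored_at)
        self._access_log = []  # who accessed what, when, for which purpose

    def preserve(self, item_id, payload, purpose):
        self._items[item_id] = (payload, purpose, datetime.now(timezone.utc))

    def access(self, item_id, requester, purpose):
        payload, allowed_purpose, _ = self._items[item_id]
        self._access_log.append((datetime.now(timezone.utc), requester, item_id, purpose))
        if purpose != allowed_purpose:
            raise PermissionError("access only for the purpose the data was preserved for")
        return payload

    def purge_expired(self):
        now = datetime.now(timezone.utc)
        expired = [i for i, (_, _, t) in self._items.items() if now - t > RETENTION]
        for item_id in expired:
            del self._items[item_id]
        return expired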
Amendment 1394 #
Proposal for a regulation
Article 25 – paragraph 1
1. Member States shall, by [Date - two months from the date of entry into force of this Regulation], designate one or more competent authorities as responsible for the application and enforcement of this Regulation and to the achievement of the objective of this Regulation and enforcement of Directive 2011/93/EU (‘competent authorities’).
2023/07/28
Committee: LIBE
Amendment 1397 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 2
The Coordinating Authority shall be responsible for all matters related to application and enforcement of this Regulation, and to the achievement of the objective of this Regulation and enforcement of Directive 2011/93/EU in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities.
2023/07/28
Committee: LIBE
Amendment 1400 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 3
The Coordinating Authority shall in any event be responsible for ensuring coordination at national level in respect of those matters, including matters related to prevention, and for contributing to the effective, efficient and consistent application and enforcement of this Regulation and Directive 2011/93/EU throughout the Union.
2023/07/28
Committee: LIBE
Amendment 1401 #
Proposal for a regulation
Article 25 – paragraph 5
5. Each Member State shall ensure that a sufficiently staffed contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the appliccontributing to the achievements of the objective of this Regulation and enforcement of this RegulationDirective 2011/93/EU in that Member State. Member States shall make the information on the contact point publicly available and communicate it, including for trusted organisations providing assistance to victims and providing education and awareness raising. Member States shall make the information on the contact point widely accessible through gender-sensitive and age-appropriate online and offline awareness raising campaigns and communicate this information to the EU Centre. They shall keep that information updated.
2023/07/28
Committee: LIBE
Amendment 1403 #
Proposal for a regulation
Article 25 – paragraph 6
6. Within two weeks after the designation of the Coordinating Authorities pursuant to paragraph 2, the EU Centre shall set up an online public register listing the Coordinating Authorities and their contact points. The EU Centre shall regularly publish any modification thereto.
2023/07/28
Committee: LIBE
Amendment 1404 #
Proposal for a regulation
Article 25 – paragraph 7 – point a
(a) provide certain information or technical expertise on matters covered by this Regulation;deleted
2023/07/28
Committee: LIBE
Amendment 1406 #
Proposal for a regulation
Article 25 – paragraph 7 – point a a (new)
(aa) provide information and expertise on gender-sensitive and age appropriate victim support and prevention of online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 1407 #
Proposal for a regulation
Article 25 – paragraph 7 – point b
(b) assist in assessing, in accordance with Article 5(2), the risk assessment conducted or updated or the mitigation measures taken by a provider of hosting or interpersonal communication services under the jurisdiction of the Member State that designated the requesting Coordinating Authority;deleted
2023/07/28
Committee: LIBE
Amendment 1409 #
Proposal for a regulation
Article 25 – paragraph 7 – point c
(c) verify the possible need to request competent national authorities to issue a detection order, a removal order or a blocking order in respect of a service under the jurisdiction of the Member State that designated that Coordinating Authority;deleted
2023/07/28
Committee: LIBE
Amendment 1413 #
Proposal for a regulation
Article 25 – paragraph 7 – point d
(d) verify the effectiveness of a detection order or a removal order issued upon the request of the requesting Coordinating Authority.deleted
2023/07/28
Committee: LIBE
Amendment 1416 #
Proposal for a regulation
Article 25 – paragraph 8
8. The EU Centre shall provide such assistance free of charge and in accordance with its tasks and obligations under this Regulation and insofar as its resources and priorities allow.
2023/07/28
Committee: LIBE
Amendment 1420 #
Proposal for a regulation
Article 25 a (new)
Article 25a Cooperation with third parties Where necessary for the performance of its tasks under this Regulation, including the achievement of the objective of this Regulation, and in order to promote the generation and sharing of knowledge in line with Article 43(6), the Coordinating Authority shall cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations and semi-public organisations and practitioners.
2023/07/28
Committee: LIBE
Amendment 1421 #
Proposal for a regulation
Article 26 – paragraph 1
1. Member States shall ensure that the Coordinating Authorities that they designated perform their tasks under this Regulation in an objective, impartial, transparent and timely manner, while fully respecting theall fundamental rights of all parties affected. They shall also ensure that their Coordinating Authorities perform their tasks with utmost respect and sensitivity towards victims and their representatives, with a focus on avoidance of re-victimization, the safety of the victim and their needs. Member States shall also ensure that their Coordinating Authorities have adequate technical, financial and human resources to carry out their tasks.
2023/07/28
Committee: LIBE
Amendment 1424 #
Proposal for a regulation
Article 26 – paragraph 2 – point e
(e) are not charged with tasks relating to the prevention or combating of child sexual abuse, other than their tasks under this Regulation.deleted
2023/07/28
Committee: LIBE
Amendment 1426 #
Proposal for a regulation
Article 26 – paragraph 4
4. The Coordinating Authorities shall ensure that relevant members of staff have the required qualifications, experience and technical skills to perform their duties. under this Regulation. They shall also ensure that members of staff coming into contact with victims are adequately and frequently trained in intersectional victim support.
2023/07/28
Committee: LIBE
Amendment 1427 #
Proposal for a regulation
Article 26 – paragraph 5
5. TWithout prejudice to national or Union legislation on whistleblower protection, the management and other staff of the Coordinating Authorities shall, in accordance with Union or national law, be subject to a duty of professional secrecy both during and after their term of office, with regard to any confidential information which has come to their knowledge in the course of the performance of their tasks. Member States shall ensure that the management and other staff are subject to rules guaranteeing that they can carry out their tasks in an objective, impartial and independent manner, in particular as regards their appointment, dismissal, remuneration and career prospects. Coordinating Authorities shall take into account the application of Directive 2021/93/EU on Pay Transparency.
2023/07/28
Committee: LIBE
Amendment 1428 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Where needed for carrying out their tasks, Coordinating Authorities shall have the following powers of investigation,investigatory powers in respect of providers of relevant information society services under the jurisdiction of the Member State that designated them:
2023/07/28
Committee: LIBE
Amendment 1430 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, to provide such information within a reasonable time periodundue delay;
2023/07/28
Committee: LIBE
Amendment 1432 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) the power to carry out, or to request an independent judicial authority in their Member State to order remote or on-site inspections of any premises that those providers or the other persons referred to in point (a) use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement of this Regulation in any form, irrespective of the storage medium;
2023/07/28
Committee: LIBE
Amendment 1435 #
Proposal for a regulation
Article 27 – paragraph 1 – point d
(d) the power to request information, including to assess whether the measures taken to execute a detection order, removal order or blocking order complyto assess compliance with the requirements of this Regulation.
2023/07/28
Committee: LIBE
Amendment 1438 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Where needed for carrying out their tasks, Coordinating Authorities shall have the following enforcement powers, in respect of providers of relevant information society services under the jurisdiction of the Member State that designated them:
2023/07/28
Committee: LIBE
Amendment 1441 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) the power to order specific measures to bring about the cessation of infringements of this Regulation and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end;
2023/07/28
Committee: LIBE
Amendment 1443 #
Proposal for a regulation
Article 28 – paragraph 1 – point c
(c) the power to impose fines, or request a judicial authority in their Member State to do so, in accordance with Article 35 for infringements of this Regulation, including non-compliance with any of the orders issued pursuant to Article 27 and to point (b) of this paragraph;
2023/07/28
Committee: LIBE
Amendment 1444 #
Proposal for a regulation
Article 28 – paragraph 1 – point e
(e) the power to adopt appropriate, reasonable, and proportionate interim measures to avoid the risk ofprevent serious harm.
2023/07/28
Committee: LIBE
Amendment 1449 #
Proposal for a regulation
Article 29 – paragraph 1 – introductory part
1. Where needed for carrying out their tasks, Coordinating Authorities shall have the additional enforcement powers referred to in paragraph 2 of this Article, in respect of providers of relevant information society services under the jurisdiction of the Member State that designated them, provided that:
2023/07/28
Committee: LIBE
Amendment 1450 #
Proposal for a regulation
Article 29 – paragraph 1 – point b
(b) the infringement persists; and
2023/07/28
Committee: LIBE
Amendment 1451 #
Proposal for a regulation
Article 29 – paragraph 2 – point a – point i
(i) adopt and submit an action plan setting out the necessary measures to terminate the infringement, subject to the approval of the Coordinating Authority;
2023/07/28
Committee: LIBE
Amendment 1452 #
Proposal for a regulation
Article 29 – paragraph 2 – point b – introductory part
(b) request the competent judicial authority or independent administrative authority of the Member State that designated the Coordinating Authority to order the temporary restriction of access of users of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider on which the infringement takes place, where the Coordinating Authority considers that:
2023/07/28
Committee: LIBE
Amendment 1453 #
Proposal for a regulation
Article 29 – paragraph 2 – point b – point ii
(ii) the infringement persists and causes serious harm that is greater than the likely harm to users relying on the service for legal purposes and;
2023/07/28
Committee: LIBE
Amendment 1454 #
Proposal for a regulation
Article 29 – paragraph 4 – subparagraph 2
The temporary restriction shall apply for a period of four weeks, subject to the possibility for the competent judicial authority, in its order, to allow the Coordinating Authority to extend that period for further periods of the same lengths, subject to a maximum number of extensions set by that judicial authority.
2023/07/28
Committee: LIBE
Amendment 1455 #
Proposal for a regulation
Article 29 – paragraph 4 – subparagraph 3 – point a
(a) the provider has failed to take the necessary and proportionate measures to terminate the infringement; and
2023/07/28
Committee: LIBE
Amendment 1457 #
Proposal for a regulation
Article 30 – paragraph 2
2. Member States shall ensure that any exercise of the investigatory and enforcement powers referred to in Articles 27, 28 and 29 is subject to adequate safeguards laid down in the applicable national law to respect the fundamental rights of all parties affected. In particular, those measures shall onlybe targeted and precise, be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all parties affected.
2023/07/28
Committee: LIBE
Amendment 1459 #
Proposal for a regulation
Article 31 – paragraph 1
Coordinating Authorities shall have the power to carry out searches on publicly accessible material on hosting services to detect the dissemination of known or new child sexual abuse material, using the indicators contained in the databases referred to in Article 44(1), points (a) and (b), where necessary to verify whether the providers of hosting services under the jurisdiction of the Member State that designated the Coordinating Authorities comply with their obligations under this Regulation.
2023/07/28
Committee: LIBE
Amendment 1461 #
Proposal for a regulation
Article 32
Notification of known child sexual abuse Coordinating Authorities shall have the power to notify providers of hosting services under the jurisdiction of the Member State that designated them of the presence on their service of one or more specific items of known child sexual abuse material and to request them to remove or disable access to that item or those items, for the providers’ voluntary consideration. The request shall clearly set out the identification details of the Coordinating Authority making the request and information on its contact point referred to in Article 25(5), the necessary information for the identification of the item or items of known child sexual abuse material concerned, as well as the reasons for the request. The request shall also clearly state that it is for the provider’s voluntary consideration.Article 32 deleted material
2023/07/28
Committee: LIBE
Amendment 1463 #
Proposal for a regulation
Article 33 – paragraph 2 – subparagraph 2
Where a provider which does not have its main establishment in the Union failed to appoint a legal representative in accordance with Article 24, all Member States shall have jurisdiction. Where a Member State decides to exercise jurisdiction under this subparagraph, it shall inform all other Member States and ensure that the principle of ne bis in idem is respected.
2023/07/28
Committee: LIBE
Amendment 1464 #
Proposal for a regulation
Article 34 – paragraph 1
1. Users and any body, organisation or association mandated to exercise the rights conferred by this Regulation on their behalf shall have the right to lodge a complaint alleging an infringement of this Regulation affecting them against providers of relevant information society services with the Coordinating Authority designated by the Member State where the user resides or is established.
2023/07/28
Committee: LIBE
Amendment 1466 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. During these proceedings, both parties shall have the right to be heard and receive appropriate information about the status of the complaint, in accordance with national law
2023/07/28
Committee: LIBE
Amendment 1467 #
Proposal for a regulation
Article 34 – paragraph 1 b (new)
1b. The Coordinating Authority shall offer easy-to-use mechanisms to anonymously submit information about infringements of this Regulation.
2023/07/28
Committee: LIBE
Amendment 1468 #
Proposal for a regulation
Article 34 – paragraph 2
2. Coordinating Authorities shall provide child-friendlyage-appropriate and accessible mechanisms to submit a complaint under this Article and adopt a childn age-appropriate and gender-sensitive approach when handling complaints submitted by children, taking due account of the child'person’s age, maturity, views, needs and concerns. The processing of complaints shall take into account due diligence and shall provide necessary information to the complainant.
2023/07/28
Committee: LIBE
Amendment 1472 #
Proposal for a regulation
Article 35 – paragraph 2
2. Member States shall ensure that the maximum amount of penalties imposed for an infringement of this Regulation shall not exceed 6 % of the annual income or globalworldwide turnover of the preceding business year of the provider.
2023/07/28
Committee: LIBE
Amendment 1474 #
Proposal for a regulation
Article 35 – paragraph 3
3. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information or to submit to an on-site inspection shall not exceed 1% of the annual income or globalworldwide turnover of the preceding business year of the provider or the other person referred to in Article 27.
2023/07/28
Committee: LIBE
Amendment 1476 #
Proposal for a regulation
Article 35 – paragraph 4
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily globalworldwide turnover of the provider or the other person referred to in Article 27 in the preceding financial year per day, calculated from the date specified in the decision concerned.
2023/07/28
Committee: LIBE
Amendment 1480 #
Proposal for a regulation
Article 35 a (new)
Article 35a Compensation Users and any body, organisation or association mandated to exercise the rights conferred by this Regulation on their behalf shall have the right to seek, in accordance with Union and national law, compensation from providers of relevant information society services, for any damage or loss suffered due to an infringement by those providers of their obligations under this Regulation.
2023/07/28
Committee: LIBE
Amendment 1482 #
Proposal for a regulation
Article 36 – paragraph 1 – subparagraph 1 – point a
(a) anonymised specific items of material and transcripts of conversations that Coordinating Authorities or that the competent judicial authorities or other independent administrativethat the competent judicial authorities of a Member State have identified, after a diligent assessment, as constituting child sexual abuse material or the solicitation of children, as applicable, for the EU Centre to generate indicators in accordance with Article 44(3);
2023/07/28
Committee: LIBE
Amendment 1485 #
Proposal for a regulation
Article 36 – paragraph 1 – subparagraph 1 – point b
(b) exact uniform resource locators indicating specific items of material that Coordinating Authorities or that competent judicial authorities or other independent administrative authorities of a Member State have identified, after a diligent assessment, as constituting child sexual abuse material, hosted by providers of hosting services not offering services in the Union, that cannot be removed due to those providers’ refusal to remove or disable access thereto and to the lack of cooperation by the competent authorities of the third country having jurisdiction, for the EU Centre to compile the list of uniform resource locators in accordance with Article 44(3).deleted
2023/07/28
Committee: LIBE
Amendment 1488 #
Proposal for a regulation
Article 36 – paragraph 1 – subparagraph 2
Member States shall take the necessary measures to ensure that the Coordinating Authorities that they designated receive, without undue delay, theencrypted copies of material identified as child sexual abuse material, the transcripts of conversations identified as the solicitation of children, and the uniform resource locators, identified by a competent judicial authority or other independent administrative authority than the Coordinating Authority, for submission to the EU Centre in accordance with the first subparagraph.
2023/07/28
Committee: LIBE
Amendment 1491 #
Proposal for a regulation
Article 36 – paragraph 2
2. Upon the request of the EU Centre where necessary to ensure that the data contained in the databases referred to in Article 44(1) are complete, accurate and up-to-date, Coordinating Authorities shall verify or provide clarifications or additional information as to whether the conditions of paragraph 1, points (a) and (b) have been and, where relevant, continue to be met, in respect of a given submission to the EU Centre in accordance with that paragraph.deleted
2023/07/28
Committee: LIBE
Amendment 1492 #
Proposal for a regulation
Article 36 – paragraph 3
3. Member States shall ensure that, where their law enforcement authorities receive a report of the dissemination of new child sexual abuse material or of the solicitation of children forwarded to them by the EU Centre in accordance with Article 48(3), a diligent assessment is conducted in accordance with paragraph 1 and, if the material or conversation is identified as constituting child sexual abuse material or as the solicitation of children, the Coordinating Authority submits the material to the EU Centre, in accordance with that paragraph, within one monthweek from the date of reception of the report or, where the assessment is particularly complex, two months from that date.
2023/07/28
Committee: LIBE
Amendment 1494 #
Proposal for a regulation
Article 36 – paragraph 4
4. They shall also ensure that, where the diligent assessment indicates that the material does not constitute child sexual abuse material or the solicitation of children, the Coordinating Authority is informed of that outcome and subsequently informs the EU Centre thereof, within the time periods specified in the first subparagraphone week from the date of the reception of such assessment. Member States shall establish effective procedures to ensure that such material, including any associated data, which does not constitute child sexual abuse material is deleted from the records and databases at the Coordinating Authority and the Member States’ law enforcement authorities within one week after having received the notice about it.
2023/07/28
Committee: LIBE
Amendment 1498 #
Proposal for a regulation
Article 37 – paragraph 1 – subparagraph 2
Where the Commission has reason, in the reasoned opinion of the Commission, there are grounds to suspect that a provider of relevant information society services infringed this Regulation in a manner involvingcausing harm in at least three Member States, it may recommend that the Coordinating Authority of establishment assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
2023/07/28
Committee: LIBE
Amendment 1503 #
Proposal for a regulation
Article 37 – paragraph 2 – point c
(c) any other information that the Coordinating Authority that sent the request, or the Commission, considers relevant, including, where appropriate, information gathered on its own initiative and suggestions for specific investigatory or enforcement measures to be taken.
2023/07/28
Committee: LIBE
Amendment 1504 #
Proposal for a regulation
Article 37 – paragraph 3 – subparagraph 1
The Coordinating Authority of establishment shall assess the suspected infringement, taking into utmost account the request or recommendation referred to in paragraph 1.
2023/07/28
Committee: LIBE
Amendment 1507 #
Where it considers that it has insufficient information to assess the suspected infringement or to act upon the request or recommendation and has reasons to consider that the Coordinating Authority that sent the request, or the Commission, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided.
2023/07/28
Committee: LIBE
Amendment 1509 #
Proposal for a regulation
Article 37 – paragraph 4
4. The Coordinating Authority of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation referred to in paragraph 1, communicate to the Coordinating Authority that sent the request, or the Commission, the outcome of its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and, where applicable, an explanationdetails of the investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation.
2023/07/28
Committee: LIBE
Amendment 1511 #
Proposal for a regulation
Article 38 – paragraph 1 – subparagraph 1
Coordinating Authorities shall share best practice standards and guidance on the detection and removal of child sexual abuse material and may participate in joint investigations, which may be coordinated with the support of the EU Centre, of matters covered by this Regulation, concerning providers of relevant information society services that offer their services in several Member States.
2023/07/28
Committee: LIBE
Amendment 1512 #
Proposal for a regulation
Article 38 – paragraph 1 a (new)
1a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to remove content and coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
2023/07/28
Committee: LIBE
Amendment 1518 #
Proposal for a regulation
Article 39 – paragraph 2
2. The EU Centre shall establish and maintain one or moreuse the software provided by eu-LISA pursuant to Regulation (EU) [Joint Investigation Teams online collaboration platform] to establish and maintain a reliable and secure information sharing systems supporting communications between Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services. In accordance with Article 88 of Regulation (EU) 2018/1725, the EU Centre shall keep logs of its processing operations. It shall not be possible to modify the logs.
2023/07/28
Committee: LIBE
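The requirement in the amended Article 39(2) that the EU Centre keep logs of its processing operations and that "it shall not be possible to modify the logs" is often met in practice with tamper-evident, hash-chained records. The sketch below shows the general idea; the Regulation does not prescribe any particular technique.

# Hash-chained log: each entry commits to the previous one, so any later
# edit to an earlier entry breaks the chain. Illustration only.

import hashlib, json
from datetime import datetime, timezone

class AppendOnlyLog:
    def __init__(self):
        self._entries = []

    def append(self, event: dict) -> None:
        prev_hash = self._entries[-1]["hash"] if self._entries else "0" * 64
        body = {"time": datetime.now(timezone.utc).isoformat(), "event": event, "prev": prev_hash}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self._entries.append({**body, "hash": digest})

    def verify(self) -> bool:
        """Recompute every hash; any modification of an earlier entry is detected."""
        prev = "0" * 64
        for entry in self._entries:
            body = {k: entry[k] for k in ("time", "event", "prev")}
            if entry["prev"] != prev:
                return False
            if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["hash"]:
                return False
            prev = entry["hash"]
        return True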
Amendment 1522 #
Proposal for a regulation
Article 39 – paragraph 3
3. The Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation. Regulation (EU) [Joint Investigation Teams online collaboration platform] shall apply mutatis mutandis.
2023/07/28
Committee: LIBE
Amendment 1528 #
Proposal for a regulation
Chapter IV – title
IV EU CENTRE TO PREVENT AND COMBAT CHILD SEXUAL ABUSEOTECT CHILDREN
2023/07/28
Committee: LIBE
Amendment 1531 #
Proposal for a regulation
Article 40 – paragraph 1
1. A European Union Agency to prevent and combat child sexual abuseotect children, the EU Centre on Child Sexual AbuseProtection, is established.
2023/07/28
Committee: LIBE
Amendment 1542 #
Proposal for a regulation
Article 42 – paragraph 1
The choice of the location of the seat of the EU Centre shall be The Hague, The Netherlands.made in accordance with the ordinary legislative procedure. The following criteria shall in particular be respected when assessing the possible choices of location for the EU Centre: (a) it shall not affect the EU Centre’s execution of its tasks or the organisation of its governance structure; (b) it shall not compromise its independence vis-à-vis EU Member States or EU institutions, bodies and agencies, in particular Europol; (e) it shall ensure a balanced geographical distribution of EU institutions, bodies and agencies across the Union;
2023/07/28
Committee: LIBE
Amendment 1545 #
Proposal for a regulation
Article 43 – title
43 Tasks of the EU Centre on Child Protection
2023/07/28
Committee: LIBE
Amendment 1546 #
Proposal for a regulation
Article 43 – paragraph -1 (new)
-1 The objective of the Agency shall be to provide the relevant institutions, bodies, offices and agencies of the EU and its Member States, as well as civil society organisations and research bodies when involved with implementing EU law, with assistance, expertise and coordination in relation to the prevention and combating of child sexual abuse, in order to support them when taking measures or formulating courses of action within their respective spheres of competence, in full respect of fundamental rights.
2023/07/28
Committee: LIBE
Amendment 1547 #
Proposal for a regulation
Article 43 – paragraph 1 – point 1 – point a
(a) supporting the Commission in the preparation of the guidelines referred to in Article 3(86), Article 4(5), Article 6(4) and Article 11, including by collecting and providing relevant information, expertise and best practices, taking into account advice from the Technology Committee and the Survivor’s Advisory Board referred to in Article 66 and 66a (new);
2023/07/28
Committee: LIBE
Amendment 1550 #
Proposal for a regulation
Article 43 – paragraph 1 – point 1 – point b
(b) upon request from a provider of relevant information services, providing an analysis of anonymised data samples for the purpose referred to in Article 3(3);deleted
2023/07/28
Committee: LIBE
Amendment 1552 #
Proposal for a regulation
Article 43 – paragraph 1 – point 1 – point b a (new)
(ba) operating accounts, including child accounts, on publicly available number-independent interpersonal communications services and reporting relevant findings concerning the risk of solicitation of children to the Coordinating Authority of establishment; where the Centre becomes aware of potential online child sexual abuse, Article 48(3) of this Regulation shall apply mutatis mutandis;
2023/07/28
Committee: LIBE
Amendment 1555 #
Proposal for a regulation
Article 43 – paragraph 1 – point 2 – point b
(b) maintaining and operating the databases of indicators referred to in Article 44of known child sexual abuse material;
2023/07/28
Committee: LIBE
Amendment 1559 #
Proposal for a regulation
Article 43 – paragraph 1 – point 4 – introductory part
(4) facilitate the removal process referred to in Section 4 of Chapter II and the other processes referred to in Section 5 and 6 of that Chapter, by:
2023/07/28
Committee: LIBE
Amendment 1562 #
Proposal for a regulation
Article 43 – paragraph 1 – point 4 – point b
(b) cooperating with and responding to requests of Coordinating Authorities in connection to intended blocking orders as referred to in Article 16(2);deleted
2023/07/28
Committee: LIBE
Amendment 1565 #
Proposal for a regulation
Article 43 – paragraph 1 – point 4 – point c
(c) receiving and processing the blocking orders transmitted to it pursuant to Article 17(3);deleted
2023/07/28
Committee: LIBE
Amendment 1567 #
Proposal for a regulation
Article 43 – paragraph 1 – point 4 a (new)
(4a) conduct proactive searches of publicly accessible content on hosting services for known child sexual abuse material in accordance with Article 49;
2023/07/28
Committee: LIBE
Amendment 1571 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point a
(a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51, including education, awareness raising and intervention programmes, and facilitating the drafting of recommendations and guidelines on prevention and mitigation of child sexual abuse, in particular in the digital space and taking into account technological developments;
2023/07/28
Committee: LIBE
Amendment 1573 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point a a (new)
(aa) supporting awareness-raising and prevention campaigns in the Union carried out by public and private bodies, stakeholders and education institutions, and elaborating best practices in this regard;
2023/07/28
Committee: LIBE
Amendment 1574 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b
(b) supporting the development and dissemination of research and expertise on those matters and on assistance to victims, including by serving as a hub of expertise to support evidence-based policy and by linking researchers to practitioners;
2023/07/28
Committee: LIBE
Amendment 1578 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) contributing to the implementation of awareness campaigns on the potential risks posed by the online environment to children, in order to equip them with adequate skills for detecting potential grooming and deceit and to ensure safe use of the internet by children;
2023/07/28
Committee: LIBE
Amendment 1581 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b b (new)
(bb) assisting with expertise and knowledge in the development and implementation of teacher training across the Union, in order to vest teachers with the necessary skills for guiding children on safely using information society services and detecting potentially malicious behaviour online;
2023/07/28
Committee: LIBE
Amendment 1584 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b c (new)
(bc) supporting the collaboration of victim support services and elaborating best practices;
2023/07/28
Committee: LIBE
Amendment 1586 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b d (new)
(bd) supporting the exchange between law enforcement agencies and providers and elaborating best practices;
2023/07/28
Committee: LIBE
Amendment 1599 #
Proposal for a regulation
Article 44 – paragraph 1 – introductory part
1. The EU Centre shall create, maintain and operate databases of the following three types of indicators of online child sexual abuse material:
2023/07/28
Committee: LIBE
Amendment 1601 #
Proposal for a regulation
Article 44 – paragraph 1 – point a
(a) indicators to detect the dissemination of child sexual abuse material previously detected and identified as constituting child sexual abuse material in accordance with Article 36(1);
2023/07/28
Committee: LIBE
Amendment 1604 #
Proposal for a regulation
Article 44 – paragraph 1 – point b
(b) indicators to detect the dissemination of child sexual abuse material not previously detected and identified as constituting child sexual abuse material in accordance with Article 36(1);deleted
2023/07/28
Committee: LIBE
Amendment 1605 #
Proposal for a regulation
Article 44 – paragraph 1 – point c
(c) indicators to detect the solicitation of children.deleted
2023/07/28
Committee: LIBE
Amendment 1609 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
(a) relevant indicators, consisting of digital identifiers to be used to detect the dissemination of known or newknown child sexual abuse material or the solicitation of children, as applicable, on hosting services and number-independent interpersonal communications services, generated by the EU Centre in accordance with paragraph 3;
2023/07/28
Committee: LIBE
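The “digital identifiers” referred to in the amendment above are in practice hash values of material already verified as child sexual abuse material. Purely as an illustration of the matching step, and not as anything prescribed by the Regulation, a minimal Python sketch might look as follows; the indicator set, the use of SHA-256 and all names are hypothetical (deployed systems typically rely on perceptual rather than cryptographic hashes so that re-encoded copies still match):

```python
import hashlib

# Hypothetical, illustrative indicator set: hex-encoded SHA-256 digests of
# material previously identified as known child sexual abuse material.
# In a real deployment the EU Centre would distribute indicators under the
# access controls of Article 46; this in-memory set is only a stand-in.
KNOWN_INDICATORS: set[str] = {"0" * 64}  # dummy digest, not a real indicator

def compute_indicator(content: bytes) -> str:
    """Return the digital identifier (here: a SHA-256 digest) for an item of content."""
    return hashlib.sha256(content).hexdigest()

def matches_known_material(content: bytes) -> bool:
    """Check one uploaded item against the set of known indicators."""
    return compute_indicator(content) in KNOWN_INDICATORS

if __name__ == "__main__":
    print(matches_known_material(b"placeholder upload"))  # False for this placeholder
```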
Amendment 1611 #
Proposal for a regulation
Article 44 – paragraph 2 – point b
(b) as regards paragraph 1, point (a), the relevant indicators shall include a list of uniform resource locators compiled by the EU Centre in accordance with paragraph 3;deleted
2023/07/28
Committee: LIBE
Amendment 1613 #
Proposal for a regulation
Article 44 – paragraph 2 – point c
(c) the necessary additional information to facilitate the use of the indicators in accordance with this Regulation, including identifiers allowing for a distinction between images, videos and, where relevant, other types of material for the detection of the dissemination of known and newknown child sexual abuse material and language identifiers for the detection of solicitation of children.
2023/07/28
Committee: LIBE
Amendment 1614 #
Proposal for a regulation
Article 44 – paragraph 3 – subparagraph 1
The EU Centre shall generate the indicators referred to in paragraph 2, point (a), solely on the basis of thefrom child sexual abuse material and the solicitation of children identified as such by the Coordinating Authorities or the courts or other independentidentified as such by the competent judicial authorities of the Member States, submitted to it by the Coordinating Authorities pursuant to Article 36(1), point (a).
2023/07/28
Committee: LIBE
Amendment 1616 #
Proposal for a regulation
Article 44 – paragraph 3 – subparagraph 2
The EU Centre shall compile the list of uniform resource locators referred to in paragraph 2, point (b), solely on the basis of the uniform resource locators submitted to it pursuant to Article 36(1), point (b).deleted
2023/07/28
Committee: LIBE
Amendment 1617 #
Proposal for a regulation
Article 44 – paragraph 4
4. The EU Centre shall keep records of the submissions and of the process applied to generate the indicators and compile the list referred to in the first and second subparagraphs. It shall keep those records for as long as the indicators, including the uniform resource locators, to which they correspond are contained in the databases of indicators referred to in paragraph 1.
2023/07/28
Committee: LIBE
Amendment 1620 #
Proposal for a regulation
Article 45 – paragraph 1
1. The EU Centre shall create, maintain and operate a database for the reports submitted to it by providers of hosting services and providers of number-independent interpersonal communications services in accordance with Article 12(1) and assessed and processed in accordance with Article 48.
2023/07/28
Committee: LIBE
Amendment 1622 #
Proposal for a regulation
Article 45 – paragraph 2 – point b
(b) where the EU Centre considered the report unfounded or manifestly unfounded, the reasons and the date and time of informing the provider in accordance with Article 48(2);
2023/07/28
Committee: LIBE
Amendment 1629 #
Proposal for a regulation
Article 45 – paragraph 2 – point e
(e) where available, information indicating that the provider that submitted a report concerning the dissemination of known or new child sexual abuse material removed or disabled access to the material;
2023/07/28
Committee: LIBE
Amendment 1636 #
Proposal for a regulation
Article 46 – paragraph 2
2. The EU Centre shall give providers of hosting services, providers of number-independent interpersonal communications services and providers of internet access services access to the databases of indicators referred to in Article 44, where and to the extent necessary for them to execute the detection or blocking orders that they received in accordance with Articles 7 or 16. It shall take measures to ensure that such access remains limited to what is strictly necessary for the period of application of the detection or blocking orders concerned and that such access does not in any way endanger the proper operation of those databases and the accuracy and security of the data contained therein.
2023/07/28
Committee: LIBE
Amendment 1640 #
Proposal for a regulation
Article 46 – paragraph 4
4. The EU Centre shall give Europol and the competent law enforcement authorities of the Member States access to the databases of indicators referred to in Article 44 where and to the extent necessary for the performance of their tasks of investigating suspected child sexual abuse offences.
2023/07/28
Committee: LIBE
Amendment 1641 #
Proposal for a regulation
Article 46 – paragraph 4 a (new)
4a. The EU Centre shall give Europol access to the databases of indicators referred to in Article 44 only in a limited manner, such as through a hit/no-hit procedure, and solely if necessary for the performance of its tasks of investigating cross-border cases of suspected child sexual abuse offences.
2023/07/28
Committee: LIBE
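The amendment above limits Europol’s access to a hit/no-hit procedure, i.e. a query that reveals only whether a queried indicator is present in the database, not the underlying data. The following minimal sketch is purely illustrative; the class and field names are invented here and no technical interface is defined in the text:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HitNoHitResponse:
    """Response that deliberately carries no content: only a boolean and a reference."""
    hit: bool
    request_id: str  # reference for any follow-up through the formal access-request channel

class IndicatorDatabase:
    def __init__(self, indicators: set[str]) -> None:
        self._indicators = indicators

    def hit_no_hit(self, queried_indicator: str, request_id: str) -> HitNoHitResponse:
        # Only membership is disclosed; no indicator data or report data is returned.
        return HitNoHitResponse(hit=queried_indicator in self._indicators,
                                request_id=request_id)

db = IndicatorDatabase({"aa" * 32})
print(db.hit_no_hit("bb" * 32, request_id="request-2024-001"))  # hit=False
```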
Amendment 1642 #
Proposal for a regulation
Article 46 – paragraph 5
5. The EU Centre shall give Europol access to the databases of reports referred to in Article 45, where and to the extent necessary for the performance of its tasks of assisting investigations of suspected child sexual abuse offencesdeleted
2023/07/28
Committee: LIBE
Amendment 1647 #
Proposal for a regulation
Article 46 – paragraph 6 – subparagraph 1
The EU Centre shall provide the access referred to in paragraphs 2, 3, 4 and 5 only upon the reception of a request, specifying the purpose of the request, the modalities of the requested access, and the degree of access needed to achieve that purpose. The requests for the access referred to in paragraph 2 shall also include a reference to the detection order or the blocking order, as applicable.
2023/07/28
Committee: LIBE
Amendment 1648 #
Proposal for a regulation
Article 46 – paragraph 6 – subparagraph 2
The EU Centre shall duly and diligently assess those requests on a case-by-case basis, and only grant access where it considers that the requested access is necessary for and proportionate to the specified purpose. Where it considers that an access request by Europol is necessary and proportionate, it shall provide the relevant data to Europol via the Secure Information Exchange Network Application (SIENA).
2023/07/28
Committee: LIBE
Amendment 1650 #
Proposal for a regulation
Article 46 – paragraph 7
7. The EU Centre shall regularly verify that the data contained in the databases referred to in Articles 44 and 45 is, in all respects, complete, accurate and up-to-date and continues to be necessary for the purposes of reporting, detection and blocking in accordance with this Regulation, as well as for the facilitation and monitoring of accurate detection technologies and processes. In particular, as regards the uniform resource locators contained in the database referred to in Article 44(1), point (a), the EU Centre shall, where necessary in cooperation with the Coordinating Authorities, regularly verify that the conditions of Article 36(1), point (b), continue to be met. Those verifications shall include audits, where appropriate. Where necessary in view of those verifications, it shall immediately complement, adjust or delete the data.
2023/07/28
Committee: LIBE
Amendment 1658 #
Proposal for a regulation
Article 47 – paragraph 1 – point b
(b) the processing of the submissions by Coordinating Authorities, the generation of the indicators, the compilation of the list of uniform resource locators and the record-keeping, referred to in Article 44(3);
2023/07/28
Committee: LIBE
Amendment 1660 #
Proposal for a regulation
Article 47 – paragraph 1 – point d
(d) access to the databases referred to in Articles 44 and 45, including the modalities of the access referred to in Article 46(1) to (5), the content, processing and assessment of the requests referred to in Article 46(6), procedural matters related to such requests and the necessary measures referred to in Article 46(6);deleted
2023/07/28
Committee: LIBE
Amendment 1665 #
Proposal for a regulation
Article 48 – paragraph 1
1. The EU Centre shall expeditiously assess and process reports submitted by providers of hosting services and providers of interpersonal communications services in accordance with Article 12 to determine whether the reports are manifestly unfounded or are to be forwarded.
2023/07/28
Committee: LIBE
Amendment 1668 #
Proposal for a regulation
Article 48 – paragraph 2
2. Where the EU Centre considers that the report is manifestly unfounded, it shall inform the provider that submitted the report, specifying the reasons why it considers the report to be unfounded. In cases of unfounded reports the EU Centre shall capture a cryptographic hash value from the reported file and shall store it, together with the name of the provider who submitted the report and the date when it was submitted, solely for statistical purposes. The unfounded report and any personal data related to it shall be deleted no later than 24 hours after the provider was informed.
2023/07/28
Committee: LIBE
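The retention rule in the amendment above can be summarised as: keep only a cryptographic hash of the reported file plus the provider name and submission date for statistics, and erase the report and any related personal data within 24 hours of informing the provider. A minimal illustrative sketch, with hypothetical names and SHA-256 assumed as the hash function:

```python
import hashlib
from datetime import datetime, timedelta, timezone

# Statistical record kept for unfounded reports: only a hash of the reported
# file, the submitting provider and the submission date (no personal data).
stats_log: list[dict] = []

def handle_unfounded_report(reported_file: bytes, provider: str,
                            submitted_at: datetime, informed_at: datetime) -> datetime:
    """Record the statistical entry and return the deadline for erasing the report."""
    stats_log.append({
        "sha256": hashlib.sha256(reported_file).hexdigest(),
        "provider": provider,
        "submitted_at": submitted_at.isoformat(),
    })
    # The report and any related personal data must be deleted no later than
    # 24 hours after the provider was informed.
    return informed_at + timedelta(hours=24)

deadline = handle_unfounded_report(b"...", "example-provider",
                                   datetime(2024, 1, 10, tzinfo=timezone.utc),
                                   datetime(2024, 1, 11, tzinfo=timezone.utc))
print(deadline)  # 2024-01-12 00:00:00+00:00
```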
Amendment 1671 #
Proposal for a regulation
Article 48 – paragraph 3 – subparagraph 1
Where, after a thorough legal and factual assessment, the EU Centre considers that a report is not manifestly unfoundedunfounded and actionable, it shall forward the report, together with any additional relevant information available to it, to Europol and to the competent law enforcement authority or authorities of the Member State likely to have jurisdiction to investigate or prosecute the potential child sexual abuse to which the report relates.
2023/07/28
Committee: LIBE
Amendment 1675 #
Proposal for a regulation
Article 48 – paragraph 3 – subparagraph 2
WOnly where that competent law enforcement authority or those competent law enforcement authorities cannot be determined with sufficient certainty by a thorough factual assessment, the EU Centre shall forward the report, together with any additional relevant information available to it, to Europol, for further analysis and subsequent referral by Europol to the competent law enforcement authority or authorities.
2023/07/28
Committee: LIBE
Amendment 1682 #
Proposal for a regulation
Article 48 – paragraph 8 a (new)
8a. The EU Centre shall not retain the personal data contained in the reports it receives for a period longer than two working days. This period may be extended by up to one week where duly justified and documented.
2023/07/28
Committee: LIBE
Amendment 1683 #
Proposal for a regulation
Article 48 – paragraph 8 b (new)
8b. The EU Centre shall keep logs for any of the following processing operations in automated processing systems: the entry, alteration, access, consultation, disclosure, combination and erasure of personal data. The logs of consultation and disclosure shall make it possible to establish the justification for, and the date and time of, such operations, the identification of the person who consulted or disclosed operational personal data, and, as far as possible, the identity of the recipients. These logs shall be used for verification of the lawfulness of processing, for self-monitoring, and for ensuring its integrity and security. These logs shall be made available to the EU Centre’s data protection officer and to the EDPS on request. Such logs shall be deleted after three years, unless they are required for ongoing control.
2023/07/28
Committee: LIBE
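The logging obligation in the amendment above maps naturally onto a structured log record per processing operation. The sketch below is only an illustration of what such a record could capture (operation type, timestamp, acting person, justification, recipients, and a three-year retention horizon); all names are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

OPERATIONS = {"entry", "alteration", "access", "consultation",
              "disclosure", "combination", "erasure"}

@dataclass
class ProcessingLogEntry:
    operation: str            # one of OPERATIONS
    timestamp: datetime
    actor: str                # person who performed, consulted or disclosed
    justification: str        # required for consultation and disclosure
    recipients: list[str] = field(default_factory=list)  # recorded as far as possible

    def expires_at(self) -> datetime:
        # Logs are deleted after three years unless needed for ongoing control
        # (approximated here as 3 x 365 days).
        return self.timestamp + timedelta(days=3 * 365)

entry = ProcessingLogEntry(
    operation="consultation",
    timestamp=datetime.now(timezone.utc),
    actor="analyst-42",
    justification="verification of report forwarding under Article 48(3)",
)
assert entry.operation in OPERATIONS
print(entry.expires_at())
```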
Amendment 1687 #
Proposal for a regulation
Article 49 – paragraph 1 – introductory part
1. The EU Centre shall have the power to conduct searches of publicly accessible content on hosting services for the dissemination of publicly accessibleknown child sexual abuse material, using the relevant indicators from the database of indicators referred to in Article 44(1), points (a) and (b), in the following situations:
2023/07/28
Committee: LIBE
Amendment 1690 #
Proposal for a regulation
Article 49 – paragraph 1 – point b a (new)
(ba) proactively of its own initiative by systematically and automatically analysing and following publicly accessible uniform resource locators (web crawling).
2023/07/28
Committee: LIBE
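The proactive search described above amounts to a crawler that follows publicly accessible URLs and compares fetched content against the indicators of known material. The sketch below is a hypothetical illustration only: the fetcher is injected so the example runs without network access, and a real crawler would additionally need politeness rules, robots handling and perceptual rather than exact hashing:

```python
import hashlib
from collections import deque
from typing import Callable, Iterable

def crawl_for_known_material(
    seed_urls: Iterable[str],
    fetch: Callable[[str], tuple[bytes, list[str]]],
    known_indicators: set[str],
    max_pages: int = 1000,
) -> list[str]:
    """Follow publicly accessible URLs and collect those whose content matches
    an indicator of known child sexual abuse material."""
    queue, seen, hits = deque(seed_urls), set(), []
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        content, links = fetch(url)  # fetch the page body and its outgoing links
        if hashlib.sha256(content).hexdigest() in known_indicators:
            hits.append(url)         # candidate for handling under Article 48(3)
        queue.extend(link for link in links if link not in seen)
    return hits

# Stub fetcher so the sketch runs without network access.
def fake_fetch(url: str) -> tuple[bytes, list[str]]:
    return b"harmless page", []

print(crawl_for_known_material(["https://example.org/"], fake_fetch, {"0" * 64}))  # []
```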
Amendment 1702 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 3
Before including specific technologies on those lists, the EU Centre shall request the authoritative opinion of its Technology Committee and of the European Data Protection Board, which it shall fully take into account. The Technology Committee and the European Data Protection Board shall deliver their respective opinions within eight weeks. That period may be extended by a further six weeks where necessary, taking into account the complexity of the subject matter. The Technology Committee and the European Data Protection Board shall inform the EU Centre of any such extension within one month of receipt of the request for consultation, together with the reasons for the delay. The EU Centre shall inform the European Data Protection Board of the action it has taken following its opinion; the European Data Protection Board shall have the right to object to the inclusion of the specific technology in the lists if it deems that its opinion has not been duly taken into consideration. This opinion shall be without prejudice to the case-by-case assessment of the intended processing by the relevant controller under Articles 35 and 36 of Regulation (EU) 2016/679.
2023/07/28
Committee: LIBE
Amendment 1707 #
Proposal for a regulation
Article 50 – paragraph 2 – introductory part
2. The EU Centre shall collect, record, analyse andggregate, analyse and proactively make available relevant, objective, reliable and comparable information on matters related to the prevention and combating of child sexual abuse to relevant bodies, Member States, EU institutions and relevant civil society organisations and research institutes, in particular:
2023/07/28
Committee: LIBE
Amendment 1710 #
Proposal for a regulation
Article 50 – paragraph 2 – point a
(a) information obtained in the performance of its tasks under this Regulation concerning detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 1712 #
Proposal for a regulation
Article 50 – paragraph 2 – point c a (new)
(ca) information obtained in the performance of its tasks under this Regulation concerning victim assistance and support.
2023/07/28
Committee: LIBE
Amendment 1714 #
Proposal for a regulation
Article 50 – paragraph 3 a (new)
3a. The outcome of research, surveys or studies carried out or led by the EU Centre shall be made publicly available.
2023/07/28
Committee: LIBE
Amendment 1715 #
Proposal for a regulation
Article 50 – paragraph 4
4. The EU Centre shall provide the information referred to in paragraph 2 and the information resulting from the research, surveys and studies referred to in paragraph 3, including its analysis thereof, and its opinions on matters related to the prevention and combating of online child sexual abuse to other Union institutions, bodies, offices and agencies, Coordinating Authorities, other competent authorities and other public authorities of the Member States, either on its own initiative or at request of the relevant authority. Where appropriate, the EU Centre shall make such information publicly available.deleted
2023/07/28
Committee: LIBE
Amendment 1717 #
Proposal for a regulation
Article 50 – paragraph 5
5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communications services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. Communication campaigns shall be easily understandable and accessible to all children, their families and educators in formal and non-formal education in the Union, aiming to improve digital literacy and foster a safe digital environment for children.
2023/07/28
Committee: LIBE
Amendment 1719 #
Proposal for a regulation
Article 51 – title
Processing activities and data protectionprinciples of processing
2023/07/28
Committee: LIBE
Amendment 1732 #
Proposal for a regulation
Article 51 – paragraph 3
3. The EU Centre shall store the personal data referred to in paragraph 2 only where and for as long as strictly necessary for the applicable purposes listed in paragraph 2.deleted
2023/07/28
Committee: LIBE
Amendment 1733 #
Proposal for a regulation
Article 51 – paragraph 3 a (new)
3a. Personal data referred to in paragraph 2 shall be processed under the following principles. They shall be:
(a) processed lawfully and fairly (‘lawfulness and fairness’);
(b) collected for specified, explicit and legitimate purposes and not processed in a manner that is incompatible with those purposes (‘purpose limitation’);
(c) adequate, relevant, and not excessive in relation to the purposes for which they are processed (‘data minimisation’);
(d) accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’);
(e) kept in a form which permits identification of data subjects for no longer than is strictly necessary for the purposes for which the personal data are processed (‘storage limitation’).
2023/07/28
Committee: LIBE
Amendment 1734 #
Proposal for a regulation
Article 51 – paragraph 4
4. It shall ensure that the personal data is stored in a secure manner and that the storage is subject to appropriate technical and organisational safeguards. Those safeguards shall ensure, in particular, that the personal data can be accessed and processed only for the purpose for which it is stored, that a high level of security is achieved and that the personal data is deleted when no longer strictly necessary for the applicable purposes. It shall regularly review those safeguards and adjust them where necessary.deleted
2023/07/28
Committee: LIBE
Amendment 1736 #
Proposal for a regulation
Article 51 a (new)
Article 51a
Data protection and security
1. The EU Centre shall adopt the necessary measures, including a security plan and a disaster recovery plan for its IT systems, databases and the Communication Infrastructure, in order to:
(a) physically protect data, including by making contingency plans for the protection of critical infrastructure;
(b) deny unauthorised persons access to data-processing facilities used for processing personal data (facilities access control);
(c) prevent the unauthorised reading, copying, modification or removal of data media (data media control);
(d) prevent the unauthorised input of data and the unauthorised inspection, modification or deletion of stored personal data (storage control);
(e) prevent the use of automated data-processing systems by unauthorised persons using data communication equipment (user control);
(f) prevent the unauthorised processing of data in the databases and any unauthorised modification or erasure of data processed in the databases (control of data entry);
(g) ensure that persons authorised to use an automated data-processing system have access only to the data covered by their access authorisation by means of individual and unique user identifiers and confidential access modes only (data access control);
(h) create profiles describing the functions and responsibilities of all persons who are authorised to access the data or the data-processing facilities and make those profiles and any other relevant information for supervisory purposes available to the European Data Protection Supervisor without delay upon its request (personnel profiles);
(i) ensure that it is possible to verify and establish to which bodies personal data may be transmitted using data communication equipment (communication control);
(j) ensure that it is subsequently possible to verify and establish which personal data have been input into automated data-processing systems, when and by whom (input control);
(k) prevent the unauthorised reading, copying, modification or deletion of personal data during the transmission of personal data or during the transport of data media, in particular by means of appropriate encryption techniques (transport control);
(l) monitor the effectiveness of the security measures referred to in this paragraph and take the necessary organisational measures related to internal monitoring to ensure compliance with this Regulation (self-auditing) and to automatically detect within 24 hours any relevant events arising from the application of measures listed in points (b) to (k) that might indicate the occurrence of a security incident;
(m) ensure that, in the event of interrupted operations, installed systems can be restored to normal operation (recovery);
(n) ensure that the databases perform their functions correctly, that faults are reported (reliability) and that personal data stored in the databases cannot be corrupted by means of system malfunctioning (integrity); and
(o) ensure the security of its technical sites.
2. The EU Centre shall take measures equivalent to those referred to in paragraph 1 as regards security in respect of the processing and exchange of supplementary information through the Communication Infrastructure.
2023/07/28
Committee: LIBE
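Point (g) of the amendment above (data access control via individual and unique user identifiers) can be illustrated with a simple authorisation check against a per-person access profile; the structure and names below are hypothetical and not derived from the text:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessProfile:
    """Profile describing the functions and data categories a person may access."""
    user_id: str                        # individual and unique user identifier
    allowed_categories: frozenset[str]  # data covered by the access authorisation

def authorise(profile: AccessProfile, user_id: str, category: str) -> bool:
    """Grant access only within the user's access authorisation (data access control)."""
    return profile.user_id == user_id and category in profile.allowed_categories

analyst = AccessProfile(user_id="analyst-42",
                        allowed_categories=frozenset({"indicators"}))
print(authorise(analyst, "analyst-42", "indicators"))  # True
print(authorise(analyst, "analyst-42", "reports"))     # False: outside the authorisation
```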
Amendment 1738 #
Proposal for a regulation
Article 53 – paragraph 1
1. Where necessary for the performance of its tasks under this Regulation, within their respective mandates, the EU Centre shall cooperate withseek advice from Europol.
2023/07/28
Committee: LIBE
Amendment 1740 #
Proposal for a regulation
Article 53 – paragraph 2
2. Europol and the EU Centre shall provide each other with the fullest possible access to relevant information and information systems, where necessary for the performance of their respective tasks and in accordance with the acts of Union law regulating such access. Without prejudice to the responsibilities of the Executive Director, the EU Centre shall maximise efficiency by sharing administrative functions with Europol, including functions relating to personnel management, information technology (IT) and budget implementation.deleted
2023/07/28
Committee: LIBE
Amendment 1750 #
Proposal for a regulation
Article 53 – paragraph 3
3. The terms of cooperation and working arrangements shall be laid down in a memorandum of understanding.deleted
2023/07/28
Committee: LIBE
Amendment 1763 #
Proposal for a regulation
Article 55 – paragraph 1 – point d a (new)
(da) a Survivors’ Advisory Board, which shall exercise the tasks set out in Article 66a.
2023/07/28
Committee: LIBE
Amendment 1766 #
Proposal for a regulation
Article 56 – paragraph 1
1. The Management Board shall be composed of one representative from each Member State and twoone representatives of the Commission and one representative of the European Parliament, all as members with voting rights.
2023/07/28
Committee: LIBE
Amendment 1769 #
Proposal for a regulation
Article 56 – paragraph 1 – subparagraph 1 (new)
One member of the Technology Committee and one member of the Survivors’ Advisory Board, as established in Articles 66 and 66a, may attend the meetings of the Management Board as observers.
2023/07/28
Committee: LIBE
Amendment 1770 #
Proposal for a regulation
Article 56 – paragraph 2 – subparagraph 1
The Management Board shall also include one independent expert observer designated by the European Parliament, without the right to vote.deleted
2023/07/28
Committee: LIBE
Amendment 1771 #
Proposal for a regulation
Article 56 – paragraph 2 – subparagraph 2
Europol may designate a representative to attend the meetings of the Management Board as an observer on matters involving Europol, at the request of the Chairperson of the Management Board.deleted
2023/07/28
Committee: LIBE
Amendment 1774 #
Proposal for a regulation
Article 56 – paragraph 3
3. Each member of the Management Board shall have an alternate. The alternate shall represent the member in his/their absence.
2023/07/28
Committee: LIBE
Amendment 1776 #
Proposal for a regulation
Article 56 – paragraph 4
4. Members of the Management Board and their alternates shall be appointed in the light of their knowledgproven expertise in the field of preventing and combating child sexual abuse and victim support, taking into account relevant managerial, administrative and budgetary skills. Member States shall appoint a representative of their Coordinating Authority, within four months of [date of entry into force of this Regulation]. All parties represented in the Management Board shall make efforts to limit turnover of their representatives, in order to ensure continuity of its work. All parties shall aim to achieve a balanced representation between men and women on the Management Board.
2023/07/28
Committee: LIBE
Amendment 1777 #
Proposal for a regulation
Article 57 – paragraph 1 – point c
(c) adopt rules for the prevention and management of conflicts of interest in respect of its members, as well as for the members of the Technological Committee and of any other advisory group it may establishthe Survivors’ Advisory Board and publish annually on its website the declaration of interests of the members of the Management Board;
2023/07/28
Committee: LIBE
Amendment 1783 #
Proposal for a regulation
Article 57 – paragraph 1 – point h a (new)
(ha) consult the Survivors’ Advisory Board as regards the obligations referred to in points (a) and (h) of this Article.
2023/07/28
Committee: LIBE
Amendment 1787 #
Proposal for a regulation
Article 61 – paragraph 1 – subparagraph 1
The Executive Board shall be composed of the Chairperson and the Deputy Chairperson of the Management Board, two other members appointed by the Management Board from among its members with the right to vote and twoone representatives of the Commission toand the Management BoardEuropean Parliament respectively. The Chairperson of the Management Board shall also be the Chairperson of the Executive Board.
2023/07/28
Committee: LIBE
Amendment 1790 #
Proposal for a regulation
Article 62 – paragraph 2 – point p
(p) authorise the conclusion of memoranda of understanding referred to in Article 53(3) and Article 54(2).
2023/07/28
Committee: LIBE
Amendment 1792 #
Proposal for a regulation
Article 64 – paragraph 4 – point e a (new)
(ea) implementing gender mainstreaming and gender budgeting in all areas, including drafting a gender action plan (GAP);
2023/07/28
Committee: LIBE
Amendment 1793 #
Proposal for a regulation
Article 64 – paragraph 4 – point f
(f) preparing the Consolidated Annual Activity Report (CAAR) on the EU Centre’s activities, including the activities of the Technology Committee and the Survivors’ Advisory Board, and presenting it to the Executive Board for assessment and adoption;
2023/07/28
Committee: LIBE
Amendment 1794 #
Proposal for a regulation
Article 64 – paragraph 4 – point g
(g) preparing an action plan following-up conclusions of internal or external audit reports and evaluations, as well as investigations by the European Anti-Fraud Office (OLAF) and by the European Public Prosecutor’s Office (EPPO) and reporting on progress twice a year to the Commission and the European Parliament and regularly to the Management Board and the Executive Board;
2023/07/28
Committee: LIBE
Amendment 1795 #
Proposal for a regulation
Article 66 – paragraph 1
1. The Technology Committee shall consist of technical and data protection experts appointed by the Management Board in view of their excellence and their independence from corporate interests, following the publication of a call for expressions of interest in the Official Journal of the European Union. Its members shall be appointed for a term of four years, renewable once. On the expiry of their term of office, members shall remain in office until they are replaced or until their appointments are renewed. If a member resigns before the expiry of his or her term of office, he or she shall be replaced for the remainder of the term by a member appointed by the Management Board.
2023/07/28
Committee: LIBE
Amendment 1800 #
Proposal for a regulation
Article 66 – paragraph 4
4. When a member no longer meets the criteria of independence, he or she shall inform the Management Board. Alternatively, the Management Board may declare, on a proposal of at least one third of its members or of the Commission, a lack of independence and revoke appointment of the person concerned. The Management Board shall appoint a new member for the remaining term of office in accordance with the procedure for ordinary members.
2023/07/28
Committee: LIBE
Amendment 1805 #
Proposal for a regulation
Article 66 a (new)
Article 66a
Establishment and tasks of the Survivors’ Advisory Board
1. The Survivors’ Advisory Board shall consist of seven members who are either survivors and victims of child sexual abuse or experts on the needs of survivors and victims of child sexual abuse, and shall be appointed by the Management Board in view of their personal experience where applicable, their expertise and their scope of work, following the publication of a call for expressions of interest in the Official Journal of the European Union. The Survivors’ Advisory Board shall ensure representation of all protected characteristics.
2. Procedures concerning the appointment of the members of the Survivors’ Advisory Board and its operation shall be further specified in the rules of procedure of the Management Board and shall be made public.
3. The members of the Survivors’ Advisory Board shall act in the interest of child sexual abuse victims. The EU Agency shall publish the list of members of the Survivors’ Advisory Board on its website and keep it up to date.
4. If a member no longer meets the criterion of independence, he or she shall inform the Management Board. The Management Board may, on the proposal of at least one third of its members or of the Commission, determine a lack of independence and revoke the appointment of the person concerned. The Management Board shall appoint a new member for the remaining term of office in accordance with the procedure applicable to ordinary members. If a member resigns before the expiry of his or her term of office, he or she shall be replaced for the remaining term of office in accordance with the procedure applicable to ordinary members.
5. The term of office of the members of the Survivors’ Advisory Board shall be four years. It may be renewed once.
6. The Executive Director and the Management Board shall consult the Survivors’ Advisory Board on any matter relating to victims’ rights and the prevention and combating of child sexual abuse, and shall hold a structured consultation with it at least twice a year.
7. The Survivors’ Advisory Board shall have the following tasks:
(a) ensure visibility of the interests and needs of survivors and victims of child sexual abuse;
(b) advise the Management Board on the matters set out in Article 57, point (ha);
(c) advise the Executive Director and the Management Board as set out in paragraph 6 of this Article;
(d) contribute experience and expertise in preventing and combating child sexual abuse and in victim support and assistance;
(e) serve as a platform for survivors of child sexual abuse to exchange and connect;
(f) provide an annual activity report to the Executive Director as part of the Consolidated Annual Activity Report.
2023/07/28
Committee: LIBE
Amendment 1810 #
Proposal for a regulation
Article 83 – paragraph 1 – introductory part
1. Providers of hosting services, and providers of number-independent interpersonal communications services and providers of internet access services shall collect data on the following topics and make that information available to the EU Centre upon requestpublic:
2023/07/28
Committee: LIBE
Amendment 1813 #
Proposal for a regulation
Article 83 – paragraph 1 – point a – indent 1
– the measures taken to comply with the order, including the technologies used for that purpose and the safeguards provided;
2023/07/28
Committee: LIBE
Amendment 1814 #
Proposal for a regulation
Article 83 – paragraph 1 – point a – indent 2
– the errorfalse positives and false negative rates of the technologies deployed to detect online child sexual abuse and measuresteps taken to prevent or remedy any errorsmitigate the harm caused by any inaccuracy;
2023/07/28
Committee: LIBE
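The false positive rate and false negative rate referred to above are conventionally derived from a confusion matrix: FPR = FP / (FP + TN) and FNR = FN / (FN + TP). A minimal sketch with purely illustrative figures:

```python
def error_rates(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
    """Return (false positive rate, false negative rate) from a confusion matrix."""
    fpr = fp / (fp + tn) if (fp + tn) else 0.0  # share of lawful content wrongly flagged
    fnr = fn / (fn + tp) if (fn + tp) else 0.0  # share of abusive content missed
    return fpr, fnr

# Illustrative figures only: 90 true hits, 10 false alarms, 9900 correct passes, 5 misses.
print(error_rates(tp=90, fp=10, tn=9900, fn=5))  # approximately (0.0010, 0.0526)
```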
Amendment 1817 #
Proposal for a regulation
Article 83 – paragraph 1 – point b
(b) the number of removal orders issued to the provider in accordance with Article 14 and the average time needed for removing or disabling access to the item or items of child sexual abuse material in question, counting from the moment the order entered the provider’s system;
2023/07/28
Committee: LIBE
Amendment 1819 #
Proposal for a regulation
Article 83 – paragraph 1 – point b a (new)
(ba) the number and duration of delays to removals as a result of requests from competent authorities or law enforcement authorities;
2023/07/28
Committee: LIBE
Amendment 1820 #
Proposal for a regulation
Article 83 – paragraph 1 – point c
(c) the total number of items of child sexual abuse material that the provider removed or to which it disabled access, broken down by whether the items were removed or access thereto was disabled pursuant to a removal order or to a notice submitted by a Judicial Authority, Competent Authority, the EU Centre or a third party, a national hotline, a trusted flagger, or a private individual or at the provider’s own initiative;
2023/07/28
Committee: LIBE
Amendment 1822 #
Proposal for a regulation
Article 83 – paragraph 1 – point c a (new)
(ca) the number of instances that the provider was asked to provide additional support to law enforcement authorities in relation to content that was removed;
2023/07/28
Committee: LIBE
Amendment 1823 #
Proposal for a regulation
Article 83 – paragraph 1 – point d
(d) the number of blocking orders issued to the provider in accordance with Article 16;deleted
2023/07/28
Committee: LIBE
Amendment 1827 #
Proposal for a regulation
Article 83 – paragraph 1 – point e
(e) the number of instances in which the provider invoked Article 8(3), Article 14(5) or (6) or Article 17(5), together with the grounds therefor;
2023/07/28
Committee: LIBE
Amendment 1831 #
Proposal for a regulation
Article 83 – paragraph 2 – introductory part
2. The Coordinating Authorities shall collect data on the following topics and make that information available to the EU Centre upon requestpublicly available, redacting operationally sensitive data as appropriate and providing an unredacted version to the EU Centre:
2023/07/28
Committee: LIBE
Amendment 1832 #
Proposal for a regulation
Article 83 – paragraph 2 – point a – indent -1 (new)
-1 the nature of the report and its key characteristics, such as whether the security of the hosting service was allegedly breached;
2023/07/28
Committee: LIBE
Amendment 1835 #
Proposal for a regulation
Article 83 – paragraph 2 – point b
(b) the most important and recurrent risktypes and characteristics of online child sexual abuse encountered, as reported by providers of hosting services and providers of number-independent interpersonal communications services in accordance with Article 3 or identified through other information available to the Coordinating Authority;
2023/07/28
Committee: LIBE
Amendment 1839 #
Proposal for a regulation
Article 83 – paragraph 2 – point c
(c) a list of the providers of hosting services and providers of number-independent interpersonal communications services to which the Coordinating Authority addressed a detecn investigation order in accordance with Article 7;
2023/07/28
Committee: LIBE
Amendment 1843 #
Proposal for a regulation
Article 83 – paragraph 2 – point f
(f) the number of removal orders issued in accordance with Article 14, broken down by provider, the time needed to remove or disable access to the item or items of child sexual abuse material concerned, including the time it took the Coordinating Authority to process the order, and the number of instances in which the provider invoked Article 14(5) and (6);
2023/07/28
Committee: LIBE
Amendment 1847 #
Proposal for a regulation
Article 83 – paragraph 2 – point g
(g) the number of blocking orders issued in accordance with Article 16, broken down by provider, and the number of instances in which the provider invoked Article 17(5);deleted
2023/07/28
Committee: LIBE
Amendment 1852 #
Proposal for a regulation
Article 83 – paragraph 3 – introductory part
3. The EU Centre shall collect data and generate statistics on the detection, reporting, removal of or disabling of access to online child sexual abuse under this Regulation. The data shall be in particular on the following topicsinclude:
2023/07/28
Committee: LIBE
Amendment 1855 #
Proposal for a regulation
Article 83 – paragraph 3 – point a
(a) the number of indicators in the databases of indicators referred to in Article 44 and the developmentchange of that number as compared to previous years;
2023/07/28
Committee: LIBE
Amendment 1857 #
Proposal for a regulation
Article 83 – paragraph 3 – point b
(b) the number of submissions of child sexual abuse material and solicitation of children referred to in Article 36(1), broken down by Member State that designated the submitting Coordinating Authorities, and, in the case of child sexual abuse material, the number of indicators generated on the basis thereof and the number of uniform resource locators included in the list of uniform resource locators in accordance with Article 44(3);
2023/07/28
Committee: LIBE
Amendment 1859 #
Proposal for a regulation
Article 83 – paragraph 3 – point c
(c) the total number of reports submitted to the EU Centre in accordance with Article 12, broken down by provider of hosting services and provider of number-independent interpersonal communications services that submitted the report and by Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3);
2023/07/28
Committee: LIBE
Amendment 1861 #
Proposal for a regulation
Article 83 – paragraph 3 – point c a (new)
(ca) the total number of reports forwarded to Europol in accordance with Article 48, as well as the number of access requests received from Europol under Article 46(4) and 46(5), including the number of those requests granted and refused by the EU Centre.
2023/07/28
Committee: LIBE
Amendment 1862 #
Proposal for a regulation
Article 83 – paragraph 3 – point d
(d) the online child sexual abuse to which the reports relate, including the number of items of potential known and new child sexual abuse material and instances of potential solicitation of children, the Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3), and type of relevant information society service that the reporting provider offers;
2023/07/28
Committee: LIBE
Amendment 1864 #
Proposal for a regulation
Article 83 – paragraph 3 – point e
(e) the number of reports that the EU Centre considered unfounded or manifestly unfounded, as referred to in Article 48(2);
2023/07/28
Committee: LIBE
Amendment 1866 #
Proposal for a regulation
Article 83 – paragraph 3 – point f
(f) the number of reports relating to potential newunknown child sexual abuse material and solicitation of children that were assessed as not constituting child sexual abuse material of which the EU Centre was informed pursuant to Article 36(3), broken down by Member State;
2023/07/28
Committee: LIBE
Amendment 1868 #
Proposal for a regulation
Article 83 – paragraph 3 – point h
(h) where materially the same item of potential child sexual abuse material was reported more than once to the EU Centre in accordance with Article 12 or detected more than once through the searches in accordance with Article 49(1), the number of times that that item was reported or detected in that manner.
2023/07/28
Committee: LIBE
Amendment 1869 #
Proposal for a regulation
Article 83 – paragraph 3 – point j
(j) the number of victims of online child sexual abuse assisted by the EU Centre pursuant to Article 21(2), and the number of these victims that requested to receive such assistance in a manner accessible to them due to disabilities.
2023/07/28
Committee: LIBE
Amendment 1872 #
Proposal for a regulation
Article 83 – paragraph 4
4. The providers of hosting services, and providers of number-independent interpersonal communications services and providers of internet access services, the Coordinating Authorities and the EU Centre shall ensure that the data referred to instored pursuant to paragraphs 1, 2 and 3, respectively, is stored no longer than is necessary for the transparency reporting referred to in Article 84. The data stored shall not contain any personal data.
2023/07/28
Committee: LIBE
Amendment 1876 #
Proposal for a regulation
Article 83 – paragraph 5
5. They shall ensure that the data is stored in a secure manner and that the storage is subject to appropriate technical and organisational safeguards. Those safeguards shall ensure, in particular, that the data can be accessed and processed only for the purpose for which it is stored, that a high level of security is achieved and that the information is deleted when no longer necessary for that purpose. All access to this data shall be logged and the logs securely stored for five years. They shall regularly review those safeguards and adjust them where necessary.
2023/07/28
Committee: LIBE
Amendment 1878 #
Proposal for a regulation
Article 84 – paragraph 1
1. Each provider of relevant information society services shall draw up an annual report on its activities under this Regulation. That report shall compile the information referred to in Article 83(1). The providers shall, by 31 JanuaryMarch of every year subsequent to the year to which the report relates, make the report available to the public and communicate it to the Coordinating Authority of establishment, the Commission and the EU Centre.
2023/07/28
Committee: LIBE