101 Amendments of Marion WALSMANN related to 2022/0155(COD)
Amendment 159 #
Proposal for a regulation
Recital 1
(1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children. But they are also used by perpetrators of child sexual abuse offences. Such offences, which are subject to minimum rules set at Union level, are extremely serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), and to protect society at large. Users of such services offered in the Union should be able to trust that the services concerned can be used safely, especially by children.
Amendment 160 #
Proposal for a regulation
Recital 1 a (new)
(1 a) Regulatory measures to address the dissemination of child sexual abuse content online should be complemented by Member States’ strategies, including raising public awareness, informing about how to seek child-friendly and age-appropriate reporting and assistance, and informing about victims’ rights. Additionally, Member States should make sure they have a child-friendly justice system in place in order to avoid further victimisation of the abused children.
Amendment 162 #
Proposal for a regulation
Recital 2
(2) Given the central importance of relevant information society services for the digital single market, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, effective, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services.
Amendment 163 #
Proposal for a regulation
Recital 3
(3) On the one hand, it is very positive that Member States are aware of the existing problem and are therefore increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. On the other hand, the internet and the service provision concerned have an inherently cross-border nature, and therefore diverging national laws have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.
Amendment 166 #
Proposal for a regulation
Recital 4
(4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform, carefully balanced and proportionate rules to prevent and combat child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in a technology-neutral and future-proof manner, so as not to hamper innovation.
Amendment 171 #
Proposal for a regulation
Recital 7
(7) This Regulation should be without prejudice to the rules resulting from other Union acts, in particular Directive 2011/93/EU of the European Parliament and of the Council38, Directive 2000/31/EC of the European Parliament and of the Council39 and Regulation (EU) 2022/2065 of the European Parliament and of the Council40, Directive 2010/13/EU of the European Parliament and of the Council41, Regulation (EU) 2016/679 of the European Parliament and of the Council42, and Directive 2002/58/EC of the European Parliament and of the Council43. _________________ 38 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1). 39 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1). 40 Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) (OJ L 277, 27.10.2022, p. 1). 41 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (OJ L 95, 15.4.2010, p. 1). 42 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (OJ L 119, 4.5.2016, p. 1). 43 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (‘Directive on privacy and electronic communications’) (OJ L 201, 31.7.2002, p. 37).
Amendment 172 #
Proposal for a regulation
Recital 8
(8) This Regulation should be considered lex specialis in relation to the generally applicable framework set out in Regulation (EU) 2022/2065 laying down harmonised rules on the provision of certain information society services in the internal market. The rules set out in Regulation (EU) 2022/2065 apply in respect of issues that are not or not fully addressed by this Regulation.
Amendment 174 #
Proposal for a regulation
Recital 10
(10) In the interest of clarity and consistency, the definitions provided for in this Regulation should, where possible and appropriate, be based on and aligned with the relevant definitions contained in other acts of Union law, such as Regulation (EU) 2022/2065.
Amendment 176 #
Proposal for a regulation
Recital 13
(13) In order to allow a future-proof approach, the term ‘online child sexual abuse’ should cover not only the dissemination of material previously detected and confirmed as constituting child sexual abuse material (‘known’ material), but also of material not previously detected that is likely to constitute child sexual abuse material but that has not yet been confirmed as such (‘new’ material), including live-streaming and live transmission of child sexual abuse material, as well as activities constituting the solicitation of children (‘grooming’). That is urgently needed in order to address not only past abuse and the re-victimisation and violation of the victims’ rights it entails, such as those to privacy and protection of personal data, but to also prevent it as soon as possible and address recent, ongoing and imminent abuse, so as to prevent it as much as possible, to effectively protect children and to increase the likelihood of rescuing victims and stopping perpetrators as quickly as possible.
Amendment 177 #
Proposal for a regulation
Recital 13 a (new)
(13 a) Member States should ensure that they additionally address the problem of solicitation of children by providing for efficient digital education. Children should be given, at home and in school, the necessary digital skills and tools they need to fully benefit from online access, whilst ensuring their safety.
Amendment 180 #
Proposal for a regulation
Recital 15
(15) Some of those providers of relevant information society services in scope of this Regulation, including online search engines, may also be subject to an obligation to conduct a risk assessment under Regulation (EU) 2022/2065 with respect to information that they store and disseminate to the public. For the purposes of the present Regulation, those providers may use such a risk assessment as a basis and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse, as required by this Regulation.
Amendment 183 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) 2022/2065 may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age assessment measures and age-appropriate parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
Amendment 185 #
Proposal for a regulation
Recital 16 a (new)
(16 a) The age assessment tools used should be able to verify age in an efficient, privacy-preserving and secure manner.
Amendment 188 #
Proposal for a regulation
Recital 17
(17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures, including voluntary measures, tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers should be able to voluntarily process metadata and are encouraged to design and implement, in accordance with Union law, measures based on their existing practices to detect online child sexual abuse in their services and to indicate, as part of the risk reporting, their willingness and preparedness to eventually be issued a detection order under this Regulation, if deemed necessary by the competent national authority.
Amendment 194 #
Proposal for a regulation
Recital 18
(18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available interpersonal communications services should, when designing and implementing the mitigation measures, give importance to ensuring their effectiveness and to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the financial and technological capabilities and the size of the provider concerned. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such as those based on industry best practices, including as established through self-regulatory cooperation, and those contained in guidelines from the Commission. Providers should be able to indicate to the Coordinating Authority their assessment of the need for a detection order, after putting in place the risk mitigation measures. When no risk has been detected after a diligently conducted or updated risk assessment, providers should not be required to take any mitigation measures.
Amendment 197 #
Proposal for a regulation
Recital 19
(19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take certain reasonable and effective measures to assess and mitigate that risk. The providers should make that assessment in a diligent manner, making efforts that are reasonable and effective under the given circumstances, having regard inter alia to the nature and extent of that risk as well as their financial and technological capabilities and size, and cooperating with the providers of the services offered through the software application where possible.
Amendment 199 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be a last resort measure and subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that in particular solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services.
Amendment 202 #
Proposal for a regulation
Recital 21
(21) Furthermore, as part of those limits and safeguards, detection orders should only be issued after a diligent and objective assessment leading to the finding of a significant risk of the specific service concerned being misused for a given type of online child sexual abuse covered by this Regulation. Such assessments may include the voluntary use of detection technologies and the evidence they provide with regard to the risks of a service being misused. One of the elements to be taken into account in this regard is the likelihood that the service is used to an appreciable extent, that is, beyond isolated and relatively rare instances, for such abuse. The criteria should vary so as to take account of the different characteristics of the various types of online child sexual abuse at stake and of the different characteristics of the services used to engage in such abuse, as well as the related different degree of intrusiveness of the measures to be taken to execute the detection order.
Amendment 204 #
Proposal for a regulation
Recital 22
(22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighed, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the likelihood and seriousness of any potential negative consequences for other parties affected. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
Amendment 206 #
Proposal for a regulation
Recital 23
(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted, justified, proportionate, limited in time and specified so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
Amendment 209 #
Proposal for a regulation
Recital 24
(24) The competent judicial authority or the competent independent administrative authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should be in a position to take a well-informed decision on requests for the issuance of detection orders. That is of great importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach; this is particularly significant in connection with detection orders concerning the solicitation of children. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU Centre’) and, where so provided in this Regulation, the competent data protection authority designated under Regulation (EU) 2016/679 to provide their views on the measures in question. They should do so as soon as possible, having regard to the important public policy objective at stake and the need to act without undue delay to protect children. In particular, data protection authorities should do their utmost to avoid extending the time period set out in Regulation (EU) 2016/679 for providing their opinions in response to a prior consultation. Furthermore, they should normally be able to provide their opinion well within that time period in situations where the European Data Protection Board has already issued guidelines regarding the technologies that a provider envisages deploying and operating to execute a detection order addressed to it under this Regulation.
Amendment 211 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection with the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Nothing in this Regulation should therefore be interpreted as making end-to-end encryption impossible, in particular considering that technologies that allow the effective detection of online child sexual abuse in end-to-end encrypted communications already exist and make it possible to balance all the fundamental rights at stake. These rights include, on the one hand, the right to physical and mental integrity of children (Article 3 of the Charter of Fundamental Rights of the European Union (the ‘Charter’)), the prohibition of torture and inhuman and degrading treatment (Article 4 of the Charter), children’s right to such protection and care as is necessary for their well-being (Article 24 of the Charter), their right to respect for their private and family life (Article 7 of the Charter) as well as to protection of their personal data (Article 8 of the Charter). The rights also include, on the other hand, the right to respect for private and family life (Article 7 of the Charter), to protection of personal data (Article 8 of the Charter), and the freedom of expression (Article 11 of the Charter) of the other users of the online services concerned. Finally, the rights at stake also include the freedom to conduct a business (Article 16 of the Charter) of the online service providers that fall within the scope of the proposal. The Commission should, in cooperation with the EU Centre, consider making available tools for the effective detection of online child sexual abuse in end-to-end encrypted communications. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
Amendment 217 #
Proposal for a regulation
Recital 29 a (new)
(29 a) In order to ensure effective prevention of, and the fight against, online child sexual abuse, providers should be able to make voluntary use of detection technologies as part of their mitigation measures, if they assess this as necessary in order to limit the risk of misuse.
Amendment 218 #
Proposal for a regulation
Recital 29 b (new)
(29 b) All relevant providers should provide for easily accessible, child-friendly and age-appropriate notification mechanisms that allow for quick, efficient and privacy-preserving notification. Micro, small and medium-sized enterprises should receive support from the EU Centre to build up a corresponding mechanism.
Amendment 219 #
Proposal for a regulation
Recital 31
(31) The rules of this Regulation should not be understood as affecting the requirements regarding removal orders set out in Regulation (EU) 2022/2065.
Amendment 222 #
Proposal for a regulation
Recital 34
(34) In order to allow for an efficient reporting system and considering that acquiring, possessing, knowingly obtaining access to and transmitting child sexual abuse material constitute criminal offences under Directive 2011/93/EU, it is necessary to exempt providers of relevant information society services from criminal liability when they are involved in such activities, including taking voluntary measures, insofar as their activities remain strictly limited to what is needed for the purpose of complying with their obligations under this Regulation and they act in good faith.
Amendment 223 #
Proposal for a regulation
Recital 40
(40) In order to facilitate smooth and efficient communications by electronic means, including, where relevant, by acknowledging the receipt of such communications, relating to matters covered by this Regulation, providers of relevant information society services should be required to designate a single point of contact and to publish relevant information relating to that point of contact, including the languages to be used in such communications. In contrast to the provider’s legal representative, the point of contact should serve operational purposes and should not be required to have a physical location. Suitable conditions should be set in relation to the languages of communication to be specified, so as to ensure that smooth communication is not unreasonably complicated. For providers subject to the obligation to establish a compliance function and nominate compliance officers in accordance with Regulation (EU) 2022/2065, one of these compliance officers may be designated as the point of contact under this Regulation, in order to facilitate coherent implementation of the obligations arising from both frameworks.
Amendment 224 #
Proposal for a regulation
Recital 42
(42) Where relevant and convenient, subject to the choice of the provider of relevant information society services and the need to meet the applicable legal requirements in this respect, it should be possible for those providers to designate a single point of contact and a single legal representative for the purposes of Regulation (EU) 2022/2065 and this Regulation.
Amendment 231 #
Proposal for a regulation
Recital 69 a (new)
(69 a) Hotlines play an invaluable role in providing the public with a way to report suspected child sexual abuse material and in rapidly removing harmful content online, but they have different legal rights to process child sexual abuse material. Member States are therefore encouraged to aim for a harmonisation of the legal capacities of hotlines.
Amendment 233 #
Proposal for a regulation
Recital 70
(70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines, concluding, when necessary, strategic and/or operational cooperation agreements with them, and encourage them to cooperate and coordinate effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union.
Amendment 236 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to prevent and address the misuse of relevant information society services for online child sexual abuse and to ensure the smooth functioning of a digital single market.
Amendment 243 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d a (new)
(d a) obligations on providers of online search engines to delist websites indicating child sexual abuse material;
Amendment 245 #
Proposal for a regulation
Article 1 – paragraph 3 – point b
(b) Directive 2000/31/EC and Regulation (EU) 2022/2065;
Amendment 249 #
Proposal for a regulation
Article 2 – paragraph 1 – point a
(a) ‘hosting service’ means an information society service as defined in Article 3, point (g), third indent, of Regulation (EU) 2022/2065;
Amendment 257 #
Proposal for a regulation
Article 2 – paragraph 1 – point c
(c) ‘software application’ means a digital product or service as defined in Article 2, point 15, of Regulation (EU) 2022/1925;
Amendment 258 #
Proposal for a regulation
Article 2 – paragraph 1 – point d
(d) ‘software application store’ means a service as defined in Article 2, point 14, of Regulation (EU) 2022/1925;
Amendment 263 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv a (new)
(iv a) online search engines;
Amendment 265 #
Proposal for a regulation
Article 2 – paragraph 1 – point f a (new)
(f a) ‘online search engine’ means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
Amendment 266 #
Proposal for a regulation
Article 2 – paragraph 1 – point f b (new)
(f b) ‘metadata’ means data processed for the purposes of transmitting, distributing or exchanging content data, including data used to trace and identify the source and destination of a communication, data on the location of the user, and the date, time, duration and type of communication;
Amendment 271 #
Proposal for a regulation
Article 2 – paragraph 1 – point r
(r) ‘recommender system’ means the system as defined in Article 3, point (s), of Regulation (EU) 2022/2065;
Amendment 272 #
Proposal for a regulation
Article 2 – paragraph 1 – point t
(t) ‘content moderation’ means the activities as defined in Article 3, point (t), of Regulation (EU) 2022/2065;
Amendment 273 #
Proposal for a regulation
Article 2 – paragraph 1 – point v
(v) ‘terms and conditions’ means terms and conditions as defined in Article 3, point (u), of Regulation (EU) 2022/2065;
Amendment 291 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
— functionalities enabling age assessment measures;
Amendment 293 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- functionalities enabling age-appropriate parental control;
Amendment 295 #
Proposal for a regulation
Article 3 – paragraph 2 – point b a (new)
(b a) the capacity, in accordance with the state of the art, to deal with reports and notifications about child sexual abuse in a timely manner;
Amendment 310 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 1
The provider may request the EU Centre to perform an analysis of representative, anonymised data samples to identify potential online child sexual abuse, to support the risk assessment. This request cannot serve the purpose of evading any of the provider’s obligations set out in this Regulation. The EU Centre shall perform the analysis in a timely manner.
Amendment 312 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 2
The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise, provided the request is reasonably necessary to support the risk assessment.
Amendment 316 #
Proposal for a regulation
Article 3 – paragraph 3 a (new)
3 a. The provider may also voluntarily use the measures specified in Article 10 to detect online child sexual abuse on a specific service. In that case, the provider shall notify the Coordinating Authority and include the results of its analyses in a separate section of the risk assessment.
Amendment 328 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(a a) adapting privacy and safety by design and by default for children, including age-appropriate parental control tools;
Amendment 334 #
Proposal for a regulation
Article 4 – paragraph 1 – point b a (new)
(b a) processing metadata;
Amendment 336 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 22 of Regulation (EU) 2022/2065.
Amendment 338 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(c a) providing for awareness-raising measures;
Amendment 339 #
Proposal for a regulation
Article 4 – paragraph 1 – point c b (new)
(c b) using any other measures, in accordance with the current or future state of the art, that are fit to mitigate the identified risk;
Amendment 351 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures. The age assessment tools shall be able to verify age in an efficient, privacy-preserving and secure manner.
Amendment 358 #
Proposal for a regulation
Article 4 – paragraph 5
5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, shall issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
Amendment 360 #
Proposal for a regulation
Article 4 a (new)
Article 4 a
Legal basis for risk mitigation through metadata processing
1. To the extent necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, providers of number-independent interpersonal communication services shall be allowed, as a mitigating measure under Article 4, to process metadata.
2. All relevant service providers shall process metadata when ordered to do so by the Coordinating Authority of establishment in accordance with Article 5bis(4). When assessing whether to require a provider to process metadata, the Coordinating Authority shall take into account the interference with the rights to privacy and data protection of the users of the service that such processing entails and determine whether, in the case at hand, the processing of metadata would be effective in mitigating the risk of use of the service for the purpose of child sexual abuse, strictly necessary and proportionate.
3. If they process metadata as a risk mitigation measure, providers shall inform their users of such processing in their terms and conditions, including information on the possibility to submit complaints.
Amendment 381 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, in an effective, privacy-preserving and secure manner, enabling them to take the measures referred to in point (b).
Amendment 387 #
Proposal for a regulation
Article 6 a (new)
Article 6 a
Encrypted services
Member States shall not prevent providers of relevant information society services from offering encrypted services. However, when offering such services, providers have to make sure that they process metadata in order to detect known child sexual abuse material.
Amendment 389 #
Proposal for a regulation
Article 6 b (new)
Article 6 b
Support for micro, small and medium-sized enterprises
The Commission shall be empowered to adopt delegated acts in accordance with Article 86 in order to supplement this Regulation with guidelines that provide practical support for micro, small and medium-sized enterprises, in order for them to be able to fulfil the obligations of this Regulation.
Amendment 398 #
Proposal for a regulation
Article 7 – paragraph 1
1. As a last resort, the Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order, limited in time, requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific service.
Amendment 400 #
Proposal for a regulation
Article 7 – paragraph 2 – subparagraph 1
Amendment 402 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – introductory part
Where the Coordinating Authority of establishment takes the preliminary view that the conditions of paragraph 4 have been met and the measures envisaged in the detection order are proportionate, it shall:
Amendment 405 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point b
(b) submit the draft request to the concerned provider and the EU Centre;
Amendment 407 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point d
(d) invite the EU Centre to provide its opinion on the draft request, within a time period of two weeks from the date of receiving the draft request.
Amendment 418 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 3
Where, having regard to the implementation plan of the provider and the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall submit the request for the issuance of the detection order, adjusted where appropriate, to the competent judicial authority or independent administrative authority. It shall attach the implementation plan of the provider and the opinions of the EU Centre and the data protection authority to that request.
Amendment 420 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – introductory part
The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority or independent administrative authority shall issue the detection order where it considers that the following conditions are met:
Amendment 432 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 3
As regards the second subparagraph, point (d), where that Coordinating Authority substantially deviates from the opinion of the EU Centre, it shall inform the EU Centre and the Commission thereof, specifying in detail the points at which it deviated and the main reasons for the deviation.
Amendment 440 #
Proposal for a regulation
Article 7 – paragraph 6 – point a
(a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the dissemination of new child sexual abuse material, including live-streaming and live transmission;
Amendment 448 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary to effectively address the significant risk referred to in point (a) thereof.
Amendment 454 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 1
The competent judicial authority or independent administrative authority shall specify in the detection order the period during which it applies, indicating the start date and the end date.
Amendment 461 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the detection orders referred to in Article 7 using the template set out in Annex I. Detection orders shall include:
Amendment 465 #
Proposal for a regulation
Article 8 – paragraph 1 – point b
(b) identification details of the competent judicial authority or the independent administrative authority issuing the detection order and authentication of the detection order by that judicial or independent administrative authority;
Amendment 474 #
Proposal for a regulation
Article 8 – paragraph 1 – point i
(i) the date, time stamp and electronic signature of the judicial or independent administrative authority issuing the detection order;
Amendment 478 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1
The competent judicial authority or independent administrative authority issuing the detection order shall address it to the main establishment of the provider or, where applicable, to its legal representative designated in accordance with Article 24.
Amendment 488 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services that have received a detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the detection order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection order.
Amendment 491 #
Proposal for a regulation
Article 9 – paragraph 2 – subparagraph 1
When the detection order becomes final, the competent judicial authority or independent administrative authority that issued the detection order shall, without undue delay, transmit a copy thereof to the Coordinating Authority of establishment. The Coordinating Authority of establishment shall then, without undue delay, transmit a copy thereof to all other Coordinating Authorities through the system established in accordance with Article 39(2).
Amendment 498 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 1
In respect of the detection orders that the competent judicial authority or independent administrative authority issued at its request, the Coordinating Authority of establishment shall, where necessary and in any event following reception of the reports referred to in paragraph 3, assess whether any substantial changes to the grounds for issuing the detection orders occurred and, in particular, whether the conditions of Article 7(4) continue to be met. In that regard, it shall take account of additional mitigation measures that the provider may take to address the significant risk identified at the time of the issuance of the detection order.
Amendment 500 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 2
That Coordinating Authority shall request the competent judicial authority or independent administrative authority that issued the detection order to modify or revoke such order, where necessary in the light of the outcome of that assessment. The provisions of this Section shall apply to such requests, mutatis mutandis.
Amendment 540 #
Proposal for a regulation
Article 12 – paragraph 3
3. All providers shall establish and operate an easily found, accessible, age-appropriate and user-friendly mechanism that allows users to flag to the provider potential online child sexual abuse on the service. Those mechanisms shall allow for the submission of notices anonymously and exclusively by electronic means and for a clear indication of the exact electronic location of that information.
Amendment 554 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States to one or more specific items of material that, after a diligent assessment, the Coordinating Authority or the courts or other independent administrative authorities referred to in Article 36(1) identified as constituting child sexual abuse material.
Amendment 558 #
Proposal for a regulation
Article 14 – paragraph 3 – introductory part
3. The competent judicial authority or the independent administrative authority shall issue a removal order using the template set out in Annex IV. Removal orders shall include:
Amendment 559 #
Proposal for a regulation
Article 14 – paragraph 3 – point a
(a) identification details of the judicial or independent administrative authority issuing the removal order and authentication of the removal order by that authority;
Amendment 562 #
Proposal for a regulation
Article 14 – paragraph 3 – point h
(h) the date, time stamp and electronic signature of the judicial or independent administrative authority issuing the removal order;
Amendment 564 #
Proposal for a regulation
Article 15 – paragraph 1
1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the removal order.
Amendment 566 #
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1
When the removal order becomes final, the competent judicial authority or independent administrative authority that issued the removal order shall, without undue delay, transmit a copy thereof to the Coordinating Authority of establishment. The Coordinating Authority of establishment shall then, without undue delay, transmit a copy thereof to all other Coordinating Authorities through the system established in accordance with Article 39(2).
Amendment 569 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 1
The Coordinating Authority of establishment may request, when requesting the issuance of the removal order from the judicial authority or independent administrative authority, and after having consulted relevant public authorities, that the provider not disclose any information regarding the removal of or disabling of access to the child sexual abuse material, where and to the extent necessary to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
Amendment 570 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 2 – point a
(a) the judicial authority or independent administrative authority issuing the removal order shall set the time period not longer than necessary and not exceeding six weeks, during which the provider is not to disclose such information;
Amendment 571 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 2 – point c
(c) that judicial authority or independent administrative authority shall inform the provider of its decision, specifying the applicable time period.
Amendment 572 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 3
That judicial authority or independent administrative authority may decide to extend the time period referred to in the second subparagraph, point (a), by a further time period of maximum six weeks, where and to the extent the non-disclosure continues to be necessary. In that case, that judicial authority or independent administrative authority shall inform the provider of its decision, specifying the applicable time period. Article 14(3) shall apply to that decision.
Amendment 573 #
Proposal for a regulation
Article 15 a (new)
Article 15 a
Delisting orders
1. The competent authority shall have the power to issue an order requiring a provider of online search engines under the jurisdiction of that Member State to take reasonable measures to delist a Uniform Resource Locator corresponding to online locations where child sexual abuse material can be found from appearing in search results.
2. The provider shall execute the delisting order without undue delay. The provider shall take the necessary measures to ensure that it is capable of reinstating the Uniform Resource Locator so that it appears in search results.
3. Before issuing a delisting order, the issuing authority shall inform the provider, if necessary via the Coordinating Authority, of its intention to do so, specifying the main elements of the content of the intended delisting order and the reasons for its intention. It shall afford the provider an opportunity to comment on that information, within a reasonable time period set by that authority.
4. A delisting order shall be issued where the following conditions are met:
(a) the delisting is necessary to prevent the dissemination of the child sexual abuse material in the Union, having regard in particular to the need to protect the rights of the victims;
(b) all necessary investigations and assessments, including of search results, have been carried out to ensure that the Uniform Resource Locator to be delisted corresponds, in a sufficiently reliable manner, to online locations where child sexual abuse material can be found.
5. The issuing authority shall specify in the delisting order the period during which it applies, indicating the start date and the end date. The period of application of delisting orders shall not exceed five years.
6. The Coordinating Authority or the issuing authority shall, where necessary and at least once every year, assess whether any substantial changes to the grounds for issuing the delisting orders have occurred and whether the conditions of paragraph 4 continue to be met.
Amendment 574 #
Proposal for a regulation
Article 15 b (new)
Article 15 b
Redress and provision of information
1. Providers of online search engines that have received a delisting order shall have a right to effective redress. That right shall include the right to challenge the delisting order before the courts of the Member State of the authority that issued the delisting order.
2. If the order is modified or repealed as a result of a redress procedure, the provider shall immediately reinstate the delisted Uniform Resource Locator so that it appears in search results.
3. When the delisting order becomes final, the issuing authority shall, without undue delay, transmit a copy thereof to the Coordinating Authority. The Coordinating Authority shall then, without undue delay, transmit copies thereof to all other Coordinating Authorities and the EU Centre through the system established in accordance with Article 39(2).
For the purpose of the first subparagraph, a delisting order shall become final upon the expiry of the time period for appeal where no appeal has been lodged in accordance with national law, or upon confirmation of the delisting order following an appeal.
4. Where a provider prevents users from obtaining search results for child sexual abuse material corresponding to a Uniform Resource Locator pursuant to a delisting order, it shall take reasonable measures to inform those users of the following:
(a) the fact that it does so pursuant to a delisting order;
(b) the right of providers of delisted Uniform Resource Locators corresponding to blocked online locations to the judicial redress referred to in paragraph 1, and the users’ right to submit complaints to the Coordinating Authority in accordance with Article 34.
Amendment 576 #
Proposal for a regulation
Article 19 – paragraph 1
Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation. They shall also not be liable for carrying out, in good faith and in accordance with Article 4, voluntary measures and activities, in particular those aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with this Regulation.
Amendment 581 #
Proposal for a regulation
Article 25 – paragraph 5
5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to efficiently handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available and communicate it to the EU Centre. They shall keep that information updated.
Amendment 587 #
Proposal for a regulation
Article 25 – paragraph 8
8. The EU Centre shall provide such assistance without undue delay, free of charge and in accordance with its tasks and obligations under this Regulation and insofar as its resources and priorities allow.
Amendment 589 #
Proposal for a regulation
Article 26 – paragraph 1
1. Member States shall ensure that the Coordinating Authorities that they designated perform their tasks under this Regulation in an objective, impartial, transparent and timely manner, while fully respecting the fundamental rights of all parties affected. Member States shall provide their Coordinating Authorities with sufficient technical, financial and human resources to efficiently carry out their tasks.
Amendment 598 #
Proposal for a regulation
Article 26 – paragraph 4
4. The Coordinating Authorities shall ensure that relevant members of staff have the required qualifications, experience, integrity and technical skills to perform their duties.
Amendment 613 #
Proposal for a regulation
Article 32 a (new)
Article 32 a
Public awareness campaigns
Coordinating Authorities shall, in cooperation with the EU Centre, regularly carry out public awareness campaigns to inform about measures to prevent and combat child sexual abuse online and offline, about how to seek child-friendly and age-appropriate reporting and assistance, and about victims’ rights.
Amendment 627 #
Proposal for a regulation
Article 38 – paragraph 2 a (new)
2 a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of online child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to efficiently detect, remove and block content, and shall coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
Amendment 628 #
Proposal for a regulation
Article 39 – paragraph 1
1. Coordinating Authorities shall efficiently cooperate with each other, any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre, other relevant Union agencies, including Europol, and hotlines, to facilitate the performance of their respective tasks under this Regulation and ensure its effective, efficient and consistent application and enforcement.
Amendment 631 #
Proposal for a regulation
Article 39 – paragraph 2
2. The EU Centre shall establish and maintain one or more reliable and secure information sharing systems supporting communications between Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies, hotlines and providers of relevant information society services.
Amendment 634 #
Proposal for a regulation
Article 39 – paragraph 3
3. The Coordinating Authorities, the Commission, the EU Centre, other relevant Union agencies, hotlines and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation.