
Activities of Lucia ĎURIŠ NICHOLSONOVÁ related to 2022/0155(COD)

Shadow opinions (1)

OPINION on the proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
2023/03/29
Committee: CULT
Dossiers: 2022/0155(COD)
Documents: PDF(215 KB) DOC(140 KB)
Authors: Niyazi KIZILYÜREK (MEP ID: 197415)

Amendments (110)

Amendment 42 #
Proposal for a regulation
Recital 17 a (new)
(17 a) Member States continue to struggle with putting in place effective prevention programmes to combat child sexual abuse as required by Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children and child pornography, where frequently multiple types of stakeholders need to take action. As a result, children and the persons in their environment are insufficiently aware of the risks of sexual abuse and of the means of limiting such risks, while the online dimension represents a particular challenge, with a constantly growing tendency. As education plays a key role in the prevention of child sexual abuse, Member States should inform the public, by all means necessary, about the dangers and risks of sexual abuse for young people in the digital world, including by ensuring close cooperation at European and international level and by strengthening work with organised civil society, in particular with schools and law enforcement representatives. Member States should take appropriate measures to include programmes to this effect in early education curricula.
2022/11/30
Committee: CULT
Amendment 43 #
Proposal for a regulation
Recital 18 a (new)
(18 a) Basic digital skills, including cyber hygiene, cyber safety, data protection and media literacy are essential for children and young people, as they enable them to make informed decisions, assess and overcome the risks associated with the internet. Therefore, it is important to strengthen media literacy efforts in Member States and at the Union level, through dedicated media literacy education, publicly available relevant materials adapted for different age groups and information campaigns for children and their guardians.
2022/11/30
Committee: CULT
Amendment 55 #
Proposal for a regulation
Recital 49
(49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders, removal orders or blocking orders that it issued, are effectively complied with in practice, each Coordinating Authority should be able to carry out searches, using the relevant indicators provided by the EU Centre and reacting in a timely manner to the evolving trends of child sexual abuse material dissemination and monetisation, to detect the dissemination of known or new child sexual abuse material through publicly available material in the hosting services of the providers concerned.
2022/11/30
Committee: CULT
Amendment 56 #
Proposal for a regulation
Recital 50
(50) With a view to ensuring that providers of hosting services are aware of the misuse made of their services and to afford them an opportunity to take expeditious action to remove or disable access on a voluntary basis, Coordinating Authorities of establishment or organisations acting in the public interest against child sexual abuse, such as hotlines, should be able to notify those providers of the presence of known child sexual abuse material on their services and to request removal or disabling of access thereto, for the providers’ voluntary consideration. Such notifying activities should be clearly distinguished from the Coordinating Authorities’ powers under this Regulation to request the issuance of removal orders, which impose on the provider concerned a binding legal obligation to remove or disable access to the material in question within a set time period.
2022/11/30
Committee: CULT
Amendment 57 #
Proposal for a regulation
Recital 56
(56) With a view to ensuring that the indicators generated by the EU Centre for the purpose of detection are as complete as possible, the submission of relevant material and transcripts should be done proactively by the Coordinating Authorities. However, the EU Centre should also be allowed to bring certain material or conversations to the attention of the Coordinating Authorities for those purposes, and to receive reports concerning the trends in the dissemination and monetisation of child sexual abuse material from relevant organisations acting in the public interest against child sexual abuse and other stakeholders.
2022/11/30
Committee: CULT
Amendment 62 #
Proposal for a regulation
Recital 60
(60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services. However, for that same reason, the EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of hosting services, the provision of assistance to Coordinating Authorities, as well as the generation and sharing of knowledge, best practices and expertise related to online child sexual abuse, including the evolving trends in the dissemination and monetisation of child sexual abuse material.
2022/11/30
Committee: CULT
Amendment 64 #
Proposal for a regulation
Recital 61
(61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities or, when appropriate, by the organisations acting in the public interest against child sexual abuse, in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material) or of different child sexual abuse material (new material), or the solicitation of children, as applicable.
2022/11/30
Committee: CULT
Amendment 65 #
Proposal for a regulation
Recital 62
(62) For the system established by this Regulation to function properly, the EU Centre should be charged with creating databases for each of those three types of online child sexual abuse, and with maintaining, timely updating and operating those databases. For accountability purposes and to allow for corrections where needed, it should keep records of the submissions and the process used for the generation of the indicators.
2022/11/30
Committee: CULT
Amendment 69 #
Proposal for a regulation
Recital 67
(67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse, including on the successful initiatives and good practices on the proactive search for online child sexual abuse material, trends in its creation and monetisation, as well as the voluntary prevention, detection and mitigation of online child sexual abuse. In this connection, the EU Centre should cooperate on a regular basis with relevant stakeholders from both within and outside the Union, including law enforcement authorities with the relevant expertise, educators, civil society, service providers and industry representatives, and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned.
2022/11/30
Committee: CULT
Amendment 75 #
Proposal for a regulation
Recital 69
(69) In order to allow for the effective and efficient performance of its tasks, the EU Centre should closely cooperate with Coordinating Authorities, Europol and relevant partner organisations, such as the US National Centre for Missing and Exploited Children or the International Association of Internet Hotlines (‘INHOPE’) network of hotlines for reporting child sexual abuse material, within the limits set by this Regulation and other legal instruments regulating their respective activities. To facilitate and support such cooperation and build on the best practices and expertise acquired, the necessary arrangements should be made, including the designation of contact officers by Coordinating Authorities and the conclusion of memoranda of understanding with Europol and, where appropriate, with one or more of the relevant partner organisations located in the Union and outside the Union.
2022/11/30
Committee: CULT
Amendment 77 #
Proposal for a regulation
Recital 70
(70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines and organisations which act in the public interest against child sexual abuse and which proactively search for child sexual abuse material or which do research and gather information on the trends in the dissemination and monetisation of child sexual abuse material, are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and organisations and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union.
2022/11/30
Committee: CULT
Amendment 80 #
Proposal for a regulation
Recital 70 a (new)
(70 a) In line with Directive 2011/93/EU of the European Parliament and of the Council, this Regulation recognises and safeguards the key role of hotlines in order to enhance the fight against child sexual abuse online in the European Union. Hotlines have a track record of proven capability since 1999 in the identification and removal of child sexual abuse material from the digital environment and have created a worldwide network and procedures for the identification and removal of child sexual abuse material. Member States should therefore promote and safeguard the role of formally recognised non-governmental organisations involved in anonymous public reporting of child sexual abuse material, which are at the forefront of detecting new child sexual abuse material, an essential factor in finding new victims while also keeping the databases of indicators up to date.
2022/11/30
Committee: CULT
Amendment 86 #
Proposal for a regulation
Recital 75
(75) In the interest of transparency and accountability and to enable evaluation and, where necessary, adjustments, providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services, Coordinating Authorities and the EU Centre should be required to collect, record and analyse information, based on anonymised gathering of non-personal data and to publish annual reports on their activities under this Regulation. The Coordinating Authorities should cooperate with Europol and with law enforcement authorities and other relevant national authorities of the Member State that designated the Coordinating Authority in question and when appropriate, with partner organisations, in gathering that information.
2022/11/30
Committee: CULT
Amendment 93 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(w a) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous information from the public about alleged child sexual abuse material and online child sexual exploitation, which meets all the following criteria: (a) is officially recognised by its home Member State as expressed in the Directive 2011/93/EU of the European Parliament and of the Council; (b) has the mission of combatting child sexual abuse material in its articles of association; and (c) is part of a recognised and well-established international network of hotlines as referred to in this article.
2022/11/30
Committee: CULT
Amendment 95 #
Proposal for a regulation
Chapter I a (new)
Ia – PREVENTION AND EDUCATION PROGRAMMES
Article 2 a (new)
1. Member States shall take appropriate measures, such as education, awareness-raising campaigns and training, to discourage and reduce the demand that fosters all forms of sexual exploitation of children in the online environment.
2. Member States shall take appropriate action, including through the Internet, such as information and awareness-raising campaigns, research and early-education programmes, where appropriate in cooperation with relevant civil society organisations acting in the public interest against child sexual abuse, law enforcement authorities and other stakeholders, aimed at raising awareness and reducing the risk of children becoming victims of sexual abuse or of exploitation online.
3. Member States shall promote regular training for officials likely to come into contact with child victims of sexual abuse or exploitation online, including the solicitation of children, aimed at enabling them to identify and deal with child victims and potential child victims.
4. Member States shall promote regular training for officials to inform them and update their knowledge on the latest trends in the creation, dissemination and monetisation of child sexual abuse material and on national data hosting of child sexual abuse material.
2022/11/30
Committee: CULT
Amendment 98 #
Proposal for a regulation
Article 4 – paragraph 1 – point b a (new)
(b a) to provide, through appropriate technical and operational measures, readily accessible and easy-to-use parental tools to help parents or guardians support children and identify harmful behaviour;
2022/11/30
Committee: CULT
Amendment 99 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations, hotlines or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] .
2022/11/30
Committee: CULT
Amendment 100 #
Proposal for a regulation
Article 6 a (new)
Article 6 a
Anonymous public reporting of online child sexual abuse
1. Member States shall take appropriate measures to promote and safeguard the role of formally recognised non-governmental organisations involved in anonymous public reporting of child sexual abuse material and the proactive search for such material.
2. Member States shall ensure that the public always has the possibility to anonymously report child sexual abuse material and child sexual exploitation activities to hotlines specialised in combatting online child sexual abuse material and shall safeguard the role of such hotlines in anonymous public reporting.
3. Member States shall ensure that the hotlines referred to in paragraph 2 operating in their territory are authorised to view, assess and process anonymous reports of child sexual abuse material.
4. Member States shall grant the hotlines referred to in paragraph 2 the authority to issue content removal notices for confirmed instances of child sexual abuse material.
5. Member States shall authorise the hotlines referred to in paragraph 2 to voluntarily conduct proactive searching for child sexual abuse material online.
2022/11/30
Committee: CULT
Amendment 101 #
Proposal for a regulation
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of interpersonal communications services becomes aware, in any manner other than through a removal order issued in accordance with this Regulation or through a report submitted by a recognised hotline which results in its voluntary and timely removal, of any information indicating potential online child sexual abuse on its services, it shall promptly submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
2022/11/30
Committee: CULT
Amendment 102 #
Proposal for a regulation
Article 19 – paragraph 1
Providers of relevant information society services, hotlines and organisations acting solely in the public interest against child sexual abuse shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with those requirements.
2022/11/30
Committee: CULT
Amendment 107 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed, or to have access thereto disabled by the provider, complemented in a timely manner and, if requested and appropriate, also included in the list of indicators used to prevent the further dissemination of these items.
2022/11/30
Committee: CULT
Amendment 120 #
Proposal for a regulation
Article 26 – paragraph 2 – point c
(c) are free from any undue external influence, whether direct or indirect; it being understood that the membership of the Coordinating Authority in a recognised international network shall not prejudice its independent character;
2022/11/30
Committee: CULT
Amendment 124 #
Proposal for a regulation
Article 40 – paragraph 2
2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse, and by gathering and sharing information, educational materials and expertise and facilitating cooperation between relevant public and private parties in connection with the prevention and combating of child sexual abuse, in particular online.
2022/11/30
Committee: CULT
Amendment 127 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – introductory part
(6) facilitate the generation and sharing of knowledge with other Union institutions, bodies, offices and agencies, organisations acting in the public interest against child sexual abuse and hotlines, Coordinating Authorities or other relevant authorities of the Member States to contribute to the achievement of the objective of this Regulation, by:
2022/11/30
Committee: CULT
Amendment 132 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b
(b) supporting the development and dissemination of research, educational materials and expertise on those matters and on assistance to victims, including by serving as a hub of expertise to support evidence-based policy;
2022/11/30
Committee: CULT
Amendment 137 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 a (new)
(6 a) supporting and promoting the regular exchange of best practices and lessons learned among Member States on raising awareness for the prevention of child sexual abuse, prevention programmes and non-formal and formal education on the risks of sexual abuse in the digital environment;
2022/11/30
Committee: CULT
Amendment 138 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 b (new)
(6 b) providing assistance with training on the prevention of online child sexual abuse for officials from Member States;
2022/11/30
Committee: CULT
Amendment 139 #
8 a. Where the EU Centre receives a report from a hotline, or where a provider that submitted the report to the EU Centre has indicated that the report is based on information received from a hotline, the EU Centre shall refrain from forwarding the report to the competent law enforcement authority or authorities, so as to avoid duplicate reporting of material that has already been reported to national law enforcement by the hotlines, and shall monitor the removal of the child sexual abuse material or cooperate with the relevant hotline to track its status.
2022/11/30
Committee: CULT
Amendment 145 #
Proposal for a regulation
Article 54 – paragraph 1
1. Where necessary for the performance of its tasks under this Regulation, the EU Centre shall cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations and semi-public organisations. In particular, the cooperation with the EU Centre referred to in paragraph 1 may include the following: (a) supporting the Commission in the preparation of the guidelines referred to in Article 3(8), Article 4(5), Article 6(4) and Article 11; (b) updating the databases of indicators referred to in Article 44; (c) making technologies available to providers for the execution of detection orders issued to them, in accordance with Article 50(1); or (d) innovation of the detection technologies and education of the service providers and other stakeholders on the effective prevention and mitigation measures through information sharing or collective action.
2022/11/30
Committee: CULT
Amendment 146 #
Proposal for a regulation
Article 54 – paragraph 2
2. The EU Centre may conclude strategic and/or operational cooperation agreements with organisations referred to in paragraph 1, laying down the terms of cooperation.
2022/11/30
Committee: CULT
Amendment 147 #
Proposal for a regulation
Article 83 a (new)
Article 83 a Data collection on prevention programmes Member States shall report on the anticipated number of children in primary education who have been informed through the awareness campaigns and through the education programmes about the risks of all forms of sexual exploitation of children, including in the online environment.
2022/11/30
Committee: CULT
Amendment 328 #
Proposal for a regulation
Recital 14 a (new)
(14a) Given the severity of these crimes, the long-lasting negative consequences for the victims and the risk of revictimisation as a result of the dissemination of known material, new material, as well as activities constituting the solicitation of children, it is essential that this Regulation provides specific obligations for providers of hosting services and providers of interpersonal communication services to prevent, detect, report and remove child sexual abuse material in all their services, including interpersonal communications services, which may also be covered by end-to-end encryption, in light of the prevalence of dissemination of child sexual abuse material, including the solicitation of children, in interpersonal communication services.
2023/07/28
Committee: LIBE
Amendment 344 #
Proposal for a regulation
Recital 17
(17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect and prevent online child sexual abuse in their services and to indicate, as part of the risk reporting, their willingness and preparedness to eventually be issued a detection order under this Regulation, if deemed necessary by the competent national authority.
2023/07/28
Committee: LIBE
Amendment 428 #
Proposal for a regulation
Recital 36
(36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities. Providers should create and run an accessible, age-appropriate and user-friendly mechanism allowing users to flag any instances of potential online child sexual abuse on their platform. The providers should also offer reasonable assistance to the users who report these cases, such as implementing visible alert and alarm systems on their platforms, as well as providing links to local organisations such as hotlines, helplines, or victims' rights organisations, to assist potential victims.
2023/07/28
Committee: LIBE
Amendment 433 #
Proposal for a regulation
Recital 44
(44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to the application of this Regulation, without prejudice to the enforcement powers of other national authorities. The Coordinating Authority should oversee the implementation of the Regulation, including issues related to prevention, education and awareness raising, and organise and promote regular trainings for officials, including law enforcement authorities, who deal with cases which involve children.
2023/07/28
Committee: LIBE
Amendment 438 #
Proposal for a regulation
Recital 49
(49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders, removal orders or blocking orders that it issued, are effectively complied with in practice, each Coordinating Authority should be able to carry out searches, using the relevant indicators provided by the EU Centre and reacting in a timely manner to the evolving trends of child sexual abuse material dissemination and monetisation, to detect the dissemination of known or new child sexual abuse material through publicly available material in the hosting services of the providers concerned.
2023/07/28
Committee: LIBE
Amendment 451 #
Proposal for a regulation
Recital 57
(57) Certain providers of relevant information society services offer their services in several or even all Member States, whilst under this Regulation only a single Member State has jurisdiction in respect of a given provider. It is therefore imperative that the Coordinating Authority designated by the Member State having jurisdiction takes account of the interests of all users in the Union when performing its tasks and using its powers, without making any distinction depending on elements such as the users’ location or nationality, and that Coordinating Authorities cooperate with each other in an effective and efficient manner. To facilitate such cooperation, the necessary mechanisms and information-sharing systems should be provided for. That cooperation shall be without prejudice to the possibility for Member States to provide for regular exchanges of views with other public authorities where relevant for the performance of the tasks of those other authorities and of the Coordinating Authority, and to receive reports concerning the trends in the dissemination and monetisation of child sexual abuse material from relevant organisations acting in the public interest against child sexual abuse and other stakeholders, including service providers.
2023/07/28
Committee: LIBE
Amendment 459 #
Proposal for a regulation
Recital 61
(61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities or, when appropriate, by the organisations acting in the public interest against child sexual abuse, in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material) or of different child sexual abuse material (new material), or the solicitation of children, as applicable.
2023/07/28
Committee: LIBE
Amendment 471 #
Proposal for a regulation
Recital 67
(67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse, including on the successful initiatives and good practices on the proactive search for online child sexual abuse material, trends in its creation and monetisation, as well as the voluntary prevention, detection and mitigation of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned.
2023/07/28
Committee: LIBE
Amendment 477 #
Proposal for a regulation
Recital 70
(70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. This role played by hotlines should be reinforced and they should continue to facilitate this fight. Each Member State should ensure that at least one official hotline is operating in its territory. The EU Centre should leverage the network of hotlines and encourage that they work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience is an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Anonymous public reporting is crucial to countering child sexual abuse and hotlines have created a worldwide network and procedures for the identification and removal of child sexual abuse material. Member States should ensure that the public has the possibility to anonymously report child sexual abuse material and child sexual exploitation activities to hotlines specialised in combatting online child sexual abuse material and should safeguard the role of such hotlines in anonymous public reporting. The promotion of hotlines by the EU Centre and the Coordinating Authorities through the educational systems of Member States in order to educate youth and reach potential victims is of great importance. The experience and expertise of hotlines and other non-governmental organisations involved in reporting or proactively searching for child sexual abuse material should help the EU Centre and Coordinating Authorities to design appropriate prevention techniques and awareness campaigns and to keep the databases of indicators up to date.
2023/07/28
Committee: LIBE
Amendment 486 #
Proposal for a regulation
Recital 74
(74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with an advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to the detection and prevention of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology.
2023/07/28
Committee: LIBE
Amendment 487 #
Proposal for a regulation
Recital 74 a (new)
(74a) The Technology Committee could therefore establish a certification for technologies which could be used by online service providers to detect child sexual abuse material on their request.
2023/07/28
Committee: LIBE
Amendment 515 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d a (new)
(da) obligations on providers of online search engines and any other artificial intelligence systems to delist or disable specific items of child sexual abuse material, or both;
2023/07/28
Committee: LIBE
Amendment 555 #
Proposal for a regulation
Article 2 – paragraph 1 – point e a (new)
(ea) “online search engine” means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
2023/07/28
Committee: LIBE
Amendment 556 #
Proposal for a regulation
Article 2 – paragraph 1 – point e b (new)
(eb) ‘intermediary service’ means a service as defined in Article 3, point (g), of Regulation (EU) 2022/2065;
2023/07/28
Committee: LIBE
Amendment 557 #
Proposal for a regulation
Article 2 – paragraph 1 – point e c (new)
(ec) ‘artificial intelligence system’ (AI system) means software as defined in Article 3(1) of Regulation (EU) .../... on Artificial Intelligence (Artificial Intelligence Act);
2023/07/28
Committee: LIBE
Amendment 569 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv a (new)
(iva) an online search engine;
2023/07/28
Committee: LIBE
Amendment 570 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv b (new)
(ivb) an artificial intelligence system.
2023/07/28
Committee: LIBE
Amendment 581 #
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 18 years;
2023/07/28
Committee: LIBE
Amendment 593 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘victim’ means a person residing in the European Union who, while under the age of 18, suffered child sexual abuse offences. For the purpose of exercising the victim’s rights recognised in this Regulation, parents and guardians, as well as any person who was under 18 at the time the material was made and whose material has been hosted or disseminated in the European Union, are to be considered victims;
2023/07/28
Committee: LIBE
Amendment 603 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(wa) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous information from the public about potential child sexual abuse material and online child sexual exploitation, which is officially recognised by its home Member State as expressed in the Directive 2011/93/EU of the European Parliament and of the Council and has the mission of combatting child sexual abuse material in its articles of association;
2023/07/28
Committee: LIBE
Amendment 613 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess, for each such service that they offer, the risk of use of the service for the purpose of online child sexual abuse, which requires a targeted and tailor-made response.
2023/07/28
Committee: LIBE
Amendment 625 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability of functionalities to prevent and address online child sexual abuse and the risks referred to in paragraph 1, including through the following:
2023/07/28
Committee: LIBE
Amendment 628 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability of functionalities to prevent and address the risk referred to in paragraph 1, including through the following:
2023/07/28
Committee: LIBE
Amendment 634 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 2 a (new)
– implementing functionalities and protocols to prevent and reduce the risk of online child sexual abuse;
– information and awareness campaigns educating and warning users of the risk of online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 646 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag and report online child sexual abuse to the provider through tools that are easily accessible and age-appropriate, with a timely response;
2023/07/28
Committee: LIBE
Amendment 650 #
– functionalities enabling detection of known child sexual abuse material on upload;
– functionalities preventing uploads from the dark web;
2023/07/28
Committee: LIBE
Amendment 660 #
Proposal for a regulation
Article 3 – paragraph 2 – point d
(d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, whether the service is available directly to end users, and the impact thereof on that risk;
2023/07/28
Committee: LIBE
Amendment 664 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point i
(i) the extent to which the service is used or is likely to be used by children, such as an assessment of public surfaces, behavioral signals, the frequency of user reports of online child sexual abuse, and the results of random sampling of content;
2023/07/28
Committee: LIBE
Amendment 693 #
Proposal for a regulation
Article 3 – paragraph 2 a (new)
2a. When providers of hosting services and providers of interpersonal communication services put forward age assurance or age verification systems as mitigating measures, those systems shall meet the following criteria:
(a) protect the privacy of users and not disclose data gathered for the purposes of age assurance for any other purpose;
(b) not collect data that is not necessary for the purposes of age assurance;
(c) be proportionate to the risks associated with the product or service that presents a risk of misuse for child sexual abuse;
(d) provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
2023/07/28
Committee: LIBE
Amendment 733 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall take reasonable and proportionate mitigation measures, tailored to the risk identified pursuant to Article 3 and their service, to minimise that risk. Such measures shall include some or all of the following:
2023/07/28
Committee: LIBE
Amendment 734 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures, tailored to their specific service and the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:
2023/07/28
Committee: LIBE
Amendment 735 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, including tools for monitoring phrases and indicators on public surfaces, its decision-making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions, reporting tools that are effective, easily accessible and age-appropriate, or the protocols for investigating the reported content and taking appropriate action;
2023/07/28
Committee: LIBE
Amendment 741 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) designing educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, including child-appropriate information;
2023/07/28
Committee: LIBE
Amendment 758 #
Proposal for a regulation
Article 4 – paragraph 1 – point b
(b) reinforcing the provider’s internal processes or the internal supervision of the functioning of the service, user testing and feedback collection;
2023/07/28
Committee: LIBE
Amendment 759 #
Proposal for a regulation
Article 4 – paragraph 1 – point b a (new)
(ba) implementing and constantly innovating functionalities and protocols to prevent and reduce the risk of online child sexual abuse, and regularly assessing their effectiveness in light of the latest technological developments and trends in the dissemination and monetisation of child sexual abuse material;
2023/07/28
Committee: LIBE
Amendment 760 #
Proposal for a regulation
Article 4 – paragraph 1 – point b b (new)
(bb) the use of specific technologies on a voluntary basis for the sole purpose of preventing and detecting online child sexual abuse in accordance with Article 4a;
2023/07/28
Committee: LIBE
Amendment 775 #
1a. Providers of hosting services and providers of interpersonal communications services shall continue the voluntary use of specific technologies, as mitigation measures, for the processing of personal and other data to the extent strictly necessary to detect, report and remove online child sexual abuse on their services and to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment conducted or updated in accordance with Article 3 and prior authorization from the Coordinating Authority;
2023/07/28
Committee: LIBE
Amendment 779 #
Proposal for a regulation
Article 4 – paragraph 1 b (new)
1b. The Coordinating Authority shall decide whether to proceed according to paragraph 1a no later than three months from the provider’s request.
2023/07/28
Committee: LIBE
Amendment 799 #
Proposal for a regulation
Article 4 – paragraph 2 a (new)
2a. If the risk assessment conducted or updated in accordance with Article 3 identifies a risk of the service being used to disseminate, store or make available verified child sexual abuse material, reasonable mitigation measures may include voluntary measures to detect and remove such material in accordance with Article 4a.
2023/07/28
Committee: LIBE
Amendment 804 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary and proportionate age verification and age assessment measures to reliably differentiate between child users and adult users on their services, enabling them to take the mitigation measures and protect child users. Age assurance or age verification systems as mitigation measure shall be implemented only if they meet the criteria set in Article 3, paragraph 2a of this Regulation.
2023/07/28
Committee: LIBE
Amendment 825 #
Proposal for a regulation
Article 4 a (new)
Article 4a
Additional requirements for voluntary detection and removal of verified child sexual abuse material
1. Providers of hosting services and providers of interpersonal communications services who take measures under Article 4(2) to voluntarily detect and remove child sexual abuse material shall:
(a) do so in compliance with Regulation (EU) 2016/679 (General Data Protection Regulation) and applicable national law concerning the processing of personal data relating to criminal offences or alleged criminal offences;
(b) ensure that the processing of personal data is limited to what is strictly necessary for the purpose of prevention, detection and reporting of child sexual abuse online and removal of child sexual abuse material and, unless child sexual abuse online has been detected and confirmed as such, is erased immediately;
(c) implement internal procedures to ensure that new child sexual abuse material, or solicitation of children, is not reported to relevant authorities without prior human confirmation;
(d) consider any such processing of content or traffic data commenced after the date of this Regulation to be high risk to the rights and freedoms of natural persons for the purposes of Articles 35 and 36 of Regulation (EU) 2016/679, and complete a prior data protection impact assessment and consult with their relevant supervisory authority.
2. The provider has identified evidence of a significant risk of the service being used for the purposes of online child sexual abuse in the risk assessment conducted or updated in accordance with Article 3, and it is likely, despite any mitigation measures that the provider may have taken or will take, that the service is used, to an appreciable extent, for the dissemination of child sexual abuse material.
3. The provider has implemented additional and appropriate technological and operational controls, safeguards and measures aimed at detecting online child sexual abuse and usage of technologies in accordance with Article 10 and with regard to the principle of data protection by design and by default laid down in Article 25 of Regulation (EU) 2016/679.
4. The provider shall draft and submit to the Coordinating Authority and the EU Centre an implementation plan setting out the measures it envisages taking to voluntarily detect child sexual abuse material, including detailed information regarding the envisaged technologies and safeguards and, where applicable, attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted in view of the outcome of the data protection impact assessment and of that opinion.
5. The provider shall annually publish and submit to the competent supervisory authority and to the Commission a report on the processing of personal data under this Regulation, including on the type and volumes of data processed, the number of cases identified, the measures applied to select and improve key indicators, the effectiveness of the different technologies deployed, the retention policy and the data protection safeguards applied.
2023/07/28
Committee: LIBE
Amendment 838 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) any mitigation measures taken and those that require prior authorization pursuant to Article 4.
2023/07/28
Committee: LIBE
Amendment 845 #
Proposal for a regulation
Article 5 – paragraph 4 – point a (new)
(a) Where the Coordinating Authority considers that the mitigation measures taken do not comply with Article 4, it shall address a decision to the provider requiring it to take the necessary measures so as to ensure that Article 4 is complied with.
2023/07/28
Committee: LIBE
Amendment 853 #
Proposal for a regulation
Article 6
Article 6 (Obligations for software application stores) deleted.
Text proposed by the Commission:
1. Providers of software application stores shall: (a) make reasonable efforts to assess, where possible together with the providers of software applications, whether each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children; (b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children; (c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).
2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3.
3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures.
4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
2023/07/28
Committee: LIBE
Amendment 894 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect and prevent online child sexual abuse on a specific service.
2023/07/28
Committee: LIBE
Amendment 895 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
1a. The Coordinating Authority of establishment shall have the power to authorise the voluntary use by the provider of specific technologies for the processing of personal data and other data to the extent strictly necessary to detect, report and remove online child sexual abuse on its services and to mitigate the risk of misuse of its services for the purpose of online child sexual abuse, following a risk assessment performed by the provider pursuant to Article 3 of this Regulation. It shall have the power to define the terms of authorisation for the provider to take measures specified in Article 10 to detect online child sexual abuse on a specific service.
2023/07/28
Committee: LIBE
Amendment 935 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point b
(b) where the draft implementation plan concerns an intended detection order concerning new child sexual abuse material and the solicitation of children other than the renewal of a previously issued detection order without any substantive changes, conduct a data protection impact assessment and a prior consultation procedure as referred to in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to the measures set out in the implementation plan;
2023/07/28
Committee: LIBE
Amendment 967 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b a (new)
(ba) the provider has failed to take all reasonable and proportionate mitigation measures within the meaning of Article 4 to prevent and minimise the risk of the service being used for the purpose of online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 1023 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 2
To that aim, they shall take into account all relevant parameters, including: (i) the availability of sufficiently reliable detection technologies, in that they can be deployed without undermining the security of the service in question and they limit to the maximum extent possible the rate of errors regarding the detection; (ii) the suitability and effectiveness of the available technologies for achieving the objectives of this Regulation; as well as (iii) the impact of the measures on the rights of the users affected, thereby ensuring that detection orders are only requested and issued when sufficiently reliable technologies in accordance with point (i) are available and that the least intrusive measures are chosen, in accordance with Article 10, from among several equally effective measures.
2023/07/28
Committee: LIBE
Amendment 1030 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 – point a
(a) where information gathered in the risk assessment process indicates that that risk is limited to an identifiable part or component of a service, where possible without prejudice to the effectiveness of the measure, the required measures are only applied in respect of that part or component;
2023/07/28
Committee: LIBE
Amendment 1035 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 1
The competent judicial authority or independent administrative authority shall specify in the detection order the period during which it applies, indicating the start date and the end date, within which the providers of hosting services and providers of interpersonal communications services shall prove that their service is no longer misused for child sexual abuse and that the specific service provided no longer poses a risk of child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 1050 #
Proposal for a regulation
Article 7 a (new)
Article 7a
Safeguards on encrypted services
For the scope of this Regulation and for the sole purpose of preventing and combating child sexual abuse, providers of interpersonal communications services shall be subject to obligations to prevent, detect, report and remove online child sexual abuse on all their services, which may also include those covered by end-to-end encryption, when there is a significant risk that their specific service is misused for online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment established in Article 3 of this Regulation. The technologies deployed to execute the detection order pursuant to Article 7 of this Regulation shall never prohibit encryption or make it impossible, shall only be deployed after prior authorisation by the Coordinating Authority, in consultation with the competent data protection authority, and shall be subject to constant monitoring and auditing by the competent data protection authority to verify their compliance with Union law.
2023/07/28
Committee: LIBE
Amendment 1129 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communication services that have received a detection order, or that undertake voluntary detection measures in accordance with Article 4a, shall do so by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
2023/07/28
Committee: LIBE
Amendment 1137 #
Proposal for a regulation
Article 10 – paragraph 2
2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of using voluntary measures, when authorised, or executing a detection order. The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.
2023/07/28
Committee: LIBE
Amendment 1138 #
Proposal for a regulation
Article 10 – paragraph 2
2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the detection order or voluntary detection. The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.
2023/07/28
Committee: LIBE
Amendment 1152 #
Proposal for a regulation
Article 10 – paragraph 3 – point d
(d) sufficiently reliable, in that they limit to the maximum extent possible the rate of errors regarding the detection of content representing online child sexual abuse and, where such occasional errors occur, their consequences are rectified without delay;
2023/07/28
Committee: LIBE
Amendment 1155 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) the technologies used to detect patterns of possible solicitation of children are limited to the use of relevant key indicators and objectively identified risk factors such as age difference and the likely involvement of a child in the scanned communication, without prejudice to the right to human review.
2023/07/28
Committee: LIBE
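Amendment 1155 above confines solicitation detection to a narrow set of signals (such as age difference and the likely involvement of a child) and preserves human review. The following sketch illustrates, under stated assumptions, what such a deliberately limited, rule-based pre-filter could look like; the field names and threshold value are hypothetical and not drawn from the Regulation or any provider's system.

```python
# Illustrative only: a rule-based pre-filter that uses just the narrow signals the
# amendment names and always routes any flag to human review rather than acting
# automatically. Field names and the threshold are assumptions.
from dataclasses import dataclass

AGE_DIFFERENCE_THRESHOLD = 5  # hypothetical value, years


@dataclass
class ConversationMetadata:
    reported_age_a: int
    reported_age_b: int
    child_likely_involved: bool  # e.g. derived from account age-assurance signals


def flag_for_human_review(meta: ConversationMetadata) -> bool:
    """Return True if the limited indicators warrant escalation to a human reviewer."""
    age_gap = abs(meta.reported_age_a - meta.reported_age_b)
    return meta.child_likely_involved and age_gap >= AGE_DIFFERENCE_THRESHOLD
```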
Amendment 1162 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) not able to prohibit or make end-to-end encryption impossible.
2023/07/28
Committee: LIBE
Amendment 1171 #
Proposal for a regulation
Article 10 – paragraph 4 – point a
(a) take all the necessary measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly necessary to use voluntary measures, when authorised, or execute the detection orders addressed to them;
2023/07/28
Committee: LIBE
Amendment 1340 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1
Victims residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where they reside, information and a referral to support regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12. Persons with disabilities shall have the right to ask for and receive such information in a manner accessible to them.
2023/07/28
Committee: LIBE
Amendment 1357 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known or new child sexual abuse material depicting them removed or to have access thereto disabled by the provider, complemented in a timely manner and, if possible and appropriate, also included in the list of indicators used to prevent the further dissemination of these items and submitted to the Coordinating Authority in accordance with Article 36.
2023/07/28
Committee: LIBE
Amendment 1361 #
Proposal for a regulation
Article 21 – paragraph 1 a (new)
1a. Each Member State shall ensure the functioning of hotlines, including through funding and capacity building, in order for victims and their families to receive support from the competent authority in a timely manner.
2023/07/28
Committee: LIBE
Amendment 1363 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Victims residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services remove or disable access to one or more specific items of known child sexual abuse material depicting them, taking into account the vulnerabilities of the person depicted. Persons with disabilities shall have the right to ask for and receive any information relating to such support in a manner accessible to them. All professionals likely to come into contact with child victims of sexual abuse online should be adequately trained and able to recognise and address the specific needs of victims.
2023/07/28
Committee: LIBE
Amendment 1391 #
Proposal for a regulation
Article 23 – paragraph 1
1. As referred to in Article 12 of the Digital Services Act Regulation, providers of relevant information society services shall establish a single point of contact allowing for direct communication, by electronic means, with the Coordinating Authorities, other competent authorities of the Member States, the Commission and the EU Centre, for the application of this Regulation.
2023/07/28
Committee: LIBE
Amendment 1398 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 3
The Coordinating Authority shall in any event be responsible for ensuring coordination and overseeing the implementation at national level in respect of those matters, including issues related to prevention, education and awareness raising and the organisation of regular training activities for officials who deal with cases involving children, including in law enforcement authorities, and for contributing to the effective, efficient and consistent application and enforcement of this Regulation throughout the Union.
2023/07/28
Committee: LIBE
Amendment 1580 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) Referring victims to the appropriate national child protection services;
2023/07/28
Committee: LIBE
Amendment 1593 #
(6a) support Member States in designing preventive measures, such as awareness-raising campaigns to combat child sexual abuse, with a specific focus on girls and other prevalent demographics, including by: (a) Acting on behalf of victims in liaising with other relevant authorities of the Member States for reparations and all other victim support programmes; (b) Referring victims to the appropriate child protection services, and to pro bono legal support services; (c) Facilitating access to qualified health care and support services, including mental health and psychological support;
2023/07/28
Committee: LIBE
Amendment 1634 #
Proposal for a regulation
Article 46 – paragraph 2
2. The EU Centre shall give providers of hosting services, providers of interpersonal communications services and providers of internet access services access to the databases of indicators referred to in Article 44, where and to the extent necessary for them to put in place voluntary measures, when authorised, and execute the detection or blocking orders that they received in accordance with Articles 7 or 16. It shall take measures to ensure that such access remains limited to what is strictly necessary for the period of application of the detection or blocking orders concerned as well as for the execution of the voluntary measures, when authorised, and that such access does not in any way endanger the proper operation of those databases and the accuracy and security of the data contained therein.
2023/07/28
Committee: LIBE
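Amendment 1634 above requires that provider access to the indicator databases be limited to what is strictly necessary and to the period during which a detection or blocking order (or authorised voluntary measure) applies. A minimal sketch of that time-scoping idea follows; the class and field names are hypothetical and the check is only illustrative of the principle, not of any actual EU Centre interface.

```python
# Illustrative sketch of the time-scoping in Article 46(2): indicator access is only
# honoured while an order addressed to the requesting provider is in force.
# All names here are assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class OrderScope:
    provider_id: str
    start: datetime
    end: datetime


def access_permitted(provider_id: str, active_orders: list[OrderScope],
                     now: datetime | None = None) -> bool:
    """Allow indicator-database access only during the period of an applicable order."""
    now = now or datetime.now(timezone.utc)
    return any(o.provider_id == provider_id and o.start <= now <= o.end
               for o in active_orders)
```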
Amendment 1667 #
Proposal for a regulation
Article 48 – paragraph 1 a (new)
1a. Where the EU Centre receives a report from a Hotline, or from a provider that has indicated that the report is based on information received from a Hotline, the EU Centre shall monitor the removal of the child sexual abuse material or cooperate with the Hotline to track its status, in order to avoid duplicate reporting of the same material that has already been reported to the national law enforcement authorities.
2023/07/28
Committee: LIBE
Amendment 1711 #
Proposal for a regulation
Article 50 – paragraph 2 – point c
(c) information resulting from research or other activities conducted by Member States’ authorities, other Union institutions, bodies, offices and agencies, the competent authorities of third countries, international organisations, research centres, hotlines and civil society organisations.
2023/07/28
Committee: LIBE
Amendment 1755 #
Proposal for a regulation
Article 54 – paragraph 1
1. Where necessary for the performance of its tasks under this Regulation, the EU Centre may cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations acting in the public interest, hotlines and semi-public organisations.
2023/07/28
Committee: LIBE
Amendment 1758 #
Proposal for a regulation
Article 54 – paragraph 1 a (new)
1a. In particular, the cooperation with the EU Centre referred to in paragraph 1 may include the following: (a) supporting the Commission in the preparation of the guidelines referred to in Article 3(8), Article 4(5), Article 6(4) and Article 11; (b) updating the databases of indicators referred to in Article 44; (c) developing new detection technologies and improving existing ones; (d) making technologies available to providers for the execution of detection orders issued to them, in accordance with Article 50(1).
2023/07/28
Committee: LIBE
Amendment 1761 #
Proposal for a regulation
Article 54 – paragraph 2 a (new)
2a. The EU Centre shall cooperate with other organisations and bodies carrying out similar functions in other jurisdictions, such as the National Center for Missing and Exploited Children (‘NCMEC’) and the Canadian Centre for Child Protection, which serve the same purpose as this Regulation, including with a view to avoiding potential duplication of reporting obligations for providers.
2023/07/28
Committee: LIBE
Amendment 1778 #
Proposal for a regulation
Article 57 – paragraph 1 – point f
(f) appoint the members of the Technology Committee, of the Children's Rights and Survivors Advisory Board and of any other advisory group it may establish;
2023/07/28
Committee: LIBE
Amendment 1804 #
Proposal for a regulation
Article 66 – paragraph 6 a (new)
6a. (d) evaluate the effectiveness of new and existing detection technology through unknown datasets of verified indicators. (e) establish best practices on safety by design and the voluntary use of technologies, including prevention and detection technologies, as part of providers’ mitigation measures. (f) introduce a regular reviewing and reporting process to assess and share expertise on the most recent technological innovations and developments related to detection technology.
2023/07/28
Committee: LIBE
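Amendment 1804 above envisages evaluating the effectiveness of new and existing detection technology against unknown (held-out) datasets of verified indicators. The sketch below shows, under stated assumptions, how such a benchmark could report the error rates that Article 10(3)(d) is concerned with; the detector interface and label format are assumptions, not part of the Regulation.

```python
# A minimal sketch of benchmarking a detector on a held-out labelled dataset,
# reporting precision, recall and the false-positive rate. Illustrative only.
from typing import Callable, Iterable, Tuple


def evaluate_detector(
    detector: Callable[[bytes], bool],
    labelled_samples: Iterable[Tuple[bytes, bool]],  # (content, is_abuse_material)
) -> dict[str, float]:
    """Count outcomes over the held-out set and derive standard error metrics."""
    tp = fp = tn = fn = 0
    for content, is_positive in labelled_samples:
        predicted = detector(content)
        if predicted and is_positive:
            tp += 1
        elif predicted and not is_positive:
            fp += 1
        elif not predicted and is_positive:
            fn += 1
        else:
            tn += 1
    return {
        "precision": tp / (tp + fp) if (tp + fp) else 0.0,
        "recall": tp / (tp + fn) if (tp + fn) else 0.0,
        "false_positive_rate": fp / (fp + tn) if (fp + tn) else 0.0,
    }
```

Keeping the evaluation set unknown to the technology vendor is what makes the reported error rates credible, since a detector cannot be tuned to the specific samples it will be tested on.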
Amendment 1829 #
Proposal for a regulation
Article 83 – paragraph 1 – point e a (new)
(ea) Educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, where possible, including the impact, outreach and effectiveness of the activities carried out on the targeted audience, disaggregated into different categories based on demographics;
2023/07/28
Committee: LIBE
Amendment 1830 #
Proposal for a regulation
Article 83 – paragraph 1 – point e b (new)
(eb) Measures put in place by the providers to prevent online child sexual abuse, such as technological systems and processes, where possible, including the impact, outreach and effectiveness of the activities carried out on the targeted audience.
2023/07/28
Committee: LIBE
Amendment 1850 #
Proposal for a regulation
Article 83 – paragraph 2 – point i a (new)
(ia) the measures taken regarding prevention and victim assistance programmes, including the number of children in primary education taking part in awareness-raising campaigns and education programmes about the risks of all forms of sexual exploitation of children, including in the online environment.
2023/07/28
Committee: LIBE
Amendment 1871 #
Proposal for a regulation
Article 83 – paragraph 3 – point j a (new)
(ja) the measures taken by Member States regarding prevention, awareness raising and victim assistance programmes, including the impact, outreach and effectiveness of the activities carried out on the targeted audience, where possible disaggregated into different categories based on demographics, and including best practices and lessons learned from prevention programmes.
2023/07/28
Committee: LIBE