
78 Amendments of Fabienne KELLER related to 2022/0155(COD)

Amendment 328 #
Proposal for a regulation
Recital 14 a (new)
(14a) Given the severity of these crimes, the long-lasting negative consequences for the victims and the risk of revictimisation as a result of the dissemination of known material, new material, as well as activities constituting the solicitation of children, it is essential that this Regulation provides specific obligations for providers of hosting services and providers of interpersonal communications services to prevent, detect, report and remove child sexual abuse material in all their services, including interpersonal communications services, which may also be covered by end-to-end encryption, in light of the prevalence of the dissemination of child sexual abuse material, including the solicitation of children, in interpersonal communications services.
2023/07/28
Committee: LIBE
Amendment 344 #
Proposal for a regulation
Recital 17
(17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect and prevent online child sexual abuse in their services and indicate as part of the risk reporting their willingness and preparedness to eventually be issued a detection order under this Regulation, if deemed necessary by the competent national authority.
2023/07/28
Committee: LIBE
Amendment 400 #
Proposal for a regulation
Recital 27 a (new)
(27a) Due to the nature of child sexual abuse material, the sharing of such content does not stop at borders. The competent authorities and the EU Centre should therefore have a cooperation procedure with the American NCMEC (National Center for Missing and Exploited Children) to detect and remove such content more effectively.
2023/07/28
Committee: LIBE
Amendment 428 #
Proposal for a regulation
Recital 36
(36) Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access to the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to be assisted by the EU Centre in this regard, via the Coordinating Authorities. Providers should create and run an accessible, age-appropriate and user-friendly mechanism allowing users to flag any instances of potential online child sexual abuse on their platform. The providers should also offer reasonable assistance to the users who report these cases, such as implementing visible alert and alarm systems on their platforms, as well as providing links to local organisations such as hotlines, helplines, or victims' rights organisations, to assist potential victims.
2023/07/28
Committee: LIBE
Amendment 438 #
Proposal for a regulation
Recital 49
(49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of detection orders, removal orders or blocking orders that it issued, are effectively complied with in practice, each Coordinating Authority should be able to carry out searches, using the relevant indicators provided by the EU Centre, and reacting in a timely manner to the evolving trends of child sexual abuse material dissemination and monetisation, to detect the dissemination of known or new child sexual abuse material through publicly available material in the hosting services of the providers concerned.
2023/07/28
Committee: LIBE
Amendment 451 #
Proposal for a regulation
Recital 57
(57) Certain providers of relevant information society services offer their services in several or even all Member States, whilst under this Regulation only a single Member State has jurisdiction in respect of a given provider. It is therefore imperative that the Coordinating Authority designated by the Member State having jurisdiction takes account of the interests of all users in the Union when performing its tasks and using its powers, without making any distinction depending on elements such as the users’ location or nationality, and that Coordinating Authorities cooperate with each other in an effective and efficient manner. To facilitate such cooperation, the necessary mechanisms and information-sharing systems should be provided for. That cooperation shall be without prejudice to the possibility for Member States to provide for regular exchanges of views with other public authorities where relevant for the performance of the tasks of those other authorities and of the Coordinating Authority, and to receive reports concerning the trends in the dissemination and monetisation of child sexual abuse material from relevant organisations acting in the public interest against child sexual abuse and other stakeholders, including service providers.
2023/07/28
Committee: LIBE
Amendment 477 #
Proposal for a regulation
Recital 70
(70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are on the front line in the fight against online child sexual abuse. This role played by hotlines should be reinforced and they should continue to facilitate this fight. Each Member State should ensure that at least one official hotline is operating in its territory. The EU Centre should leverage the network of hotlines and encourage them to work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Anonymous public reporting is crucial to countering child sexual abuse, and hotlines have created a worldwide network and procedures for the identification and removal of child sexual abuse material. Member States should ensure that the public has the possibility to anonymously report child sexual abuse material and child sexual exploitation activities to hotlines specialised in combatting online child sexual abuse material and should safeguard the role of such hotlines in anonymous public reporting. The promotion of hotlines by the EU Centre and the Coordinating Authorities through the educational systems of Member States, in order to educate youth and reach potential victims, is of great importance. The experience and expertise of hotlines and other non-governmental organisations involved in reporting or proactively searching for child sexual abuse material should help the EU Centre and Coordinating Authorities to design appropriate prevention techniques and awareness campaigns and to keep the databases of indicators up to date.
2023/07/28
Committee: LIBE
Amendment 481 #
Proposal for a regulation
Recital 72
(72) The EU Centre’s headquarters should be decided on the basis of objective criteria between the Parliament and the Council in an ordinary legislative procedure.
2023/07/28
Committee: LIBE
Amendment 485 #
Proposal for a regulation
Recital 74
(74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with an advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to the detection of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology, and to support the evolution of those technologies and the development of new ones.
2023/07/28
Committee: LIBE
Amendment 486 #
Proposal for a regulation
Recital 74
(74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with an advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to the detection and prevention of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology.
2023/07/28
Committee: LIBE
Amendment 487 #
Proposal for a regulation
Recital 74 a (new)
(74a) The Technology Committee could therefore establish a certification for technologies which could be used, at their request, by online service providers to detect child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 515 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d a (new)
(da) obligations on providers of online search engines and any other artificial intelligence systems to delist or disable access to specific items of child sexual abuse material, or both;
2023/07/28
Committee: LIBE
Amendment 555 #
Proposal for a regulation
Article 2 – paragraph 1 – point e a (new)
(ea) “online search engine” means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
2023/07/28
Committee: LIBE
Amendment 556 #
Proposal for a regulation
Article 2 – paragraph 1 – point e b (new)
(eb) ‘intermediary service’ means a service as defined in Article 3, point (g), of Regulation (EU) 2022/2065;
2023/07/28
Committee: LIBE
Amendment 557 #
Proposal for a regulation
Article 2 – paragraph 1 – point e c (new)
(ec) ‘artificial intelligence system’ (AI system) means software as defined in Article 3(1) of Regulation (EU) .../... on Artificial Intelligence (Artificial Intelligence Act);
2023/07/28
Committee: LIBE
Amendment 569 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv a (new)
(iva) an online search engine;
2023/07/28
Committee: LIBE
Amendment 570 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv b (new)
(ivb) an artificial intelligence system.
2023/07/28
Committee: LIBE
Amendment 581 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 18 years;
2023/07/28
Committee: LIBE
Amendment 582 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 18 years;
2023/07/28
Committee: LIBE
Amendment 593 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘victim’ means a person residing in the European Union who suffered child sexual abuse offences while under the age of 18. For the purpose of exercising the victim’s rights recognised in this Regulation, parents and guardians, as well as any person who was under 18 at the time the material was made and whose material has been hosted or disseminated in the European Union, are to be considered victims;
2023/07/28
Committee: LIBE
Amendment 603 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(wa) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous information from the public about potential child sexual abuse material and online child sexual exploitation, which is officially recognised by its home Member State as referred to in Directive 2011/93/EU of the European Parliament and of the Council and whose articles of association include the mission of combatting child sexual abuse material;
2023/07/28
Committee: LIBE
Amendment 613 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess, for each such service that they offer, the risk of use of the service for the purpose of online child sexual abuse, which requires a targeted and tailor-made response.
2023/07/28
Committee: LIBE
Amendment 625 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability of functionalities to prevent and address online child sexual abuse and the risks referred to in paragraph 1, including through the following:
2023/07/28
Committee: LIBE
Amendment 627 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability of functionalities to prevent and address the risk referred to in paragraph 1, including through the following:
2023/07/28
Committee: LIBE
Amendment 628 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability of functionalities to prevent and address the risk referred to in paragraph 1, including through the following:
2023/07/28
Committee: LIBE
Amendment 634 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 2 a (new)
– implementing functionalities and protocols to prevent and reduce the risk of online child sexual abuse;
– information and awareness campaigns educating and warning users of the risk of online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 646 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag and report online child sexual abuse to the provider through tools that are easily accessible and age-appropriate, with a timely response;
2023/07/28
Committee: LIBE
Amendment 650 #
– Functionalities enabling the detection of known child sexual abuse material on upload;
– Functionalities preventing uploads from the dark web;
2023/07/28
Committee: LIBE
Amendment 660 #
Proposal for a regulation
Article 3 – paragraph 2 – point d
(d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, whether the service is available directly to end users, and the impact thereof on that risk;
2023/07/28
Committee: LIBE
Amendment 664 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point i
(i) the extent to which the service is used or is likely to be used by children, such as through an assessment of public surfaces, behavioural signals, the frequency of user reports of online child sexual abuse, and the results of random sampling of content;
2023/07/28
Committee: LIBE
Amendment 688 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 3 a (new)
– Enabling users to create usernames that contain a representation about, or imply, the user’s age;
– Enabling child users to create usernames that contain location information on child users;
– Enabling users to know or infer the location of child users.
2023/07/28
Committee: LIBE
Amendment 693 #
Proposal for a regulation
Article 3 – paragraph 2 a (new)
2a. When providers of hosting services and providers of interpersonal communication services put forward age assurance or age verification systems as mitigating measures, those systems shall meet the following criteria:
(a) protect the privacy of users and not disclose data gathered for the purposes of age assurance for any other purpose;
(b) not collect data that is not necessary for the purposes of age assurance;
(c) be proportionate to the risks associated with the product or service that presents a risk of misuse for child sexual abuse;
(d) provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
2023/07/28
Committee: LIBE
Amendment 733 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall take reasonable and proportionate mitigation measures, tailored to the risk identified pursuant to Article 3 and their service, to minimise that risk. Such measures shall include some or all of the following:
2023/07/28
Committee: LIBE
Amendment 734 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures, tailored to their specific service and the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:
2023/07/28
Committee: LIBE
Amendment 735 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, including tools for monitoring phrases and indicators on public surfaces, its decision-making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions, reporting tools that are effective, easily accessible and age-appropriate, or the protocols for investigating the reported content and taking appropriate action;
2023/07/28
Committee: LIBE
Amendment 741 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) Designing educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, including child-appropriate information;
2023/07/28
Committee: LIBE
Amendment 758 #
Proposal for a regulation
Article 4 – paragraph 1 – point b
(b) reinforcing the provider’s internal processes or the internal supervision of the functioning of the service, user testing and feedback collection;
2023/07/28
Committee: LIBE
Amendment 759 #
Proposal for a regulation
Article 4 – paragraph 1 – point b a (new)
(ba) Implementing and constantly innovating functionalities and protocols to prevent and reduce the risk of online child sexual abuse, and regularly assessing their effectiveness in light of the latest technological developments and trends in the dissemination and monetisation of child sexual abuse material;
2023/07/28
Committee: LIBE
Amendment 775 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
1a. Providers of hosting services and providers of interpersonal communications services shall continue the voluntary use of specific technologies, as mitigation measures, for the processing of personal and other data to the extent strictly necessary to detect, report and remove online child sexual abuse on their services and to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment conducted or updated in accordance with Article 3 and prior authorisation from the Coordinating Authority.
2023/07/28
Committee: LIBE
Amendment 804 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary and proportionate age verification and age assessment measures to reliably differentiate between child users and adult users on their services, enabling them to take the mitigation measures and protect child users. Age assurance or age verification systems as mitigation measures shall be implemented only if they meet the criteria set out in Article 3, paragraph 2a, of this Regulation.
2023/07/28
Committee: LIBE
Amendment 808 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary and proportionate age assurance and age assessment measures to reliably differentiate between child users and adult users on their services, enabling them to take the mitigation measures and protect child users.
2023/07/28
Committee: LIBE
Amendment 838 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) any mitigation measures taken and those that require prior authorisation pursuant to Article 4.
2023/07/28
Committee: LIBE
Amendment 845 #
Proposal for a regulation
Article 5 – paragraph 4 – point a (new)
(a) Where the Coordinating Authority considers that the mitigation measures taken do not comply with Article 4, it shall address a decision to the provider requiring it to take the necessary measures so as to ensure that Article 4 is complied with.
2023/07/28
Committee: LIBE
Amendment 853 #
Proposal for a regulation
Article 6
Article 6
Obligations for software application stores
1. Providers of software application stores shall:
(a) make reasonable efforts to assess, where possible together with the providers of software applications, whether each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children;
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).
2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3.
3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures.
4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
deleted
2023/07/28
Committee: LIBE
Amendment 894 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect and prevent online child sexual abuse on a specific service.
2023/07/28
Committee: LIBE
Amendment 935 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point b
(b) where the draft implementation plan concerns an intended detection order concerning new child sexual abuse material and the solicitation of children, other than the renewal of a previously issued detection order without any substantive changes, conduct a data protection impact assessment and a prior consultation procedure as referred to in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to the measures set out in the implementation plan;
2023/07/28
Committee: LIBE
Amendment 967 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b a (new)
(ba) the provider has failed to take all reasonable and proportionate mitigation measures within the meaning of Article 4 to prevent and minimise the risk of the service being used for the purpose of online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 1023 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 2
To that aim, they shall take into account all relevant parameters, including:
(i) the availability of sufficiently reliable detection technologies, in that they can be deployed without undermining the security of the service in question and they limit to the maximum extent possible the rate of errors regarding the detection;
(ii) the suitability and effectiveness of the available technologies for achieving the objectives of this Regulation;
(iii) the impact of the measures on the rights of the users affected,
thereby ensuring that detection orders are only requested and issued when sufficiently reliable technologies in accordance with point (i) are available and that the least intrusive measures are chosen, in accordance with Article 10, from among several equally effective measures.
2023/07/28
Committee: LIBE
Amendment 1030 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 – point a
(a) where information gathered in the risk assessment process indicates that that risk is limited to an identifiable part or component of a service, where possible without prejudice to the effectiveness of the measure, the required measures are only applied in respect of that part or component;
2023/07/28
Committee: LIBE
Amendment 1035 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 1
The competent judicial authority or independent administrative authority shall specify in the detection order the period during which it applies, indicating the start date and the end date, within which the providers of hosting services and providers of interpersonal communications services shall prove that their service is no longer misused for child sexual abuse and that the specific service provided no longer poses a risk of child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 1050 #
Proposal for a regulation
Article 7 a (new)
Article 7a
Safeguards on encrypted services
For the scope of this Regulation and for the sole purpose of preventing and combating child sexual abuse, providers of interpersonal communications services shall be subject to obligations to prevent, detect, report and remove online child sexual abuse on all their services, which may also include those covered by end-to-end encryption, when there is a significant risk that their specific service is misused for online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment established in Article 3 of this Regulation. The technologies deployed to execute the detection order pursuant to Article 7 of this Regulation shall never prohibit encryption or make it impossible, shall only be deployed after prior authorisation by the Coordinating Authority, in consultation with the competent data protection authority, and shall be subject to constant monitoring and auditing by the competent data protection authority to verify their compliance with Union law.
2023/07/28
Committee: LIBE
Amendment 1137 #
Proposal for a regulation
Article 10 – paragraph 2
2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of using voluntary measures, when authorised, or executing a detection order. The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.
2023/07/28
Committee: LIBE
Amendment 1152 #
Proposal for a regulation
Article 10 – paragraph 3 – point d
(d) sufficiently reliable, in that they limit to the maximum extent possible the rate of errors regarding the detection of content representing online child sexual abuse and, where such occasional errors occur, their consequences are rectified without delay;
2023/07/28
Committee: LIBE
Amendment 1155 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) the technologies used to detect patterns of possible solicitation of children are limited to the use of relevant key indicators and objectively identified risk factors such as age difference and the likely involvement of a child in the scanned communication, without prejudice to the right to human review.
2023/07/28
Committee: LIBE
Amendment 1162 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) not able to prohibit or make end-to-end encryption impossible.
2023/07/28
Committee: LIBE
Amendment 1171 #
Proposal for a regulation
Article 10 – paragraph 4 – point a
(a) take all the necessary measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly necessary to use voluntary measures, when authorised, or execute the detection orders addressed to them;
2023/07/28
Committee: LIBE
Amendment 1340 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1
Victims residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where they reside, information and referral to support regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12. Persons with disabilities shall have the right to ask for and receive such information in a manner accessible to them.
2023/07/28
Committee: LIBE
Amendment 1357 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known or new child sexual abuse material depicting them removed or to have access thereto disabled by the provider, completed in a timely manner and, if possible and appropriate, also included in the list of indicators used to prevent the further dissemination of these items and submitted to the Coordinating Authority in accordance with Article 36.
2023/07/28
Committee: LIBE
Amendment 1361 #
Proposal for a regulation
Article 21 – paragraph 1 a (new)
1a. Each Member State shall ensure the functioning of hotlines, including through funding and capacity building, in order for victims and their families to receive support from the competent authority in a timely manner.
2023/07/28
Committee: LIBE
Amendment 1363 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Victims residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services remove or disable access to one or more specific items of known child sexual abuse material depicting them, taking into account the vulnerabilities of the person depicted. Persons with disabilities shall have the right to ask for and receive any information relating to such support in a manner accessible to them. All professionals likely to come into contact with child victims of sexual abuse online should be adequately trained and able to recognise and address the specific needs of victims.
2023/07/28
Committee: LIBE
Amendment 1391 #
Proposal for a regulation
Article 23 – paragraph 1
1. As referred to in Article 12 of the Digital Services Act Regulation, providers of relevant information society services shall establish a single point of contact allowing for direct communication, by electronic means, with the Coordinating Authorities, other competent authorities of the Member States, the Commission and the EU Centre, for the application of this Regulation.
2023/07/28
Committee: LIBE
Amendment 1515 #
Proposal for a regulation
Article 39 – paragraph 1
1. Coordinating Authorities shall cooperate with each other, any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre and other relevant Union agencies, including Europol, the European Union Agency for Cybersecurity (ENISA), and other organisations such as NCMEC to facilitate the performance of their respective tasks under this Regulation and ensure its effective, efficient and consistent application and enforcement.
2023/07/28
Committee: LIBE
Amendment 1541 #
Proposal for a regulation
Article 42 – paragraph 1
The choice of the location of the seat of the EU Centre shall be made in accordance with the ordinary legislative procedure, based on the following criteria:
(a) it shall not affect the EU Centre’s execution of its tasks or the organisation of its governance structure;
(b) it shall ensure that the EU Centre is able to recruit the highly qualified and specialised staff it requires to perform the tasks provided for by this Regulation;
(c) it shall ensure that it can be set up on site upon the entry into force of this Regulation;
(d) it shall ensure appropriate accessibility of the location, the existence of adequate education facilities for the children of staff members, and appropriate access to the labour market, social security and medical care for both children and spouses;
(e) it shall enable close cooperation with EU institutions, bodies and agencies;
(f) it shall ensure sustainability and digital security and connectivity with regard to physical and IT infrastructure and working conditions.
2023/07/28
Committee: LIBE
Amendment 1580 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) Referring victims to the appropriate national child protection services;
2023/07/28
Committee: LIBE
Amendment 1593 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 a (new)
(6a) support Member States in designing preventive measures, such as awareness-raising campaigns to combat child sexual abuse, with a specific focus on girls and other prevalent demographics, including by:
(a) Acting on behalf of victims in liaising with other relevant authorities of the Member States for reparations and all other victim support programmes;
(b) Referring victims to the appropriate child protection services, and to pro bono legal support services;
(c) Facilitating access to qualified health care and support services, including mental health and psychological support;
2023/07/28
Committee: LIBE
Amendment 1634 #
Proposal for a regulation
Article 46 – paragraph 2
2. The EU Centre shall give providers of hosting services, providers of interpersonal communications services and providers of internet access services access to the databases of indicators referred to in Article 44, where and to the extent necessary for them to put in place voluntary measures, when authorised, and execute the detection or blocking orders that they received in accordance with Articles 7 or 16. It shall take measures to ensure that such access remains limited to what is strictly necessary for the period of application of the detection or blocking orders concerned as well as for the execution of the voluntary measures, when authorised, and that such access does not in any way endanger the proper operation of those databases and the accuracy and security of the data contained therein.
2023/07/28
Committee: LIBE
Amendment 1667 #
Proposal for a regulation
Article 48 – paragraph 1 a (new)
1a. Where the EU Centre receives a report from a Hotline, or from a provider who indicated that the report is based on the information received from a Hotline, the EU Centre shall monitor the removal of child sexual abuse material or cooperate with the Hotline to track its status to avoid duplicated reporting on the same material that has already been reported to the national law enforcement authorities.
2023/07/28
Committee: LIBE
Amendment 1711 #
Proposal for a regulation
Article 50 – paragraph 2 – point c
(c) information resulting from research or other activities conducted by Member States’ authorities, other Union institutions, bodies, offices and agencies, the competent authorities of third countries, international organisations, research centres, hotlines and civil society organisations.
2023/07/28
Committee: LIBE
Amendment 1739 #
Proposal for a regulation
Article 53 – paragraph 1 a (new)
1a. Europol and the EU Centre shall cooperate with NCMEC in the fight against child sexual abuse material. This cooperation may consist of sharing their databases of known child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1755 #
Proposal for a regulation
Article 54 – paragraph 1
1. Where necessary for the performance of its tasks under this Regulation, the EU Centre may cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations acting in the public interest, hotlines and semi-public organisations.
2023/07/28
Committee: LIBE
Amendment 1758 #
Proposal for a regulation
Article 54 – paragraph 1 a (new)
1a. In particular, the cooperation with the EU Centre referred to in paragraph 1 may include the following:
(a) supporting the Commission in the preparation of the guidelines referred to in Article 3(8), Article 4(5), Article 6(4) and Article 11;
(b) updating the databases of indicators referred to in Article 44;
(c) innovating new and existing detection technologies;
(d) making technologies available to providers for the execution of detection orders issued to them, in accordance with Article 50(1).
2023/07/28
Committee: LIBE
Amendment 1761 #
Proposal for a regulation
Article 54 – paragraph 2 a (new)
2a. The EU Centre shall cooperate with other organisations and bodies carrying out similar functions in other jurisdictions, such as the National Center for Missing and Exploited Children (‘NCMEC’) and the Canadian Centre for Child Protection, among others, which serve the same purpose as this Regulation, including in order to avoid potential duplication of reporting obligations for providers.
2023/07/28
Committee: LIBE
Amendment 1778 #
Proposal for a regulation
Article 57 – paragraph 1 – point f
(f) appoint the members of the Technology Committee, of the Children's Rights and Survivors Advisory Board and of any other advisory group it may establish;
2023/07/28
Committee: LIBE
Amendment 1804 #
Proposal for a regulation
Article 66 – paragraph 6 a (new)
6a.
(d) evaluate the effectiveness of new and existing detection technology through unknown datasets of verified indicators;
(e) establish best practices on safety by design and the voluntary use of technologies, including prevention and detection technologies, as part of providers’ mitigation measures;
(f) introduce a regular reviewing and reporting process to assess and share expertise on the most recent technological innovations and developments related to detection technology.
2023/07/28
Committee: LIBE
Amendment 1829 #
Proposal for a regulation
Article 83 – paragraph 1 – point e a (new)
(ea) Educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, where possible, including the impact, outreach and effectiveness of the activities carried out on the targeted audience, disaggregated into different categories based on demographics;
2023/07/28
Committee: LIBE
Amendment 1830 #
Proposal for a regulation
Article 83 – paragraph 1 – point e b (new)
(eb) Measures put in place by the providers to prevent online child sexual abuse, such as technological systems and processes, where possible, including the impact, outreach and effectiveness of the activities carried out on the targeted audience.
2023/07/28
Committee: LIBE
Amendment 1850 #
Proposal for a regulation
Article 83 – paragraph 2 – point i a (new)
(ia) the measures taken regarding prevention and victim assistance programmes, including the number of children in primary education who are taking part in awareness raising campaigns and through education programmes about the risks of all forms of sexual exploitation of children, including in the online environment.
2023/07/28
Committee: LIBE
Amendment 1871 #
Proposal for a regulation
Article 83 – paragraph 3 – point j a (new)
(ja) the measures taken by Member States regarding prevention, awareness raising, and victim assistance programmes, including the impact, outreach and effectiveness of the activities carried out on the targeted audience, where possible, disaggregated into different categories based on demographics and including best practices and lessons learned of prevention programmes.
2023/07/28
Committee: LIBE