
Activities of Karen MELCHIOR related to 2022/0155(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
2023/06/28
Committee: FEMM
Dossiers: 2022/0155(COD)
Documents: PDF(331 KB) DOC(193 KB)
Authors: Heléne FRITZON (MEP ID 197391)

Amendments (281)

Amendment 46 #
Proposal for a regulation
Citation 6 a (new)
Having regard to the complementary impact assessment37a of the European Parliament, _________________ 37a PE 740.248 https://www.europarl.europa.eu/RegData/etudes/STUD/2023/740248/EPRS_STU(2023)740248_EN.pdf
2023/05/08
Committee: FEMM
Amendment 48 #
Proposal for a regulation
Recital 1
(1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. While the proportion of child sexual abuse material that affects boys is growing, child sexual abuse has a disproportionate impact on girls: the vast majority of child sexual abuse material depicts girls, and girls are overrepresented in cases of solicitation of children, while men are overrepresented as perpetrators. According to reports, an estimated 96% of child sexual abuse material in 2021 affected girls. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), in line with the United Nations Convention on the Rights of the Child (UNCRC), which has been ratified by all Member States, and to protect society at large. Users of such services offered in the Union should be able to trust that the services concerned can be used safely, especially by children.
2023/05/08
Committee: FEMM
Amendment 53 #
Proposal for a regulation
Recital 1 a (new)
(1a) A growing number of teenagers are sharing intimate images, despite this being prohibited in a majority of Member States. The implementation of measures to detect new abuse material would inevitably flag all such images as abuse material, resulting in a large number of false positives, but also in the investigation of those teenagers. This would significantly infringe on children’s right to privacy, as guaranteed by the Charter of Fundamental Rights of the European Union (‘Charter’) and in line with the United Nations Convention on the Rights of the Child (UNCRC), which has been ratified by all Member States. It would also result in stigma that disproportionately affects girls37b. Therefore, services should warn children about the risks of sharing images and give them guidance on what to do if they do so and something goes wrong. _________________ 37b The outcomes of sexting for children and adolescents: A systematic review of the literature https://doi.org/10.1016/j.adolescence.2021.08.009
2023/05/08
Committee: FEMM
Amendment 56 #
Proposal for a regulation
Recital 1 b (new)
(1b) Often, teenagers are manipulated into sharing images, or consensually share images which are later shared without their consent. This proposal should provide teenagers with tools to help prevent images from being shared without their consent, in particular through the possibility to submit the image to an EU "take it down" service, which prevents the image from being uploaded to social media websites.
2023/05/08
Committee: FEMM
Amendment 57 #
Proposal for a regulation
Recital 1 c (new)
(1c) The use of software to detect solicitation of children is insufficiently accurate, which means it could result in false positives, or could inadvertently flag child-to-child communications. This poses significant risks, in particular to LGBTQI+ children in hostile households.
2023/05/08
Committee: FEMM
Amendment 58 #
Proposal for a regulation
Recital 1 d (new)
(1d) The combined risks of attempting to detect unknown abuse material and solicitation pose a significant risk to children, and these technologies are also vulnerable to being bypassed by abusers, rendering them ineffective. This legislation should therefore focus on detecting known content, on flagging potential solicitation to the child user in an age-appropriate manner, and on reducing the creation and sharing of self-generated material.
2023/05/08
Committee: FEMM
Amendment 68 #
Proposal for a regulation
Recital 4 a (new)
(4a) The existence of child sexual abuse material implies that child sexual abuse has already taken place. Detecting abuse material is important, but prevention is also vital. Therefore, Member States should significantly strengthen educational measures to help children, teachers and social services identify and report abuse, in particular by teaching children about consent from the earliest age possible, albeit in an age-appropriate manner.
2023/05/08
Committee: FEMM
Amendment 70 #
Proposal for a regulation
Recital 4 b (new)
(4b) Many of the online risks associated with child abuse continue to pose a threat to adults, and many adults have already fallen victim. This regulation should therefore also focus on the prevention of online risks, mandating the integration into applications of features that help children learn about, identify and avoid risks, making use of a “learning through doing” approach.
2023/05/08
Committee: FEMM
Amendment 72 #
Proposal for a regulation
Recital 4 c (new)
(4c) The internet is an empowering and beneficial resource for children, allowing them to socialise, learn and play; however, it can also pose significant risks. Many online services have set limits on the features accessible to children in order to mitigate these risks, but depriving children of these features often encourages them to lie about their age or to try to evade age-verification systems. Therefore, rather than prohibiting access, services should focus on adapting their features and implementing safeguards for children.
2023/05/08
Committee: FEMM
Amendment 74 #
Proposal for a regulation
Recital 4 d (new)
(4d) Developers should focus on responsibility by design, with the goal of preventing abuse, developing risk- mitigation and safety features for applications. To achieve this, it is important that developers understand how children use their services, and the threats they face. Therefore, children should be involved in the development process of risk-mitigation and safety features that are built for them.
2023/05/08
Committee: FEMM
Amendment 78 #
Proposal for a regulation
Recital 9 a (new)
(9a) Case law of the European Court of Justice43a has repeatedly found that indiscriminate monitoring of communications is incompatible with the Charter of Fundamental Rights of the European Union; therefore, detection orders should be targeted at individuals or groups suspected of child sexual abuse, and not at the wider population. _________________ 43a Cases C-511/18, C-512/18, C-520/18 and C-623/17, Court of Justice of the European Union
2023/05/08
Committee: FEMM
Amendment 79 #
Proposal for a regulation
Recital 13
(13) The term ‘online child sexual abuse’ should cover not only the dissemination of material previously detected and confirmed as constituting child sexual abuse material (‘known’ material), but also of material not previously detected but since confirmed as such (‘new’ material), as well as activities constituting the solicitation of children (‘grooming’). That is needed in order to address not only past abuse, the re-victimisation and violation of the victims’ rights it entails, such as those to privacy and protection of personal data, but to also address recent, ongoing and imminent abuse, so as to prevent it as much as possible, to effectively protect children and to increase the likelihood of rescuing victims and stopping perpetrators.
2023/05/08
Committee: FEMM
Amendment 83 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation. Providers should also assess the reasonably foreseeable negative impacts of proposed mitigation measures, and whether they disproportionately affect a group of people on the basis of sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender or sexual orientation. Particular care should be taken to assess the impact on girls, who are at a greater risk of being subject to child sexual abuse.
2023/05/08
Committee: FEMM
Amendment 84 #
Proposal for a regulation
Recital 16 a (new)
(16a) Parental controls that allow parents to access children’s private correspondence without their consent pose a significant risk to children’s privacy, but could also put at risk their safety, in particular in the cases of children who are being abused and who are trying to seek help, and LGBTQI+ children in hostile households. Therefore no provision in this legislation should enable or facilitate intrusions on children’s privacy.
2023/05/08
Committee: FEMM
Amendment 86 #
Proposal for a regulation
Recital 17
(17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect or prevent online child sexual abuse in their services and indicate as part of the risk reporting their willingness and preparedness to eventually be issued a detection order under this Regulation, if deemed necessary by the competent national authority.
2023/05/08
Committee: FEMM
Amendment 89 #
Proposal for a regulation
Recital 19
(19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take certain reasonable measures to assess and mitigate that risk. The providers should make that assessment in a diligent manner, making efforts that are reasonable under the given circumstances, having regard inter alia to the nature and extent of that risk as well as their financial and technological capabilities and size, and cooperating with the providers of the services offered through the software application where possible, and should provide parents with information on the features in applications that present a risk to children, as well as age- and gender-sensitive guidance on how to discuss those risks with children.
2023/05/08
Committee: FEMM
Amendment 91 #
Proposal for a regulation
Recital 19 a (new)
(19a) Regulation (EU) 2022/1925 (the Digital Markets Act) sets out provisions to ensure competition in mobile device ecosystems, which would allow citizens to install software on their mobile devices directly, without using software application stores, bypassing age verification at the level of software application stores. Therefore, manufacturers of operating systems deemed as gatekeepers under the Digital Markets Act should provide an application programming interface through which applications can request age verification, either through the European Digital Identity Wallet as defined in Regulation (EU) No XXX/2023 establishing a framework for a European Digital Identity, or through a third-party service. Manufacturers of operating systems deemed as gatekeepers should also provide a service to process age-verification requests in a manner that respects the privacy of the user and does not store a record of the services they accessed.
2023/05/08
Committee: FEMM
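The age-verification interface envisaged in recital 19a can be illustrated with a minimal sketch. This is not an API defined anywhere in the proposal: the class, method names and credential source below are hypothetical, chosen only to show the privacy property described, namely that the verifier returns a yes/no attestation rather than a birth date and keeps no record of which service asked.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AgeAttestation:
    over_threshold: bool  # yes/no answer only; no birth date is revealed
    threshold: int        # the age limit that was checked, e.g. 13, 16, 18

class AgeVerifier:
    """Hypothetical stand-in for the gatekeeper-provided service; in the
    amendment's scheme it would delegate to the EU Digital Identity Wallet
    or a third-party provider."""

    def __init__(self, birth_year_source):
        # The credential stays inside the verifier; apps never see it.
        self._birth_year_source = birth_year_source

    def check(self, threshold: int, current_year: int) -> AgeAttestation:
        age = current_year - self._birth_year_source()
        # No record of the requesting service is stored; only the boolean
        # outcome for the requested threshold is returned.
        return AgeAttestation(over_threshold=age >= threshold,
                              threshold=threshold)

verifier = AgeVerifier(lambda: 2010)
print(verifier.check(threshold=18, current_year=2023).over_threshold)  # False
```

The design choice the recital implies is that the data-minimising boundary sits in the operating system, so individual applications only ever learn the attestation, never the underlying identity attribute.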
Amendment 93 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards, and targeted only at individuals suspected of child sexual abuse. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services.
2023/05/08
Committee: FEMM
Amendment 96 #
Proposal for a regulation
Recital 22
(22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighed, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the likelihood and seriousness of any potential negative consequences for other parties affected, in particular girls and ethnic and sexual minorities. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
2023/05/08
Committee: FEMM
Amendment 97 #
Proposal for a regulation
Recital 24
(24) The competent judicial authority or the competent independent administrative authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should be in a position to take a well-informed decision on requests for the issuance of detection orders. That is of particular importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach, especially in connection to detection orders concerning the solicitation of children. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU Centre’) and, where so provided in this Regulation, the competent data protection authority designated under Regulation (EU) 2016/679 to provide their views on the measures in question. They should do so as soon as possible, having regard to the important public policy objective at stake and the need to act without undue delay to protect children. In particular, data protection authorities should do their utmost to avoid extending the time period set out in Regulation (EU) 2016/679 for providing their opinions in response to a prior consultation. Furthermore, they should normally be able to provide their opinion as quickly as possible, and well within that time period, in situations where the European Data Protection Board has already issued guidelines regarding the technologies that a provider envisages deploying and operating to execute a detection order addressed to it under this Regulation.
2023/05/08
Committee: FEMM
Amendment 101 #
Proposal for a regulation
Recital 26 a (new)
(26a) Detection of child sexual abuse in end-to-end encrypted communications is only possible by scanning those communications before they leave the abuser’s device; however, this would allow abusers to interfere with the scanning process. Abusers often work in groups, allowing for the rapid proliferation of technology to bypass scanning, rendering such scanning ineffective. Therefore, taking into account the limited efficacy and the negative impact on citizens’ fundamental rights, detection orders should not be applicable to end-to-end encrypted communications.
2023/05/08
Committee: FEMM
Amendment 103 #
Proposal for a regulation
Recital 27
(27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to providers detection technologies that they may choose to use, on a free-of-charge basis, for the sole purpose of executing the detection orders addressed to them. The European Data Protection Board should be consulted on those technologies and the ways in which they should be best deployed to ensure compliance with applicable rules of Union law on the protection of personal data. The advice of the European Data Protection Board should be taken into account by the EU Centre when compiling the lists of available technologies and also by the Commission when preparing guidelines regarding the application of the detection obligations. The providers may operate the technologies made available by the EU Centre or by others or technologies that they developed themselves, as long as they meet the requirements of this Regulation.
2023/05/08
Committee: FEMM
Amendment 106 #
Proposal for a regulation
Recital 30
(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the ongoing investigation and prosecution of specific child sexual abuse offences.
2023/05/08
Committee: FEMM
Amendment 107 #
Proposal for a regulation
Recital 32
(32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited. For legal and practical reasons, it may not be reasonably possible to have those providers remove or disable access to the material, not even through cooperation with the competent authorities of the third country where they are established. Therefore, in line with existing practices in several Member States, it should be possible to require providers of internet access services to take reasonable measures to block the access of users in the Union to the material. However, blocking measures are easily bypassed, and do not prevent access from outside of the Union, meaning victims have to live knowing that abuse material depicting them remains online; therefore, every effort should be made to remove material, even outside of the jurisdiction of the Union, before resorting to blocking.
2023/05/08
Committee: FEMM
Amendment 109 #
Proposal for a regulation
Recital 33
(33) In the interest of consistency, efficiency and effectiveness and to minimise the risk of circumvention, such blocking orders should be based on the list of uniform resource locators, leading to specific items of verified child sexual abuse, compiled and provided centrally by the EU Centre on the basis of diligently verified submissions by the relevant authorities of the Member States. In order to avoid the taking of unjustified or disproportionate measures, especially those that would unduly affect the fundamental rights at stake, notably, in addition to the rights of the children, the users’ freedom of expression and information and the providers’ freedom to conduct a business, appropriate limits and safeguards should be provided for. In particular, it should be ensured that the burdens imposed on the providers of internet access services concerned are not unreasonable, that the need for and proportionality of the blocking orders is diligently assessed also after their issuance and that both the providers and the users affected have effective means of judicial as well as non-judicial redress. Blocking by uniform resource locator is not technically possible, and most blocking is implemented at the level of the web domain or Internet Protocol address, which often results in significant overblocking; therefore, the EU Centre should evaluate the risk and impact of overblocking before making a final decision on blocking.
2023/05/08
Committee: FEMM
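The overblocking concern raised in recital 33 can be made concrete with a small illustrative sketch. The blocklist entries and URLs below are entirely hypothetical; the point is only the structural difference between matching a full URL (which an access provider cannot see for encrypted traffic) and matching a domain (which blocks every resource on that host).

```python
from urllib.parse import urlsplit

# Hypothetical blocklist entries, for illustration only.
URL_BLOCKLIST = {"https://img-host.example/abuse/item123"}
DOMAIN_BLOCKLIST = {"img-host.example"}

def blocked_by_url(url: str) -> bool:
    # Exact-URL matching: with HTTPS an access provider observes only the
    # domain (via DNS/SNI), not the full path, so this check cannot be
    # enforced on encrypted traffic in transit.
    return url in URL_BLOCKLIST

def blocked_by_domain(url: str) -> bool:
    # Domain-level matching: enforceable at the DNS level, but it blocks
    # every resource on the host, legitimate or not (overblocking).
    return urlsplit(url).hostname in DOMAIN_BLOCKLIST

legitimate = "https://img-host.example/holiday/photo1"
print(blocked_by_url(legitimate))     # False: this item is not listed
print(blocked_by_domain(legitimate))  # True: collateral overblocking
```

This is the trade-off the amendment asks the EU Centre to weigh: the granular list is precise but unenforceable in practice, while the enforceable mechanism is coarse.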
Amendment 118 #
Proposal for a regulation
Recital 37
(37) To ensure the efficient management of such victim support functions, victims should be allowed to contact and rely on the Coordinating Authority that is most accessible to them, which should channel all communications between victims and the EU Centre. Coordinating Authorities should provide gender- and age-sensitive support to victims, as well as psychological support. Under no circumstances should victims be blamed for what has happened to them.
2023/05/08
Committee: FEMM
Amendment 128 #
Proposal for a regulation
Recital 61
(61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material) or of different child sexual abuse material (new material), or the solicitation of children, as applicable.
2023/05/08
Committee: FEMM
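The indicator mechanism in recital 61 (detecting material without sharing the material itself) can be sketched in a few lines. Deployed systems rely on perceptual hashes (PhotoDNA-style), which tolerate re-encoding and resizing; the plain SHA-256 digest used here is a simplification that only matches byte-identical files, and the sample indicator list is invented for illustration.

```python
import hashlib

def indicator(content: bytes) -> str:
    # Derive a fixed-size indicator from the material; the indicator can
    # be distributed to providers without distributing the material.
    return hashlib.sha256(content).hexdigest()

# Hypothetical indicator list, standing in for the list the EU Centre
# would compile from material verified by Coordinating Authorities.
KNOWN_INDICATORS = {indicator(b"previously-confirmed sample")}

def matches_known(content: bytes) -> bool:
    # Providers compare indicators of uploads against the central list;
    # only a match, never the underlying material, needs to be reported.
    return indicator(content) in KNOWN_INDICATORS

print(matches_known(b"previously-confirmed sample"))  # True
print(matches_known(b"unrelated upload"))             # False
```

Note that the choice of hash matters for the recital's distinction between 'known' and 'new' material: exact digests can only ever match known items, whereas perceptual hashing and classifiers for new material introduce the false-positive risk discussed in the earlier amendments.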
Amendment 144 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(ba) 'safety assistant' means a tool integrated into interpersonal communications services either voluntarily or following a preventative detection order, and active only for child users of the service, which assists children in learning about, identifying and avoiding risks online, including but not limited to self-generated abuse material and solicitation;
2023/05/08
Committee: FEMM
Amendment 157 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- the integration of tools such as safety assistants to prevent child sexual abuse online;
2023/05/08
Committee: FEMM
Amendment 166 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii a (new)
(iiia) the existing measures to mitigate risks that functionalities of the application will be used for the solicitation of children, or for the sharing of abuse material, including but not limited to safety assistants, and safe defaults for visibility and reachability of children on the platform;
2023/05/08
Committee: FEMM
Amendment 172 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services where there is substantial evidence that their service is routinely or systematically used for the purpose of online child sexual abuse shall take proportionate and effective mitigation measures, tailored to the serious risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:
2023/05/08
Committee: FEMM
Amendment 173 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions, insofar as any changes fully respect the security and technical integrity of the service in order to protect the right to confidential communications of users that are not suspected of online child sexual abuse, and do not amount to general monitoring or indiscriminate data retention;
2023/05/08
Committee: FEMM
Amendment 174 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) providing technical measures and tools that allow users, and in particular children, to manage their own privacy, visibility, reachability and safety, and that are set to the most secure levels by default;
2023/05/08
Committee: FEMM
Amendment 175 #
Proposal for a regulation
Article 4 – paragraph 1 – point a b (new)
(ab) informing users, keeping in mind children’s needs, about external resources and services in the user’s region on preventing child sexual abuse, counselling by help-lines, information on victim support and educational resources provided by hotlines and child protection organisations;
2023/05/08
Committee: FEMM
Amendment 176 #
Proposal for a regulation
Article 4 – paragraph 1 – point a c (new)
(ac) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line;
2023/05/08
Committee: FEMM
Amendment 177 #
Proposal for a regulation
Article 4 – paragraph 1 – point a d (new)
(ad) automatic mechanisms and interface design elements to inform users about external preventive intervention programmes, without prejudice to the prohibition of profiling under Article 22 GDPR and the processing of sensitive data under Article 9 GDPR;
2023/05/08
Committee: FEMM
Amendment 178 #
Proposal for a regulation
Article 4 – paragraph 1 – point b
(b) adapting the provider’s internal processes or the internal supervision of the functioning of the service;
2023/05/08
Committee: FEMM
Amendment 179 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of hosting services or providers of interpersonal communication services, public authorities, civil society organisations, hotlines or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] .
2023/05/08
Committee: FEMM
Amendment 180 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) Providing users of interpersonal communications services, in particular children, with tools to help them learn about, identify and avoid online risks, in particular through the integration of safety assistants.
2023/05/08
Committee: FEMM
Amendment 181 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) without breaking, weakening, circumventing or otherwise undermining end-to-end encryption, in the sense of people’s right to confidential communications;
2023/05/08
Committee: FEMM
Amendment 182 #
Proposal for a regulation
Article 4 – paragraph 2 – point a
(a) effective and proportionate in mitigating the identified serious risk;
2023/05/08
Committee: FEMM
Amendment 183 #
Proposal for a regulation
Article 4 – paragraph 2 – point a a (new)
(aa) subject to an implementation plan with clear objectives and methodologies for identifying and quantifying impacts on the identified serious risk and on the exercise of the fundamental rights of all affected parties. The implementation plan shall be reviewed every six months.
2023/05/08
Committee: FEMM
Amendment 185 #
Proposal for a regulation
Article 4 – paragraph 2 – point b
(b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk , any impact on the functionality of the service as well as the provider’s financial and technological capabilities and the number of users;
2023/05/08
Committee: FEMM
Amendment 186 #
Proposal for a regulation
Article 4 – paragraph 2 – point c
(c) applied in a diligent and non-discriminatory manner, assessing, in all circumstances, the potential consequences of the specific measures for the exercise of fundamental rights of all parties affected;
2023/05/08
Committee: FEMM
Amendment 187 #
Proposal for a regulation
Article 4 – paragraph 2 – point d a (new)
(da) only introduced following an assessment of the risks the mitigating measures themselves pose for users, in particular if these risks would disproportionately negatively affect persons on the basis of sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender or sexual orientation;
2023/05/08
Committee: FEMM
Amendment 188 #
Proposal for a regulation
Article 4 – paragraph 2 – point d b (new)
(db) developed in cooperation with children who use the service;
2023/05/08
Committee: FEMM
Amendment 189 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures.
deleted
2023/05/08
Committee: FEMM
Amendment 190 #
Proposal for a regulation
Article 4 – paragraph 3 a (new)
3a. Any requirement to take specific measures shall be without prejudice to Article 8 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] and shall entail neither a general obligation for hosting services providers to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.
2023/05/08
Committee: FEMM
Amendment 191 #
Proposal for a regulation
Article 4 – paragraph 4
4. Where appropriate, providers of hosting services and providers of interpersonal communications services shall clearly describe in their terms of service, keeping in mind different possibilities of interpretation, the mitigation measures that they have taken. That description shall not include information that is likely to significantly reduce the effectiveness of the mitigation measures.
2023/05/08
Committee: FEMM
Amendment 192 #
Proposal for a regulation
Article 4 – paragraph 4 a (new)
4a. Specific measures for platforms primarily used for the dissemination of pornographic content
Where an online platform is primarily used for the dissemination of user generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure
a. user-friendly reporting mechanisms to report alleged child sexual abuse material;
b. adequate professional human content moderation to rapidly process notices of alleged child sexual abuse material;
c. automatic mechanisms and interface design elements to inform users about external preventive intervention programmes in the user’s region.
2023/05/08
Committee: FEMM
Amendment 193 #
Proposal for a regulation
Article 4 – paragraph 4 b (new)
4b. Specific measures for number-independent interpersonal communications services within games
Providers of online games that operate a number-independent interpersonal communications service within their games shall take the necessary technical and organisational measures
a) preventing users from initiating unsolicited contact with other users;
b) facilitating user-friendly reporting of alleged child sexual abuse material;
c) providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, and that are set to the most secure levels by default;
d) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line.
2023/05/08
Committee: FEMM
Amendment 195 #
Proposal for a regulation
Article 4 – paragraph 5
5. The Commission, in cooperation with Coordinating Authorities , the European Data Protection Board and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 42, having due regard in particular to relevant technological developments, trends reported by authorities, civil society organisations and victim support organisations and in the manners in which the services covered by those provisions are offered and used.
2023/05/08
Committee: FEMM
Amendment 196 #
Proposal for a regulation
Article 4 – paragraph 5 a (new)
5a. To complement the risk mitigation measures taken by the providers, gender- sensitive and child-friendly education and prevention measures shall be implemented.
2023/05/08
Committee: FEMM
Amendment 197 #
Proposal for a regulation
Article 6
Obligations for software application stores
1. Providers of software application stores shall:
(a) make reasonable efforts to assess, where possible together with the providers of software applications, whether each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children;
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).
2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3.
3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures.
4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
Article 6 deleted
2023/05/08
Committee: FEMM
Amendment 198 #
Proposal for a regulation
Article 6 – paragraph 1 – point a
(a) make reasonable efforts to assess, where possible together with the providers of software applications, wheensure that software applications can only make available on their platform software applications if prior to the use of their each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children;they have obtained the contact details of the provider of software application developing team, without prejudice to Open Source Software, where this may not be possible
2023/05/08
Committee: FEMM
Amendment 199 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;deleted
2023/05/08
Committee: FEMM
Amendment 201 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).deleted
2023/05/08
Committee: FEMM
Amendment 202 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
1a. Security of communications and services Nothing in this regulation shall be construed as requiring or encouraging the prohibition, restriction, circumvention or undermining of the provision or the use of encrypted services.
2023/05/08
Committee: FEMM
Amendment 203 #
Proposal for a regulation
Article 6 – paragraph 2
2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3.deleted
2023/05/08
Committee: FEMM
Amendment 204 #
Proposal for a regulation
Article 6 – paragraph 3
3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures.deleted
2023/05/08
Committee: FEMM
Amendment 205 #
Proposal for a regulation
Article 6 – paragraph 4
4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.deleted
2023/05/08
Committee: FEMM
Amendment 207 #
Proposal for a regulation
Article 6 a (new)
Article 6a
Obligations concerning age verification and for software application stores
1. Providers of software application stores considered as gatekeepers under the Digital Markets Act (EU) 2022/1925 shall:
(a) indicate if applications contain features that could pose a risk to children;
(b) indicate if measures have been taken to mitigate risks for children, and which measures have been taken;
(c) provide guidance for parents on how to discuss risks with their children;
(d) provide application developers with an open-source software library that enables age verification requests from inside applications both to European Digital Identity Wallets and third-party services;
(e) provide, free of charge, an age-verification service that can respond to age verification requests from inside applications.
2. Providers of European Digital Identity Wallets under the Regulation (EU) No XXX/2023 establishing a framework for a European Digital Identity shall ensure European Digital Identity Wallets can respond to age verification requests from applications without revealing the identity of the user.
3. Third-party age verification services used to fulfil the obligations of this article shall:
(a) only retain user personal data for the purpose of fulfilling future requests, and with the explicit consent of the user;
(b) only retain data vital to process future verification requests, namely:
i. a pseudonymous means of authenticating the user; and
ii. the user's previously verified date of birth;
(c) only use this data for the purpose of age verification;
(d) fulfil requests for the deletion of this data pursuant to the GDPR.
4. Where developers of applications have identified a significant risk of use of the service concerned for the purpose of the solicitation of children, they shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to put in place safeguards, namely:
(a) take reasonable measures to mitigate the risk, such as adapting the services to children, integrating a safety assistant or modifying or adding safeguards limiting access to certain features;
(b) provide children with guidance on risks that will help them identify dangers and make more informed decisions;
(c) where the application is manifestly unsuitable for children and cannot be adapted, prevent access.
5. Age verification mechanisms set out in this article shall not be used for the purposes of enabling or facilitating parental control technologies that give access to children’s private communications without their consent.
2023/05/08
Committee: FEMM
Amendment 209 #
Proposal for a regulation
Article 7 – title
7 Issuance of detecinvestigation orders
2023/05/08
Committee: FEMM
Amendment 210 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent independent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detecto issue an investigation order requiring a provider of hosting services or a provider of publicly available number-independent interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detectassist in investigations of a suspected specific person, specific group of people, or a specific incident related to online child sexual abuse on a specific service.
2023/05/08
Committee: FEMM
Amendment 211 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
1a. The Coordinating Authority of establishment shall choose one of the following types of detection order:
(a) proactive detection orders, which detect and report known child sexual abuse material under the measures specified in Article 10;
(b) preventative detection orders, which detect solicitation and attempts by children to share self-generated abuse material, and assist them in avoiding risks, under the measures specified in Article 10.
2023/05/08
Committee: FEMM
Amendment 212 #
Proposal for a regulation
Article 7 – paragraph 2 – subparagraph 1
The Coordinating Authority of establishment shall, before requesting the issuance of a detecn investigation order, carry out the investigations and assessments necessary to determine whether the conditions of paragraph 4 have been met.
2023/05/08
Committee: FEMM
Amendment 213 #
Proposal for a regulation
Article 7 – paragraph 2 – subparagraph 2
To that end, it may, where appropriate, require the provider to submit the necessary information, additional to the report and the further information referred to in Article 5(1) and (3), respectively, within a reasonable time period set by that Coordinating Authority, or request the EU Centre, another public authority or relevant experts or entities to provide the necessary additional information.
2023/05/08
Committee: FEMM
Amendment 214 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point a
(a) establish a draft request for the issuance of a detecn investigation order, specifying the factual and legal grounds upon which the request is based, the main elements of the content of the detecinvestigation order it intends to request and the reasons for requesting it;
2023/05/08
Committee: FEMM
Amendment 215 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point c
(c) afford the provider an opportunity to comment on the draft request, within a reasonable time period set by that Coordinating Authority;deleted
2023/05/08
Committee: FEMM
Amendment 216 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – introductory part
Where, having regard to the comments of the provider and the opinion of the EU Centre, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 havare met, it shall re-submit the draft request, adjusted where appropriate, to the provider. In that case, the provider shallquest the judicial validation of the inquiry/investigation order from the competent judicial authority responsible for the issuing of such orders pursuant to paragraph 4. Upon receipt of judicial validation of the order, the Coordinating Authority shall submit the order, adjusted where appropriate, to the provider. Prior to requesting the judicial validation of the investigation order, the Coordinating Authority shall request the provider to do all of the following, within a reasonable time period set by that Coordinating Authority:
2023/05/08
Committee: FEMM
Amendment 217 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point a
(a) draft an implementation plan setting out the incident that the authority intends to investigate, the measures it envisages taking to execute the intended detecinvestigation order, including detailed information regarding the envisaged technologies and safeguards;
2023/05/08
Committee: FEMM
Amendment 218 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point b
(b) where the draft implementation plan concerns an intended detection order concerning the solicitation of children other than the renewal of a previously issued detection order without any substantive changes, conduct a data protection impact assessment and a prior consultation procedure as referred to in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to the measures set out in the implementation plan;
2023/05/08
Committee: FEMM
Amendment 221 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point c
(c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the data protection impact assessment and in order to take intoutmost account of the opinion of the data protection authority provided in response to the prior consultation;
2023/05/08
Committee: FEMM
Amendment 222 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point d
(d) submit to that Coordinating Authority the implementation plan, where applicable attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted in viewto take full account of the outcome of the data protection impact assessment and of that opinion.
2023/05/08
Committee: FEMM
Amendment 224 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – introductory part
TBased on reasoned justification, the Coordinating Authority of establishment shall request the issuance of the detecinvestigation order, and the competent judicial authority or independent administrative authority shall issue the detecinvestigation order where it considers that the following conditions are met:
2023/05/08
Committee: FEMM
Amendment 225 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a
(a) there is evidence of a signpecificant risk of the service being used by one or more specific suspects for the purpose of online child sexual abuse, within the meaning of paragraphs 5, 6 and 7, as applicable;
2023/05/08
Committee: FEMM
Amendment 226 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b
(b) the reasons for issuing the detecinvestigation order outweighare necessary and proportionate and minimise negative consequences for the rights and legitimate interests of all parties affected, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties.to protect the rights to privacy, data protection, free expression and access to information of users that are not suspects of online child sexual abuse, including child users
2023/05/08
Committee: FEMM
Amendment 227 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 a (new)
(c) nothing in the investigation order can be construed as requiring or encouraging the provider to weaken, break, circumvent or otherwise undermine or limit the encryption, security, or other means of protecting the confidentiality of communications, of the platform or service of the provider as a whole.
2023/05/08
Committee: FEMM
Amendment 228 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2
When assessing whether the conditions of the first subparagraph have been met, account shall be taken of all relevant facts and circumstances of the case at hand, in particular:
(a) the risk assessment conducted or updated and any mitigation measures taken by the provider pursuant to Articles 3 and 4, including any mitigation measures introduced, reviewed, discontinued or expanded pursuant to Article 5(4) where applicable;
(b) any additional information obtained pursuant to paragraph 2 or any other relevant information available to it, in particular regarding the use, design and operation of the service, regarding the provider’s financial and technological capabilities and size and regarding the potential consequences of the measures to be taken to execute the detection order for all other parties affected;
(c) the views and the implementation plan of the provider submitted in accordance with paragraph 3;
(d) the opinions of the EU Centre and of the data protection authority submitted in accordance with paragraph 3.deleted
2023/05/08
Committee: FEMM
Amendment 229 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point a
(a) the risk assessment conducted or updated and any mitigation measures taken by the provider pursuant to Articles 3 and 4, including any mitigation measures introduced, reviewed, discontinued or expanded pursuant to Article 5(4) where applicable;deleted
2023/05/08
Committee: FEMM
Amendment 230 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point b
(b) any additional information obtained pursuant to paragraph 2 or any other relevant information available to it, in particular regarding the use, design and operation of the service, regarding the provider’s financial and technological capabilities and size and regarding the potential consequences of the measures to be taken to execute the detection order for all other parties affected;deleted
2023/05/08
Committee: FEMM
Amendment 231 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point c
(c) the views and the implementation plan of the provider submitted in accordance with paragraph 3;deleted
2023/05/08
Committee: FEMM
Amendment 232 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point d
(d) the opinions of the EU Centre and of the data protection authority submitted in accordance with paragraph 3.deleted
2023/05/08
Committee: FEMM
Amendment 233 #
Proposal for a regulation
Article 7 – paragraph 5 – introductory part
5. As regards detecinvestigation orders concerning the dissemination of known child sexual abuse material, the signpecificant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met:
2023/05/08
Committee: FEMM
Amendment 234 #
Proposal for a regulation
Article 7 – paragraph 5 – point a
(a) it is likely, despite any mitigation measures that the provider may have taken or will take, that the service is used, to an appreciable extentbeing used by the suspect or suspects of child sexual abuse for the dissemination of known child sexual abuse material;
2023/05/08
Committee: FEMM
Amendment 235 #
Proposal for a regulation
Article 7 – paragraph 5 – point b
(b) there is evidence of the service, or of a comparable service if the service has not yet having been offerused in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extentpast 12 months by one or more suspects of child sexual abuse for the dissemination of known child sexual abuse material.
2023/05/08
Committee: FEMM
Amendment 236 #
Proposal for a regulation
Article 7 – paragraph 6
6. As regards detection orders concerning the dissemination of new child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met:
(a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the dissemination of new child sexual abuse material;
(b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the dissemination of new child sexual abuse material;
(c) for services other than those enabling the live transmission of pornographic performances as defined in Article 2, point (e), of Directive 2011/93/EU:
(1) a detection order concerning the dissemination of known child sexual abuse material has been issued in respect of the service;
(2) the provider submitted a significant number of reports concerning known child sexual abuse material, detected through the measures taken to execute the detection order referred to in point (1), pursuant to Article 12.deleted
2023/05/08
Committee: FEMM
Amendment 237 #
Proposal for a regulation
Article 7 – paragraph 6 – introductory part
6. As regards detecinvestigation orders concerning the dissemination of new child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met:
2023/05/08
Committee: FEMM
Amendment 238 #
Proposal for a regulation
Article 7 – paragraph 6 – point a
(a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extentby one or more suspects of child sexual abuse, for the dissemination of new child sexual abuse material;
2023/05/08
Committee: FEMM
Amendment 239 #
Proposal for a regulation
Article 7 – paragraph 6 – point b
(b) there is evidence of the service, or of a comparable service if the service has not yethaving been offerused in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extentpast 12 months by one or more suspects of child sexual abuse , for the dissemination of new child sexual abuse material;
2023/05/08
Committee: FEMM
Amendment 240 #
Proposal for a regulation
Article 7 – paragraph 6 – point c – point 1
(1) a detecinvestigation order concerning the dissemination of known child sexual abuse material has been issued in respect of the service;
2023/05/08
Committee: FEMM
Amendment 241 #
Proposal for a regulation
Article 7 – paragraph 6 – point c – point 2
(2) the provider submitted a significant number of reports concerning known child sexual abuse material, detected through the measures taken to execute the detection order referred to in point (1), pursuant to Article 12.deleted
2023/05/08
Committee: FEMM
Amendment 242 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1 – introductory part
As regards preventative detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met:
2023/05/08
Committee: FEMM
Amendment 243 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1 – introductory part
As regards detecinvestigation orders concerning the solicitation of children, the signpecificant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met:
2023/05/08
Committee: FEMM
Amendment 244 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1 – point a
(a) the provider qualifies as a provider of publicly available number-independent interpersonal communication services;
2023/05/08
Committee: FEMM
Amendment 245 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1 – point b
(b) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extentby one or more suspects of child sexual abuse, for the solicitation of children;
2023/05/08
Committee: FEMM
Amendment 246 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1 – point c
(c) there is evidence of the service, or of a comparable service if the service has not yethaving been offerused in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extentpast 12 months by one or more suspects of child sexual abuse, for the solicitation of children.
2023/05/08
Committee: FEMM
Amendment 247 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 2
The detection orders concerning the solicitation of children shall apply only to interpersonal communications where one of the users is a child user.deleted
2023/05/08
Committee: FEMM
Amendment 248 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the judicial validation and the issuance of detecinvestigation orders, and the competent judicial or independent administrative authority when issuing the detecinvestigation order, shall target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary to effectively address the significant risk referred to in point (a) thereofand proportionate to obtain the information required to effectively investigate the case , and collect the information required to assess the existence of a criminal case.
2023/05/08
Committee: FEMM
Amendment 249 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 2
To that aimend, they shall take into account all relevant parameters, including the availability of sufficiently reliable detectioninvestigative technologies in that they limit to the maximum extent possible the rate of errors regarding the detecinvestigation and their suitability and effectiveness for achieving the objectives of this Regulation, as well as the impact of the measures on the rights of the users affected, and require the taking of the least intrusive measures, in accordance with Article 10, from among several equally effective measures.
2023/05/08
Committee: FEMM
Amendment 250 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 – point a
(a) where that riske suspicion of online child sexual abuse by one or more specific individuals is limited to an identifiable part or component of a service, the required measures are only applied in respect of that part or component;
2023/05/08
Committee: FEMM
Amendment 251 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 – point b
(b) where necessary, in particular to limit such negative consequences, effective and proportionate safeguards additional to those listed in Article 10(4), (5) and (6) are provided for;
2023/05/08
Committee: FEMM
Amendment 252 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 a (new)
(d) nothing in the investigation order can be construed as requiring or encouraging the provider to weaken, break, circumvent or otherwise undermine or limit the encryption, security, or other means of protecting the confidentiality of communications, of the platform or service of the provider.
2023/05/08
Committee: FEMM
Amendment 253 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 1
The competent judicial authority or independent administrative authority shall specify in the detecinvestigation order the period during which it applies, indicating the start date and the end date.
2023/05/08
Committee: FEMM
Amendment 254 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 2
The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the detecinvestigation order. It shall not be earlier than three months from the date at which the provider received the detecinvestigation order and not be later than 12 months from that date.
2023/05/08
Committee: FEMM
Amendment 255 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3
The period of application of detection orders concerning the dissemination of known or new child sexual abuse material shall not exceed 24 months and that of detection orders concerning the solicitation of children shall not exceed 12 monthsinvestigation orders shall be proportionate, taking all relevant factors into account.
2023/05/08
Committee: FEMM
Amendment 256 #
Proposal for a regulation
Article 8 – title
8 Additional rules regarding detecinvestigation orders
2023/05/08
Committee: FEMM
Amendment 257 #
Proposal for a regulation
Article 8 – title
8 Additional rules regarding detecinvestigation orders
2023/05/08
Committee: FEMM
Amendment 258 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the detecinvestigation orders referred to in Article 7 using the template set out in Annex I. DetecInvestigation orders shall include:
2023/05/08
Committee: FEMM
Amendment 259 #
Proposal for a regulation
Article 8 – paragraph 1 – point a
(a) information regarding the measures to be taken to execute the detecinvestigation order, including on the suspect or group of suspects or incident concerned, the temporal scope, the indicators to be used and the safeguards to be provided for, including the reporting requirements set pursuant to Article 9(3) and, where applicable, any additional safeguards as referred to in Article 7(8); All rights with respect to being a suspect under EU law must be upheld.
2023/05/08
Committee: FEMM
Amendment 260 #
Proposal for a regulation
Article 8 – paragraph 1 – point b
(b) identification details of the competent judicial authority or the independent administrative authority issuing the detecinvestigation order and authentication of the detecat investigation order by that judicial or independent administrative authority;
2023/05/08
Committee: FEMM
Amendment 261 #
Proposal for a regulation
Article 8 – paragraph 1 – point d
(d) the specific service in respect of which the detecinvestigation order is issued and, where applicable, the part or component of the service affected as referred to in Article 7(8);
2023/05/08
Committee: FEMM
Amendment 262 #
Proposal for a regulation
Article 8 – paragraph 1 – point d a (new)
(da) the type of detection order;
2023/05/08
Committee: FEMM
Amendment 263 #
Proposal for a regulation
Article 8 – paragraph 1 – point e
(e) whether the investigation order issued concerns the possible dissemination of known or previously unknown child sexual abuse material or the solicitation of children;
2023/05/08
Committee: FEMM
Amendment 264 #
Proposal for a regulation
Article 8 – paragraph 1 – point f
(f) the start date and the end date of the investigation order;
2023/05/08
Committee: FEMM
Amendment 265 #
Proposal for a regulation
Article 8 – paragraph 1 – point g
(g) a sufficiently detailed statement of reasons explaining why the investigation order is issued;
2023/05/08
Committee: FEMM
Amendment 266 #
Proposal for a regulation
Article 8 – paragraph 1 – point h
(h) the factual and legal grounds justifying the issuing of the order, and a reference to this Regulation as the legal basis for the investigation order;
2023/05/08
Committee: FEMM
Amendment 267 #
Proposal for a regulation
Article 8 – paragraph 1 – point i
(i) the date, time stamp and electronic signature of the judicial or independent administrative authority issuing the investigation order;
2023/05/08
Committee: FEMM
Amendment 268 #
Proposal for a regulation
Article 8 – paragraph 1 – point j
(j) easily understandable information about the redress available to the addressee of the investigation order, including information about redress to a court and about the time periods applicable to such redress.
2023/05/08
Committee: FEMM
Amendment 269 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1
The competent judicial authority or independent administrative authority issuing the investigation order shall address it to the main establishment of the provider or, where applicable, to its legal representative designated in accordance with Article 24.
2023/05/08
Committee: FEMM
Amendment 270 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 2
The investigation order shall be securely transmitted to the provider’s point of contact referred to in Article 23(1), to the Coordinating Authority of establishment and to the EU Centre, through the system established in accordance with Article 39(2).
2023/05/08
Committee: FEMM
Amendment 271 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 3
The investigation order shall be drafted in the language declared by the provider pursuant to Article 23(3).
2023/05/08
Committee: FEMM
Amendment 272 #
Proposal for a regulation
Article 8 – paragraph 3
3. If the provider cannot execute the investigation order because it contains errors, or it appears unnecessary or disproportionate, in particular with regard to the rights and freedoms of persons not reasonably suspected of online child sexual abuse, or does not contain sufficient information for its execution, the provider shall, without undue delay, request the necessary correction or clarification from the Coordinating Authority of establishment, using the template set out in Annex II.
2023/05/08
Committee: FEMM
Amendment 273 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4a. Scope of preservation orders
1. Investigation orders may require the expedited preservation by the provider, insofar as the data is under its control and accessible without the need to alter the design or operation of its platform or service, of one or more of the following, including new data generated after issuance of the order, as part of a planned or current law enforcement operation, with respect for the principle of data minimisation and without prejudice to requirements to comply with laws on data retention:
(a) Traffic data:
(i) pseudonyms, screen names or other identifiers used by the subject(s) of the investigation;
(ii) network identifiers, such as IP addresses, port numbers or MAC addresses used by, or associated with, the subject(s) of the investigation;
(iii) any other traffic data, including metadata, of any activity linked to the subject(s) of the investigation;
(b) Content data:
(i) copies of any data uploaded, downloaded or otherwise communicated by the subject(s) of the investigation.
2. Access to the data shall be made available to law enforcement authorities on the basis of the national law of the country of establishment of the provider.
3. The provider shall inform all users concerned of the investigation order, unless the issuing authority instructs it, on the basis of a reasoned opinion, not to do so.
4. Investigation orders may require providers to provide support for law enforcement authorities. Such support shall be technically feasible, clearly defined, subject to specific judicial oversight, necessary and proportionate, and shall respect the fundamental rights of all parties involved.
5. Such orders shall be strictly time-limited, and all data shall be deleted by the provider as soon as it has been successfully transferred to the relevant law enforcement agency, the investigation is closed, or the subjects of the order are deemed to be no longer of interest.
2023/05/08
Committee: FEMM
Amendment 274 #
Proposal for a regulation
Article 8 – paragraph 4 b (new)
4b. Notification mechanism
1. Providers of hosting services and providers of interpersonal communication services shall establish mechanisms that allow users to notify them of the presence on their service of specific items or activities that the user considers to be potential child sexual abuse material, in particular previously unknown child sexual abuse material and solicitation of children. Those mechanisms shall be easy to access, user-friendly and child-friendly, and shall allow for the submission of notices exclusively by electronic means.
2. Where the notice contains the electronic contact information of the user who submitted it, the provider shall without undue delay send a confirmation of receipt to the user.
3. Providers shall ensure that such notices are processed without undue delay.
2023/05/08
Committee: FEMM
Amendment 275 #
Proposal for a regulation
Article 9 – title
9 Redress, information, reporting and modification of investigation orders
2023/05/08
Committee: FEMM
Amendment 276 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services that have received an investigation order, as well as the suspect(s) affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the investigation order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the investigation order.
2023/05/08
Committee: FEMM
Amendment 277 #
Proposal for a regulation
Article 9 – paragraph 2 – subparagraph 1
When the investigation order becomes final, the competent judicial authority or independent administrative authority that issued the investigation order shall, without undue delay, transmit a copy thereof to the Coordinating Authority of establishment. The Coordinating Authority of establishment shall then, without undue delay, transmit a copy thereof to all other Coordinating Authorities through the system established in accordance with Article 39(2).
2023/05/08
Committee: FEMM
Amendment 278 #
Proposal for a regulation
Article 9 – paragraph 2 – subparagraph 2
For the purpose of the first subparagraph, an investigation order shall become final upon the expiry of the time period for appeal where no appeal has been lodged in accordance with national law or upon confirmation of the investigation order following an appeal.
2023/05/08
Committee: FEMM
Amendment 279 #
Proposal for a regulation
Article 9 – paragraph 3 – subparagraph 1
Where the period of application of the investigation order exceeds 12 months, or six months in the case of an investigation order concerning the solicitation of children, the Coordinating Authority of establishment shall require the provider to report to it on the execution of the investigation order at least once, halfway through the period of application.
2023/05/08
Committee: FEMM
Amendment 280 #
Proposal for a regulation
Article 9 – paragraph 3 – subparagraph 2
Those reports shall include a detailed description of the measures taken to execute the investigation order, including the safeguards provided, and information on the functioning in practice of those measures, in particular on their effectiveness in detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, and on the consequences of those measures for the rights and legitimate interests of all parties affected.
2023/05/08
Committee: FEMM
Amendment 280 #
Proposal for a regulation
The European Parliament rejects the Commission proposal (COM(2022)0209).
2023/07/28
Committee: LIBE
Amendment 281 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 1
In respect of the investigation orders that the competent judicial authority or independent administrative authority issued at its request, the Coordinating Authority of establishment shall, where necessary and in any event following reception of the reports referred to in paragraph 3, assess whether any substantial changes to the grounds for issuing the investigation orders occurred and, in particular, whether the conditions of Article 7(4) continue to be met. In that regard, it shall take account of additional mitigation measures that the provider may take to address the specific risk identified at the time of the issuance of the investigation order.
2023/05/08
Committee: FEMM
Amendment 282 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 2
That Coordinating Authority shall request to the competent judicial authority or independent administrative authority that issued the investigation order the modification or revocation of such order, where necessary in the light of the outcome of that assessment. The provisions of this Section shall apply to such requests, mutatis mutandis.
2023/05/08
Committee: FEMM
Amendment 283 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communication services that are not end-to-end encrypted, and that have received a proactive detection order, shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
2023/05/08
Committee: FEMM
Amendment 284 #
Proposal for a regulation
Article 10 – paragraph 1 a (new)
1a. Providers of hosting services and providers of interpersonal communication services that have received a preventative detection order shall execute it by integrating technologies into the software used to access their services in order to:
(a) detect when children attempt to use their services to send intimate images, and provide:
(i) guidance on the risks of sharing intimate images, in particular with strangers, and a design that discourages sharing;
(ii) a disclaimer about the potential illegality of sharing intimate images, even with partners, when they are under 18;
(iii) specific measures to reduce the likelihood that images will be reshared, such as preventing images from leaving the software application, disallowing forwarding of images, and preventing screenshots;
(iv) a “help” button, displayed prominently on any intimate images sent, which the sender can use to retract the image, to seek help and advice, or to request that the image hash be sent to a takedown service;
(b) detect potential attempted solicitation of children using their services, and provide:
(i) an age-appropriate warning about the conversation, strongly advising against continuing the conversation and, in particular, against sharing photos and personal information, and encouraging the child user to speak to a trusted adult;
(ii) guidance for trusted adults in discussing the attempted solicitation, with an emphasis on building a relationship of trust between the child and the trusted adult;
(iii) where the child does not feel comfortable sharing the conversation with a trusted adult, an option to ask moderators of the service for their advice;
(iv) an option to block or report the user.
2023/05/08
Committee: FEMM
Amendment 285 #
Proposal for a regulation
Article 10 – paragraph 1 b (new)
1b. Technologies used in preventative detection orders to detect grooming shall only report a detection in cases where the potential victim, trusted adult, or moderator explicitly chooses to do so. Where end-to-end encryption is used, the detection should be done entirely on the users’ device.
2023/05/08
Committee: FEMM
Amendment 286 #
Proposal for a regulation
Article 10 – paragraph 1 c (new)
1c. Technologies used in preventative detection orders to detect when children attempt to use their services to send intimate images shall not report these users in any way. Where end-to-end encryption is used, the detection should be done entirely on the users’ device.
2023/05/08
Committee: FEMM
Amendment 287 #
Proposal for a regulation
Article 10 – paragraph 1 d (new)
1d. The Coordinating Authority shall be empowered to request that services take further preventative measures, so long as those measures do not involve reporting, and only after approval by the relevant Data Protection Authority.
2023/05/08
Committee: FEMM
Amendment 290 #
Proposal for a regulation
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of publicly available number-independent interpersonal communications services has actual knowledge of alleged online child sexual abuse on its services in any manner other than through a removal order issued in accordance with this Regulation, it shall promptly submit a report thereon, using state-of-the-art encryption, to the EU Centre in accordance with Article 13 and shall expeditiously disable access to such content, and remove such content once the EU Centre confirms this will not prejudice an ongoing investigation. It shall do so through the system established in accordance with Article 39(2).
2023/05/08
Committee: FEMM
Amendment 291 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 1
Where the provider submits a report pursuant to paragraph 1, it shall request authorisation from the EU Centre to inform the user concerned, where the Centre shall reply without undue delay, at maximum within two days. The notification to the user shall include information on the main content of the report, on the manner in which the provider has become aware of the alleged child sexual abuse concerned, on the follow-up given to the report insofar as such information is available to the provider and on the user’s possibilities of redress, including on the right to submit complaints to the Coordinating Authority in accordance with Article 34.
2023/05/08
Committee: FEMM
Amendment 292 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 2
The provider shall establish and operate an accessible and user-friendly mechanism with age-appropriate options that allows users to flag, anonymously if preferred, to the provider potential online child sexual abuse or potential solicitation of children on the service.
2023/05/08
Committee: FEMM
Amendment 293 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 3
Where within the three months’ time period referred to in the second subparagraph the provider receives such a communication from the EU Centre indicating that the information is not to be provided, it shall inform the user concerned, without undue delay, after the expiry of the time period set out in that communication.deleted
2023/05/08
Committee: FEMM
Amendment 294 #
Proposal for a regulation
Article 12 – paragraph 3
3. The provider shall establish and operate an accessible and user-friendly mechanism with age-appropriate options that allows users to flag, anonymously if preferred, to the provider potential online child sexual abuse or potential solicitation of children on the service.
2023/05/08
Committee: FEMM
Amendment 297 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of hosting services and providers of publicly available number- independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
2023/05/08
Committee: FEMM
Amendment 298 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) encrypted versions of all child sexual abuse material being reported;
2023/05/08
Committee: FEMM
Amendment 299 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) a list of all available data other than content data related to the potential online child sexual abuse, preserved in line with the preservation order in Article 8a;
2023/05/08
Committee: FEMM
Amendment 300 #
Proposal for a regulation
Article 13 – paragraph 1 – point d a (new)
(da) a list of all traffic data and metadata retained in relation to the potential online child sexual abuse, which could be made available to law enforcement authorities, together with information concerning default storage periods.
2023/05/08
Committee: FEMM
Amendment 301 #
Proposal for a regulation
Article 13 – paragraph 1 – point f
(f) information concerning the geographic location related to the potential online child sexual abuse, such as the Internet Protocol address;deleted
2023/05/08
Committee: FEMM
Amendment 302 #
Proposal for a regulation
Article 13 – paragraph 1 – point g
(g) information concerning the identity of any user involved in the potential online child sexual abuse;deleted
2023/05/08
Committee: FEMM
Amendment 303 #
Proposal for a regulation
Article 13 – paragraph 1 – point i
(i) where the alleged online child sexual abuse concerns the dissemination of known or previously unknown child sexual abuse material, whether the provider has removed or disabled access to the material;
2023/05/08
Committee: FEMM
Amendment 304 #
Proposal for a regulation
Article 13 – paragraph 1 – point j
(j) an indication whether the provider considers that the report requires urgent action;
2023/05/08
Committee: FEMM
Amendment 305 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it to issue a removal order requiring a provider of hosting services or publicly available number-independent interpersonal communications services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States of one or more specific items of material that, after a diligent assessment, the competent judicial authority identified as constituting child sexual abuse material.
2023/05/08
Committee: FEMM
Amendment 306 #
Proposal for a regulation
Article 14 – paragraph 1 a (new)
1a. Before requesting a removal order, the Coordinating Authority of establishment and competent judicial authority shall take all reasonable steps to ensure that implementing the order will not interfere with activities for the investigation and prosecution of child sexual abuse offences.
2023/05/08
Committee: FEMM
Amendment 307 #
Proposal for a regulation
Article 14 – paragraph 1 b (new)
1b. Removal orders shall be issued by judicial authorities in line with Article 9 on Orders to act against illegal content of the Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC.
2023/05/08
Committee: FEMM
Amendment 308 #
Proposal for a regulation
Article 14 – paragraph 3 – introductory part
3. The competent judicial authority or the independent administrative authority shall issue a removal order using the template set out in Annex IV. Removal orders shall include:
2023/05/08
Committee: FEMM
Amendment 308 #
Proposal for a regulation
Recital 5
(5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting, are equally at risk of misuse, they should also be covered by this Regulation. Online search engines and other artificial intelligence services should also be covered. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and the varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate and targeted manner. Considering the fundamental importance of the right to respect for private life and the right to protection of personal data, as guaranteed by the Charter of Fundamental Rights, nothing in this Regulation should be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications.
2023/07/28
Committee: LIBE
Amendment 309 #
Proposal for a regulation
Article 14 – paragraph 3 – point a
(a) identification details of the judicial or independent administrative authority issuing the removal order and authentication of the removal order by that authority;
2023/05/08
Committee: FEMM
Amendment 310 #
Proposal for a regulation
Article 14 – paragraph 3 – point c
(c) the specific service for which the removal order is issued;deleted
2023/05/08
Committee: FEMM
Amendment 311 #
Proposal for a regulation
Article 14 – paragraph 3 – point h
(h) the date, time stamp and electronic signature of the judicial or independent administrative authority issuing the removal order;
2023/05/08
Committee: FEMM
Amendment 312 #
Proposal for a regulation
Article 14 – paragraph 3 – point i
(i) easily understandable and accessible information about the redress options that the service has to make available to the addressee of the removal order in their language setting, including information about redress to a court and about the time periods applicable to such redress, taking into account the different needs of people with a disability;
2023/05/08
Committee: FEMM
Amendment 313 #
Proposal for a regulation
Article 14 – paragraph 4 – subparagraph 1
The judicial authority or the independent administrative authority issuing the removal order shall address it to the main establishment of the provider or, where applicable, to its legal representative designated in accordance with Article 24.
2023/05/08
Committee: FEMM
Amendment 314 #
Proposal for a regulation
Article 14 – paragraph 5 – subparagraph 1
If the provider cannot execute the removal order on grounds of force majeure or de facto impossibility not attributable to it, including for objectively justifiable technical or operational reasons, it shall, without undue delay, inform the Coordinating Authority of establishment of those grounds including evidence, using the template set out in Annex V.
2023/05/08
Committee: FEMM
Amendment 315 #
Proposal for a regulation
Article 14 – paragraph 6 – subparagraph 1
If the provider cannot execute the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, request the necessary clarification to the Coordinating Authority of establishment, using the template set out in Annex V. The Coordinating Authority shall reply without undue delay, at maximum within two days.
2023/05/08
Committee: FEMM
Amendment 316 #
Proposal for a regulation
Article 15 – paragraph 1
1. Providers of hosting services or publicly available number-independent interpersonal communications services that have received a removal order issued in accordance with Article 14, as well as the users who engaged with the material, shall have the right to an effective and, if applicable, collective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the removal order.
2023/05/08
Committee: FEMM
Amendment 317 #
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1
When the removal order becomes final, the competent judicial authority or independent administrative authority that issued the removal order shall, without undue delay, transmit a copy thereof to the Coordinating Authority of establishment. The Coordinating Authority of establishment shall then, without undue delay, transmit a copy thereof to all other Coordinating Authorities through the system established in accordance with Article 39(2).
2023/05/08
Committee: FEMM
Amendment 318 #
Proposal for a regulation
Article 15 – paragraph 3 – point b
(b) the reasons for the removal or disabling, providing a copy of the removal order upon the user’s request;
2023/05/08
Committee: FEMM
Amendment 319 #
Proposal for a regulation
Article 15 – paragraph 3 – subparagraph 1 (new)
(d) information about external resources and services in the user’s region on preventing child sexual abuse, counselling by help-lines, information on victim support and educational resources provided by hotlines and child protection organisations;
(e) this information shall be provided in an easily understandable, accessible manner, in the language setting of the user, taking into account the different needs of persons with a disability.
2023/05/08
Committee: FEMM
Amendment 321 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 1
The Coordinating Authority of establishment may request, when requesting the judicial authority or independent administrative authority issuing the removal order, and after having consulted with relevant public authorities, that the provider is not to disclose any information regarding the removal of or disabling of access to the child sexual abuse material, where and to the extent necessary to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.deleted
2023/05/08
Committee: FEMM
Amendment 322 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 1
The Coordinating Authority of establishment may request, when requesting the judicial authority or independent administrative authority issuing the removal order, and after having consulted with relevant public authorities, that the provider is not to disclose any information regarding the removal of or disabling of access to the child sexual abuse material, where and to the extent necessary to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
2023/05/08
Committee: FEMM
Amendment 323 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 2
In such a case: (a) the judicial authority or independent administrative authority issuing the removal order shall set the time period not longer than necessary and not exceeding six weeks, during which the provider is not to disclose such information; (b) the obligations set out in paragraph 3 shall not apply during that time period; (c) that judicial authority or independent administrative authority shall inform the provider of its decision, specifying the applicable time period.deleted
2023/05/08
Committee: FEMM
Amendment 324 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 2 – point a
(a) the judicial authority or independent administrative authority issuing the removal order shall set the time period not longer than necessary and not exceeding six weeks, during which the provider is not to disclose such information;deleted
2023/05/08
Committee: FEMM
Amendment 325 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 2 – point b
(b) the obligations set out in paragraph 3 shall not apply during that time period;deleted
2023/05/08
Committee: FEMM
Amendment 326 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 2 – point c
(c) that judicial authority or independent administrative authority shall inform the provider of its decision, specifying the applicable time period.deleted
2023/05/08
Committee: FEMM
Amendment 327 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 3
That judicial authority or independent administrative authority may decide to extend the time period referred to in the second subparagraph, point (a), by a further time period of maximum six weeks, where and to the extent the non- disclosure continues to be necessary. In that case, that judicial authority or independent administrative authority shall inform the provider of its decision, specifying the applicable time period. Article 14(3) shall apply to that decision.deleted
2023/05/08
Committee: FEMM
Amendment 329 #
Proposal for a regulation
Article 16
[...]deleted
2023/05/08
Committee: FEMM
Amendment 330 #
Proposal for a regulation
Article 16 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or an independent administrative authority of that Member State to issue a blocking order requiring a provider of internet access services under the jurisdiction of that Member State to take reasonable measures to prevent users from accessing known child sexual abuse material indicated by all uniform resource locators on the list of uniform resource locators included in the database of indicators, in accordance with Article 44(2), point (b) and provided by the EU Centre.deleted
2023/05/08
Committee: FEMM
Amendment 331 #
Proposal for a regulation
Article 16 – paragraph 1
1. Once all other means available to remove abuse material have been exhausted, the Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or an independent administrative authority of that Member State to issue a blocking order requiring a provider of internet access services under the jurisdiction of that Member State to take reasonable measures to prevent users from accessing known child sexual abuse material indicated by all uniform resource locators on the list of uniform resource locators included in the database of indicators, in accordance with Article 44(2), point (b) and provided by the EU Centre.
2023/05/08
Committee: FEMM
Amendment 332 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 1
The Coordinating Authority of establishment shall, before requesting the issuance of a blocking order, carry out all investigations and assessments necessary to determine whether the conditions of paragraph 4 have been met.deleted
2023/05/08
Committee: FEMM
Amendment 333 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 2
To that end, it shall, where appropriate: (a) verify that, in respect of all or a representative sample of the uniform resource locators on the list referred to in paragraph 1, the conditions of Article 36(1), point (b), are met, including by carrying out checks to verify in cooperation with the EU Centre that the list is complete, accurate and up-to-date; (b) require the provider to submit, within a reasonable time period set by that Coordinating Authority, the necessary information, in particular regarding the accessing or attempting to access by users of the child sexual abuse material indicated by the uniform resource locators, regarding the provider’s policy to address the risk of dissemination of the child sexual abuse material and regarding the provider’s financial and technological capabilities and size; (c) request the EU Centre to provide the necessary information, in particular explanations and assurances regarding the accuracy of the uniform resource locators in indicating child sexual abuse material, regarding the quantity and nature of that material and regarding the verifications by the EU Centre and the audits referred to in Article 36(2) and Article 46(7), respectively; (d) request any other relevant public authority or relevant experts or entities to provide the necessary information.deleted
2023/05/08
Committee: FEMM
Amendment 334 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 2 – point a
(a) verify that, in respect of all or a representative sample of the uniform resource locators on the list referred to in paragraph 1, the conditions of Article 36(1), point (b), are met, including by carrying out checks to verify in cooperation with the EU Centre that the list is complete, accurate and up-to-date;deleted
2023/05/08
Committee: FEMM
Amendment 335 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 2 – point b
(b) require the provider to submit, within a reasonable time period set by that Coordinating Authority, the necessary information, in particular regarding the accessing or attempting to access by users of the child sexual abuse material indicated by the uniform resource locators, regarding the provider’s policy to address the risk of dissemination of the child sexual abuse material and regarding the provider’s financial and technological capabilities and size;deleted
2023/05/08
Committee: FEMM
Amendment 336 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 2 – point c
(c) request the EU Centre to provide the necessary information, in particular explanations and assurances regarding the accuracy of the uniform resource locators in indicating child sexual abuse material, regarding the quantity and nature of that material and regarding the verifications by the EU Centre and the audits referred to in Article 36(2) and Article 46(7), respectively;deleted
2023/05/08
Committee: FEMM
Amendment 337 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 2 – point d
(d) request any other relevant public authority or relevant experts or entities to provide the necessary information.deleted
2023/05/08
Committee: FEMM
Amendment 338 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 2 – point d a (new)
(da) the content is hosted outside of the European Union or territories under the jurisdiction of its member states, by an entity that has no legal representative in the European Union or territories under the jurisdiction of its member states;
2023/05/08
Committee: FEMM
Amendment 339 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 2 – point d b (new)
(db) the coordinating authority, EU centre and national law enforcement organisations have taken all possible measures to have the content removed, including: i. contacting the hosting service where the material is stored in order to request removal; ii. contacting law enforcement in the country where the content is hosted to request their assistance in removing the material;
2023/05/08
Committee: FEMM
Amendment 340 #
Proposal for a regulation
Article 16 – paragraph 3
3. The Coordinating Authority of establishment shall, before requesting the issuance of the blocking order, inform the provider of its intention to request the issuance of the blocking order, specifying the main elements of the content of the intended blocking order and the reasons to request the blocking order. It shall afford the provider an opportunity to comment on that information, within a reasonable time period set by that Coordinating Authority.deleted
2023/05/08
Committee: FEMM
Amendment 341 #
Proposal for a regulation
Article 16 – paragraph 4 – subparagraph 1
The Coordinating Authority of establishment shall request the issuance of the blocking order, and the competent judicial authority or independent authority shall issue the blocking order, where it considers that the following conditions are met: (a) there is evidence of the service having been used during the past 12 months, to an appreciable extent, for accessing or attempting to access the child sexual abuse material indicated by the uniform resource locators; (b) the blocking order is necessary to prevent the dissemination of the child sexual abuse material to users in the Union, having regard in particular to the quantity and nature of that material, the need to protect the rights of the victims and the existence and implementation by the provider of a policy to address the risk of such dissemination; (c) the uniform resource locators indicate, in a sufficiently reliable manner, child sexual abuse material; (d) the reasons for issuing the blocking order outweigh negative consequences for the rights and legitimate interests of all parties affected, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties, including the exercise of the users’ freedom of expression and information and the provider’s freedom to conduct a business.deleted
2023/05/08
Committee: FEMM
Amendment 342 #
Proposal for a regulation
Article 16 – paragraph 4 – subparagraph 1 – point a
(a) there is evidence of the service having been used during the past 12 months, to an appreciable extent, for accessing or attempting to access the child sexual abuse material indicated by the uniform resource locators;deleted
2023/05/08
Committee: FEMM
Amendment 343 #
Proposal for a regulation
Article 16 – paragraph 4 – subparagraph 1 – point b
(b) the blocking order is necessary to prevent the dissemination of the child sexual abuse material to users in the Union, having regard in particular to the quantity and nature of that material, the need to protect the rights of the victims and the existence and implementation by the provider of a policy to address the risk of such dissemination;deleted
2023/05/08
Committee: FEMM
Amendment 344 #
Proposal for a regulation
Article 16 – paragraph 4 – subparagraph 1 – point c
(c) the uniform resource locators indicate, in a sufficiently reliable manner, child sexual abuse material;deleted
2023/05/08
Committee: FEMM
Amendment 345 #
Proposal for a regulation
Article 16 – paragraph 4 – subparagraph 1 – point d
(d) the reasons for issuing the blocking order outweigh negative consequences for the rights and legitimate interests of all parties affected, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties, including the exercise of the users’ freedom of expression and information and the provider’s freedom to conduct a business.deleted
2023/05/08
Committee: FEMM
Amendment 346 #
Proposal for a regulation
Article 16 – paragraph 4 – subparagraph 2
When assessing whether the conditions of the first subparagraph have been met, account shall be taken of all relevant facts and circumstances of the case at hand, including any information obtained pursuant to paragraph 2 and the views of the provider submitted in accordance with paragraph 3.deleted
2023/05/08
Committee: FEMM
Amendment 347 #
Proposal for a regulation
Article 16 – paragraph 5
5. The Coordinating Authority of establishment when requesting the issuance of blocking orders, and the competent judicial or independent administrative authority when issuing the blocking order, shall: (a) specify effective and proportionate limits and safeguards necessary to ensure that any negative consequences referred to in paragraph 4, point (d), remain limited to what is strictly necessary; (b) subject to paragraph 6, ensure that the period of application remains limited to what is strictly necessary.deleted
2023/05/08
Committee: FEMM
Amendment 348 #
Proposal for a regulation
Article 16 – paragraph 5 – point a
(a) specify effective and proportionate limits and safeguards necessary to ensure that any negative consequences referred to in paragraph 4, point (d), remain limited to what is strictly necessary;deleted
2023/05/08
Committee: FEMM
Amendment 349 #
Proposal for a regulation
Article 16 – paragraph 5 – point b
(b) subject to paragraph 6, ensure that the period of application remains limited to what is strictly necessary.deleted
2023/05/08
Committee: FEMM
Amendment 350 #
Proposal for a regulation
Article 16 – paragraph 6 – subparagraph 1
The Coordinating Authority shall specify in the blocking order the period during which it applies, indicating the start date and the end date.deleted
2023/05/08
Committee: FEMM
Amendment 351 #
Proposal for a regulation
Article 16 – paragraph 6 – subparagraph 2
The period of application of blocking orders shall not exceed five years.deleted
2023/05/08
Committee: FEMM
Amendment 352 #
Proposal for a regulation
Article 16 – paragraph 6 – subparagraph 2
The period of application of blocking orders shall not exceed five yearstwo years. Should the Coordinating Authority wish to renew or extend blocking, they must show that renewed attempts have been made to have the content removed within four months of the renewal or extension.
2023/05/08
Committee: FEMM
Amendment 353 #
Proposal for a regulation
Article 16 – paragraph 7 – subparagraph 1
In respect of the blocking orders that the competent judicial authority or independent administrative authority issued at its request, the Coordinating Authority shall, where necessary and at least once every year, assess whether any substantial changes to the grounds for issuing the blocking orders occurred and, in particular, whether the conditions of paragraph 4 continue to be met.deleted
2023/05/08
Committee: FEMM
Amendment 353 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficientthe provider refuses to cooperate by putting in place the mitigating measures aimed to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request, as a measure of last resort, the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services. Such detection orders shall be issued with regards to the technical capacity of the provider, and shall in no way be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content and communications.
2023/07/28
Committee: LIBE
Amendment 354 #
Proposal for a regulation
Article 16 – paragraph 7 – subparagraph 2
That Coordinating Authority shall request to the competent judicial authority or independent administrative authority that issued the blocking order the modification or revocation of such order, where necessary in the light of the outcome of that assessment or to take account of justified requests or the reports referred to in Article 18(5) and (6), respectively. The provisions of this Section shall apply to such requests, mutatis mutandis.deleted
2023/05/08
Committee: FEMM
Amendment 372 #
Proposal for a regulation
Article 18
[...]deleted
2023/05/08
Committee: FEMM
Amendment 373 #
Proposal for a regulation
Article 18 – paragraph 1
1. Providers of internet access services that have received a blocking order, as well as users who provided or were prevented from accessing a specific item of material indicated by the uniform resource locators in execution of such orders, shall have a right to effective redress. That right shall include the right to challenge the blocking order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the blocking order.deleted
2023/05/08
Committee: FEMM
Amendment 373 #
Proposal for a regulation
Recital 23
(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is targeted and specifiedjustified, proportionate and related only to an identifiable part of the specific service, user or group of users, as well as targeted and limited in time so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
2023/07/28
Committee: LIBE
Amendment 374 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1
When the blocking order becomes final, the competent judicial authority or independent administrative authority that issued the blocking order shall, without undue delay, transmit a copy thereof to the Coordinating Authority of establishment. The Coordinating Authority of establishment shall then, without undue delay, transmit a copy thereof to all other Coordinating Authorities through the system established in accordance with Article 39(2).deleted
2023/05/08
Committee: FEMM
Amendment 375 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 2
For the purpose of the first subparagraph, a blocking order shall become final upon the expiry of the time period for appeal where no appeal has been lodged in accordance with national law or upon confirmation of the removal order following an appeal.deleted
2023/05/08
Committee: FEMM
Amendment 376 #
Proposal for a regulation
Article 18 – paragraph 3
3. The provider shall establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section. It shall process such complaints in an objective, effective and timely manner.deleted
2023/05/08
Committee: FEMM
Amendment 377 #
Proposal for a regulation
Article 18 – paragraph 4
4. Where a provider prevents users from accessing the uniform resource locators pursuant to a blocking order issued in accordance with Article 17, it shall take reasonable measures to inform the users of the following: (a) the fact that it does so pursuant to a blocking order; (b) the reasons for doing so, providing, upon request, a copy of the blocking order; (c) the users’ right of judicial redress referred to in paragraph 1, their rights to submit complaints to the provider through the mechanism referred to in paragraph 3 and to the Coordinating Authority in accordance with Article 34, as well as their right to submit the requests referred to in paragraph 5.deleted
2023/05/08
Committee: FEMM
Amendment 378 #
Proposal for a regulation
Article 18 – paragraph 4 – point a
(a) the fact that it does so pursuant to a blocking order;deleted
2023/05/08
Committee: FEMM
Amendment 379 #
Proposal for a regulation
Article 18 – paragraph 4 – point b
(b) the reasons for doing so, providing, upon request, a copy of the blocking order;deleted
2023/05/08
Committee: FEMM
Amendment 380 #
Proposal for a regulation
Article 18 – paragraph 4 – point c
(c) the users’ right of judicial redress referred to in paragraph 1, their rights to submit complaints to the provider through the mechanism referred to in paragraph 3 and to the Coordinating Authority in accordance with Article 34, as well as their right to submit the requests referred to in paragraph 5.deleted
2023/05/08
Committee: FEMM
Amendment 381 #
Proposal for a regulation
Article 18 – paragraph 5 – subparagraph 1
The provider and the users referred to in paragraph 1 shall be entitled to request the Coordinating Authority that requested the issuance of the blocking order to assess whether users are wrongly prevented from accessing a specific item of material indicated by uniform resource locators pursuant to the blocking order. The provider shall also be entitled to request modification or revocation of the blocking order, where it considers it necessary due to substantial changes to the grounds for issuing the blocking orders that occurred after the issuance thereof, in particular substantial changes preventing the provider from taking the required reasonable measures to execute the blocking order,deleted
2023/05/08
Committee: FEMM
Amendment 382 #
Proposal for a regulation
Article 18 – paragraph 5 – subparagraph 2
The Coordinating Authority shall, without undue delay, diligently assess such requests and inform the provider or the user submitting the request of the outcome thereof. Where it considers the request to be justified, it shall request modification or revocation of the blocking order in accordance with Article 16(7) and inform the EU Centre.deleted
2023/05/08
Committee: FEMM
Amendment 383 #
Proposal for a regulation
Article 18 – paragraph 6
6. Where the period of application of the blocking order exceeds 24 months, the Coordinating Authority of establishment shall require the provider to report to it on the measures taken to execute the blocking order, including the safeguards provided for, at least once, halfway through the period of application.deleted
2023/05/08
Committee: FEMM
Amendment 383 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use ofIn accordance with Article 6a, nothing in this regulation shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encryptied con technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of childrennt or communications through client-side scanning with side- channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
2023/07/28
Committee: LIBE
Amendment 386 #
Proposal for a regulation
Article 20 – title
20 Victims’ right to information and support
2023/05/08
Committee: FEMM
Amendment 387 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1
Persons residingVictims of child sexual abuse material hosted or disseminated in the Union or their representatives and persons in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where they reside,live or Coordinating Authority of their choosing, easily understandable and accessible information regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12. The request can cover both an occasional request as well as a periodic request. Persons with disabilities shall have the right to ask and receive such an information in a manner accessible to them., and the information in question should be given on the basis of the indicated language by that person
2023/05/08
Committee: FEMM
Amendment 389 #
Proposal for a regulation
Recital 26 a (new)
(26a) End-to-end encryption is an essential tool to guarantee the security, privacy and confidentiality of the communications between users, including those of children. Any weakening of the end-to-end encryption's effect could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. As compromising the integrity of end-to-end encrypted content and communications shall be understood the processing of any data, that would compromise or put at risk the integrity and confidentiality of the aforementioned end-to-end encrypted content. Nothing in this regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provide third party actors access to the end-to-end encrypted content and communications.
2023/07/28
Committee: LIBE
Amendment 390 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1 a (new)
Victims of child sexual abuse or their representatives and persons living in the Union shall have the right to receive, upon their request, from the Coordinating Authority information regarding victim’s rights, support and assistance. The information shall be age-appropriate, accessible and gender-sensitive and shall include at a minimum: (a) the type of support they can obtain and from whom, including, where relevant, basic information about access to medical support, any specialist support, including psychological or social support, and alternative accommodation; (b) the procedures for making complaints with regard to a criminal offence and their role in connection with such procedures; (c) how and under what conditions they can obtain protection, including protection measures; (d) how and under what conditions they can access legal advice, legal aid and any other sort of advice; (e) how and under what conditions they can access compensation; (f) how and under what conditions they are entitled to interpretation and translation;
2023/05/08
Committee: FEMM
Amendment 392 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1 b (new)
In case a victim or victim representative indicates the preference for a periodic request, the Coordinating Authority shall submit without delay, the information referred to in paragraph 3 proactively to the requester after the first submitted reply, in any new instances of reports referred to in paragraph 1 on a weekly basis. Victims or victim representatives can terminate the periodic request at any time by notifying the Coordinating Authority in question.
2023/05/08
Committee: FEMM
Amendment 394 #
Proposal for a regulation
Article 20 – paragraph 2 – point b
(b) where applicable, the individual or entity that is to receive the information on behalfformally assisting or representing the person that is to receive the information on behalf of the person making the request with verifiable proof of approval of the person making the request;
2023/05/08
Committee: FEMM
Amendment 395 #
Proposal for a regulation
Article 20 – paragraph 2 – point c
(c) sufficient elements to demonstrverify thate the identity ofchild sexual abuse material in question matches with the person making the request.
2023/05/08
Committee: FEMM
Amendment 396 #
Proposal for a regulation
Article 20 – paragraph 2 – subparagraph 1 (new)
(d) an indication if the request is occasional or should cover a certain time period
2023/05/08
Committee: FEMM
Amendment 397 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the identification of the provider(s) that submitted the report;
2023/05/08
Committee: FEMM
Amendment 398 #
Proposal for a regulation
Article 20 – paragraph 3 – point b
(b) the date(s) of the report(s);
2023/05/08
Committee: FEMM
Amendment 399 #
Proposal for a regulation
Article 20 – paragraph 3 – point c
(c) whether the EU Centre forwarded the report(s) in accordance with Article 48(3) and, if so, to which authorities;
2023/05/08
Committee: FEMM
Amendment 400 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
(d) whether the provider reported having removed or disabled access to the material, and in case, including all related information, in accordance with Article 13(1), point (i).
2023/05/08
Committee: FEMM
Amendment 402 #
Proposal for a regulation
Article 20 – paragraph 3 – subparagraph 1 (new)
(e) if there were appeals to such removal and all related information (f) new relevant age-appropriate, accessible and gender-sensitive information on victim support and assistance in the victim’s region
2023/05/08
Committee: FEMM
Amendment 404 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of hosting services shall provide reasonablewithout delay, assistance, on request, to persons residingvictims of child sexual abuse material hosted or disseminated in the Union or their representatives or persons in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
2023/05/08
Committee: FEMM
Amendment 406 #
Proposal for a regulation
Article 21 – paragraph 1 a (new)
1a. Professionals likely to come into contact with victims of child sexual abuse shall be adequately trained to deal with such victims, taking into account gender sensitivities.
2023/05/08
Committee: FEMM
Amendment 407 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Persons residingVictims of child sexual abuse material hosted or disseminated in the Union or their representatives or persons in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting or the Coordinating Authority of their choosing, age appropriate and gender-sensitive information on support for removal, including support from civil society organisations, hotlines and from the EU Centre when they seek to have a provider of hosting services or publicly available number-independent interpersonal communications services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask and receive any information relating to such support in a manner accessible to them.
2023/05/08
Committee: FEMM
Amendment 410 #
Proposal for a regulation
Article 21 – paragraph 3
3. The requests referred to in paragraphs 1 and 2 shall indicate the relevant item or items of child sexual abuse material and any other relevant information.
2023/05/08
Committee: FEMM
Amendment 411 #
Proposal for a regulation
Article 21 – paragraph 4 – point b
(b) verifying whether and when the provider removed or disabled access to that item or those items, including by conducting the searches referred to in Article 49(1);
2023/05/08
Committee: FEMM
Amendment 412 #
Proposal for a regulation
Article 21 – paragraph 4 – point d
(d) where necessary, informing the Coordinating Authority of establishment of the presence of that item or those items on the provider’s service, with a view to the issuance of a removal order pursuant to Article 14 and the obligations under Article 21.
2023/05/08
Committee: FEMM
Amendment 413 #
Proposal for a regulation
Article 21 – paragraph 4 – subparagraph 1 (new)
(e) information regarding victim’s rights, assistance and support pursuant to Article 21.
2023/05/08
Committee: FEMM
Amendment 414 #
Proposal for a regulation
Article 21 – paragraph 4 a (new)
4a. The EU Centre shall provide a “Take it Down” service which: allows victims to flag abuse material depicting them, and store a fingerprint of that material in a database and allows participating interpersonal communications services and hosting services, including social networks, to voluntarily check images uploaded to their platforms against this database. Participating services shall: (a). take the following measures when a match is found: (i). inform the uploader that the image they are attempting to upload has been identified as child sexual abuse material, and prevent upload; (ii). give the uploader the option to contest the flagging, forwarding the image and fingerprint on to the EU centre for further analysis; (iii). allow the uploader to provide further information to the EU Centre on the origin of the image. (b). state clearly that uploads are checked against a database of known abuse material; (c). provide anonymised statistics to the EU centre on the number of times an upload of an image with a certain hash was attempted.
2023/05/08
Committee: FEMM
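For illustration only, the flagging and upload-checking flow that Amendment 339 #/414 describes for the proposed “Take it Down” service could be sketched roughly as follows. This is a minimal, hypothetical model (all class and method names are invented): it uses an exact SHA-256 digest as the "fingerprint" for simplicity, whereas a real deployment would use a perceptual hash so that re-encoded copies of an image still match, and it keeps everything in memory rather than in the EU Centre's database.

```python
import hashlib
from collections import Counter

class TakeItDownRegistry:
    """Hypothetical sketch of the flow in the amendment above.

    Assumption: an exact SHA-256 digest stands in for the stored
    "fingerprint"; production systems would use a perceptual hash.
    """

    def __init__(self):
        self.fingerprints = set()     # fingerprints of victim-flagged material
        self.match_stats = Counter()  # anonymised per-hash match counts, point (c)
        self.contested = []           # (hash, context) forwarded for analysis

    @staticmethod
    def fingerprint(image_bytes: bytes) -> str:
        return hashlib.sha256(image_bytes).hexdigest()

    def flag(self, image_bytes: bytes) -> str:
        """Victim flags material; only the fingerprint is stored."""
        h = self.fingerprint(image_bytes)
        self.fingerprints.add(h)
        return h

    def check_upload(self, image_bytes: bytes) -> dict:
        """Participating service voluntarily checks an upload."""
        h = self.fingerprint(image_bytes)
        if h in self.fingerprints:
            self.match_stats[h] += 1        # anonymised statistic, point (c)
            return {
                "allowed": False,           # prevent upload, point (a)(i)
                "reason": "matched flagged material",
                "can_contest": True,        # uploader may contest, point (a)(ii)
            }
        return {"allowed": True}

    def contest(self, image_bytes: bytes, context: str) -> None:
        """Uploader contests a match; fingerprint and any information on
        the image's origin are forwarded on, points (a)(ii)-(iii)."""
        self.contested.append((self.fingerprint(image_bytes), context))
```

The key privacy property the amendment relies on is that only fingerprints, never the images themselves, leave the victim's device and are stored centrally.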
Amendment 418 #
Proposal for a regulation
Article 25 – paragraph 1
1. Member States shall, by [Date - two months from the date of entry into force of this Regulation], designate one or more competent authorities as responsible for the application and enforcement of this Regulation and to the achievement of the objective of this Regulation and enforcement of Directive 2011/93/EU (‘competent authorities’).
2023/05/08
Committee: FEMM
Amendment 419 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 2
The Coordinating Authority shall be responsible for all matters related to application and enforcement of this Regulation, and to the achievement of the objective of this Regulation and enforcement of Directive 2011/93/EU in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities.
2023/05/08
Committee: FEMM
Amendment 420 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 3
The Coordinating Authority shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective, efficient and consistent application and enforcement of this Regulation and Directive 2011/93/EU throughout the Union.
2023/05/08
Committee: FEMM
Amendment 422 #
Proposal for a regulation
Article 25 – paragraph 5
5. Each Member State shall ensure that a sufficiently staffed contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation and enforcement of Directive 2011/93/EU in that Member State. Member States shall make the information on the contact point widely accessible through gender-sensitive and age-appropriate online and offline awareness raising campaigns and communicate this information to the EU Centre. They shall keep that information updated.
2023/05/08
Committee: FEMM
Amendment 425 #
Proposal for a regulation
Article 25 – paragraph 7 – point a
(a) provide certain information or technical expertise on matters covered by this Regulation;
2023/05/08
Committee: FEMM
Amendment 427 #
Proposal for a regulation
Article 25 – paragraph 7 – point a a (new)
(aa) provide information and expertise on gender-sensitive and age-appropriate victim support and prevention of online child sexual abuse;
2023/05/08
Committee: FEMM
Amendment 429 #
Proposal for a regulation
Article 25 – paragraph 7 – point c
(c) verify the possible need to request competent judicial authorities to issue an investigation order or a removal order in respect of a service under the jurisdiction of the Member State that designated that Coordinating Authority;
2023/05/08
Committee: FEMM
Amendment 430 #
Proposal for a regulation
Article 25 – paragraph 7 – point d
(d) verify the effectiveness of an investigation order or a removal order issued upon the request of the requesting Coordinating Authority or judicial authorities.
2023/05/08
Committee: FEMM
Amendment 431 #
Proposal for a regulation
Article 25 – paragraph 8
8. The EU Centre shall provide such assistance free of charge and in accordance with its tasks and obligations under this Regulation and insofar as its resources and priorities allow.
2023/05/08
Committee: FEMM
Amendment 435 #
Proposal for a regulation
Article 26 – paragraph 1
1. Member States shall ensure that the Coordinating Authorities that they designated perform their tasks under this Regulation in an objective, impartial, transparent and timely manner, while fully respecting all fundamental rights of all parties affected. They shall also ensure that their Coordinating Authorities perform their tasks with utmost respect and sensitivity towards victims and their representatives, with a focus on avoidance of re-victimization, the safety of the victim and their needs. Member States shall also ensure that their Coordinating Authorities have adequate technical, financial and human resources to carry out their tasks.
2023/05/08
Committee: FEMM
Amendment 436 #
Proposal for a regulation
Article 26 – paragraph 2 – point a
(a) are legally and functionally independent from any other public authorities;
2023/05/08
Committee: FEMM
Amendment 437 #
Proposal for a regulation
Article 26 – paragraph 2 – point e
(e) are not charged with tasks relating to the prevention or combating of child sexual abuse, other than their tasks under this Regulation.deleted
2023/05/08
Committee: FEMM
Amendment 438 #
Proposal for a regulation
Article 26 – paragraph 3 a (new)
3a. Paragraph 2 shall not prevent the Coordinating Authority from membership in a recognised international network, as such membership shall not prejudice its independent character;
2023/05/08
Committee: FEMM
Amendment 439 #
Proposal for a regulation
Article 26 – paragraph 4
4. The Coordinating Authorities shall ensure that relevant members of staff have the required qualifications, experience and technical skills to perform their duties. They shall also ensure that members of staff coming into contact with victims are adequately and frequently trained in intersectional victim support.
2023/05/08
Committee: FEMM
Amendment 440 #
Proposal for a regulation
Article 26 – paragraph 4 a (new)
4a. The Coordinating Authorities shall ensure that the appointment of management and the hiring of staff are subject to an employment background check.
2023/05/08
Committee: FEMM
Amendment 441 #
Proposal for a regulation
Article 26 – paragraph 5
5. The management and other staff of the Coordinating Authorities shall, in accordance with Union or national law, be subject to a duty of professional secrecy both during and after their term of office, with regard to any confidential information which has come to their knowledge in the course of the performance of their tasks. Member States shall ensure that the management and other staff are subject to rules guaranteeing that they can carry out their tasks in an objective, impartial and independent manner, in particular as regards their appointment, dismissal, remuneration and career prospects. Coordinating Authorities shall take into account the application of Directive 2021/93/EU on Pay Transparency.
2023/05/08
Committee: FEMM
Amendment 442 #
Proposal for a regulation
Article 34 – paragraph 1
1. Users shall have the right to lodge a complaint against providers of relevant information society services, including through an individual or entity formally assisting or representing them, alleging an infringement of this Regulation or infringements of their fundamental rights resulting from this Regulation, with the Coordinating Authority designated by the Member State where the user resides or is established, or with a Coordinating Authority of their choosing.
2023/05/08
Committee: FEMM
Amendment 443 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. The Coordinating Authority shall offer easy-to-use mechanisms to anonymously submit information about infringements of this Regulation.
2023/05/08
Committee: FEMM
Amendment 444 #
Proposal for a regulation
Article 34 – paragraph 2
2. Coordinating Authorities shall provide age-appropriate and accessible mechanisms to submit a complaint under this Article and adopt an age-appropriate and gender-sensitive approach when handling complaints, taking due account of the person’s age, if indicated, views, needs and concerns. The processing of complaints shall take into account due diligence and shall provide necessary information to the complainant.
2023/05/08
Committee: FEMM
Amendment 445 #
Proposal for a regulation
Article 34 – paragraph 3 – subparagraph 1 a (new)
Users shall have the right to be informed of the outcome of the complaint.
2023/05/08
Committee: FEMM
Amendment 477 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 a (new)
(6a) facilitate and coordinate cooperation, including information sharing, with international law enforcement organisations and law enforcement authorities in third countries, with respect for data protection rules;
2023/05/08
Committee: FEMM
Amendment 478 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1). This shall include technologies for both preventative and proactive detection orders.
2023/05/08
Committee: FEMM
Amendment 496 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the internal market by persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
2023/07/28
Committee: LIBE
Amendment 499 #
Proposal for a regulation
Article 56 – paragraph 1
1. The Management Board shall be composed of one representative from each Member State, two representatives of the Commission, a representative of the European Parliament, and a representative from the European Data Protection Board, all as members with voting rights.
2023/05/08
Committee: FEMM
Amendment 502 #
Proposal for a regulation
Article 56 – paragraph 2 – subparagraph 1
The Management Board shall also include one independent expert observer designated by the European Parliament, without the right to vote.deleted
2023/05/08
Committee: FEMM
Amendment 504 #
Proposal for a regulation
Article 56 – paragraph 2 – subparagraph 2
Europol shall designate a representative to attend the meetings of the Management Board as an observer on matters involving Europol, at the request of the Chairperson of the Management Board.
2023/05/08
Committee: FEMM
Amendment 524 #
Proposal for a regulation
Article 66 – paragraph 1
1. The Technology Committee shall consist of technical experts appointed by the Management Board in view of their excellence and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union, as well as a representative from the European Data Protection Board.
2023/05/08
Committee: FEMM
Amendment 526 #
Proposal for a regulation
Article 66 a (new)
Article 66a
Establishment and tasks of the Survivors’ Advisory Board
1. The Survivors’ Advisory Board shall consist of seven members who are either survivors and victims of child sexual abuse or experts on the needs of survivors and victims of child sexual abuse, and shall be appointed by the Management Board in view of their personal experience where applicable, their expertise and their scope of work, following the publication of a call for expressions of interest in the Official Journal of the European Union. The Survivors’ Advisory Board shall achieve a gender balance as well as ensure representation of all protected characteristics.
2. Procedures concerning the appointment of the members of the Survivors’ Advisory Board and its operation shall be further specified in the rules of procedure of the Management Board and shall be made public.
3. The members of the Survivors’ Advisory Board shall act in the interest of child sexual abuse victims. The EU Centre shall publish the list of members of the Survivors’ Advisory Board on its website and keep it up to date.
4. If a member no longer meets the criterion of independence, he or she shall inform the Management Board. The Management Board may, on the proposal of at least one third of its members or of the Commission, determine a lack of independence and revoke the appointment of the person concerned. The Management Board shall appoint a new member for the remaining term of office in accordance with the procedure applicable to ordinary members. If a member resigns before the expiry of his or her term of office, he or she shall be replaced for the remaining term of office in accordance with the procedure applicable to ordinary members.
5. The term of office of the members of the Survivors’ Advisory Board shall be four years. It may be renewed once.
6. The Executive Director and the Management Board shall consult the Survivors’ Advisory Board on any matter relating to victims’ rights and to preventing and combating child sexual abuse, and shall hold a structured consultation at least twice a year.
7. The Survivors’ Advisory Board shall have the following tasks:
(a) ensure visibility of the interests and needs of survivors and victims of child sexual abuse;
(b) advise the Management Board on matters set out in Article 57, point (ha);
(c) advise the Executive Director and the Management Board as set out in paragraph 6 of this Article;
(d) contribute experience and expertise in preventing and combating child sexual abuse and in victim support and assistance;
(e) serve as a platform for survivors of child sexual abuse to exchange and connect;
(f) provide an annual activity report to the Executive Director as part of the Consolidated Annual Activity Report.
2023/05/08
Committee: FEMM
Amendment 594 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) “person suspected of being involved in child sexual abuse” means an identified individual person about whom verifiable adequate evidence exists which gives rise to the suspicion that that person has committed a child sexual abuse offence, has attempted to commit a child sexual abuse offence, or has prepared, by committing a criminal offence, to commit a child sexual abuse offence;
2023/07/28
Committee: LIBE
Amendment 596 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) 'person disqualified from exercising activities involving children' means an identified individual person who, in line with Article 10 of Directive 2011/93/EU, is temporarily or permanently disqualified from exercising activities involving direct and regular contacts with children;
2023/07/28
Committee: LIBE
Amendment 695 #
Proposal for a regulation
Article 3 – paragraph 2 a (new)
2a. The provider, where applicable, shall assess, in a separate section of its risk assessment, the voluntary use of specific technologies for the processing of personal and other data to the extent strictly necessary to detect, to report and to remove online child sexual abuse material from its services. Such voluntary use of specific technologies shall under no circumstances undermine the integrity and confidentiality of end-to-end encrypted content and communications.
2023/07/28
Committee: LIBE
Amendment 807 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take reasonable and proportionate mitigation measures.
2023/07/28
Committee: LIBE
Amendment 861 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) inform the software application provider concerned and the EU Centre about the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
2023/07/28
Committee: LIBE
Amendment 868 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).deleted
2023/07/28
Committee: LIBE
Amendment 870 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
1a. Providers of software applications who have been informed that in relation to their software applications a significant risk of use of the service concerned for the purpose of the solicitation of children has been identified, shall take reasonable and proportionate mitigation measures.
2023/07/28
Committee: LIBE
Amendment 875 #
Proposal for a regulation
Article 6 a (new)
Article 6a
End-to-end encrypted services
Nothing in this Regulation shall be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. The processing of any data that would compromise or put at risk the integrity and confidentiality of the content and communications in the end-to-end encryption shall be understood as compromising the integrity of end-to-end encrypted content and communications. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communications services provides third-party actors access to the end-to-end encrypted content.
2023/07/28
Committee: LIBE
Amendment 890 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse in the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
2023/07/28
Committee: LIBE
Amendment 1017 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall, in accordance with Article 8 of Regulation (EU) 2022/2065, target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary, justifiable and proportionate to effectively address the significant risk referred to in point (a) thereof, and limit the detection order to an identifiable part or component of a service, such as a specific channel of communication or a specific group of users identified with particularity for which the significant risk has been identified. In accordance with Article 6a, no such detection order shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content and communications.
2023/07/28
Committee: LIBE
Amendment 1128 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services that have received a detection order concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
2023/07/28
Committee: LIBE
Amendment 1266 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States to one or more specific items of material that, after a diligent assessment, courts identified as constituting child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1294 #
Proposal for a regulation
Chapter II – Section 5
[...]deleted
2023/07/28
Committee: LIBE
Amendment 1332 #
Proposal for a regulation
Article 19 a (new)
Article 19a
Respect for Privacy
Nothing in this Regulation shall be interpreted as a requirement to:
1. break cryptography;
2. scan content on users’ devices;
3. restrict anonymous access to online services and software applications.
2023/07/28
Committee: LIBE
Amendment 1698 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1) concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
2023/07/28
Committee: LIBE
Amendment 1701 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
To that aim, the EU Centre shall compile lists of such technologies, having regard to the requirements of this Regulation and in particular those of Article 10(2) and Article 19a (new).
2023/07/28
Committee: LIBE