
70 Amendments of Michal ŠIMEČKA related to 2022/0155(COD)

Amendment 46 #
Proposal for a regulation
Citation 6 a (new)
Having regard to the complementary impact assessment37a of the European Parliament, _________________ 37a PE 740.248 https://www.europarl.europa.eu/RegData/etudes/STUD/2023/740248/EPRS_STU(2023)740248_EN.pdf
2023/05/08
Committee: FEMM
Amendment 48 #
Proposal for a regulation
Recital 1
(1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. While the proportion of child sexual abuse material that affects boys is growing, child sexual abuse has a disproportionate impact on girls: the vast majority of child sexual abuse material depicts girls, girls are overrepresented in cases of solicitation of children, and men are overrepresented as perpetrators. According to reports, an estimated 96% of child sexual abuse material in 2021 depicted girls. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), in line with the United Nations Convention on the Rights of the Child (UNCRC), which has been ratified by all Member States, and to protect society at large. Users of such services offered in the Union should be able to trust that the services concerned can be used safely, especially by children.
2023/05/08
Committee: FEMM
Amendment 53 #
Proposal for a regulation
Recital 1 a (new)
(1a) A growing number of teenagers are sharing intimate images, despite this being prohibited in a majority of Member States. The implementation of measures to detect new abuse material would inevitably flag all such images as abuse material, resulting in a large number of false positives, but also in the investigation of those teenagers. This would significantly infringe on children’s right to privacy, as guaranteed by the Charter of Fundamental Rights of the European Union (‘Charter’), and in line with the United Nations Convention on the Rights of the Child (UNCRC), which has been ratified by all Member States. It would also result in stigma that disproportionately affects girls37b. Therefore, services should warn children about the risks of sharing images, and give them guidance on what to do if they do so and something goes wrong. _________________ 37b The outcomes of sexting for children and adolescents: A systematic review of the literature https://doi.org/10.1016/j.adolescence.2021.08.009
2023/05/08
Committee: FEMM
Amendment 56 #
Proposal for a regulation
Recital 1 b (new)
(1b) Often, teenagers are manipulated into sharing images, or consensually share images which are later shared without their consent. This proposal should provide teenagers with tools to help prevent images from being shared without their consent, in particular through the possibility to submit the image to an EU "take it down" service, which prevents the image from being uploaded to social media websites.
2023/05/08
Committee: FEMM
Amendment 57 #
Proposal for a regulation
Recital 1 c (new)
(1c) The use of software to detect solicitation of children is insufficiently accurate, which means it could result in false positives, or could inadvertently flag child-to-child communications. This poses significant risks, in particular to LGBTQI+ children in hostile households.
2023/05/08
Committee: FEMM
Amendment 58 #
Proposal for a regulation
Recital 1 d (new)
(1d) The combined risks of attempting to detect unknown abuse material and solicitation pose a significant risk to children, and these technologies are also vulnerable to being bypassed by abusers, rendering them ineffective. This legislation should therefore focus on detecting known content, flagging potential solicitation to the child user in an age-appropriate manner, and reducing the creation and sharing of self-generated material.
2023/05/08
Committee: FEMM
Amendment 68 #
Proposal for a regulation
Recital 4 a (new)
(4a) The existence of child sexual abuse material implies that child sexual abuse has already taken place. Detecting abuse material is important, but prevention is also vital. Therefore, Member States should significantly strengthen educational measures to help children, teachers and social services identify and report abuse, in particular by teaching children about consent from the earliest age possible, albeit in an age-appropriate manner.
2023/05/08
Committee: FEMM
Amendment 70 #
Proposal for a regulation
Recital 4 b (new)
(4b) Many of the online risks associated with child abuse continue to pose a threat to adults, and many adults have already fallen victim. This Regulation should therefore also focus on prevention of online risks, mandating the integration into applications of features that help children learn about, identify and avoid risks, making use of a “learning through doing” approach.
2023/05/08
Committee: FEMM
Amendment 72 #
Proposal for a regulation
Recital 4 c (new)
(4c) The internet is an empowering and beneficial resource for children, allowing them to socialise, learn and play; however, it can also pose significant risks. Many online services have set limits on the features accessible to children in order to mitigate these risks, but depriving children of these features often encourages them to lie about their age or to try to evade age-verification systems. Therefore, rather than prohibiting access, services should focus on adapting their features and implementing safeguards for children.
2023/05/08
Committee: FEMM
Amendment 74 #
Proposal for a regulation
Recital 4 d (new)
(4d) Developers should focus on responsibility by design, with the goal of preventing abuse, by developing risk-mitigation and safety features for applications. To achieve this, it is important that developers understand how children use their services and the threats they face. Therefore, children should be involved in the development process of risk-mitigation and safety features that are built for them.
2023/05/08
Committee: FEMM
Amendment 78 #
Proposal for a regulation
Recital 9 a (new)
(9a) Case law of the European Court of Justice43a has repeatedly found that indiscriminate monitoring of communications is incompatible with the Charter of Fundamental Rights of the European Union; therefore, detection orders should be targeted at individuals or groups suspected of child sexual abuse, and not at the wider population. _________________ 43a Cases C-511/18, C-512/18, C-520/18, and C-623/17, Court of Justice of the European Union
2023/05/08
Committee: FEMM
Amendment 79 #
Proposal for a regulation
Recital 13
(13) The term ‘online child sexual abuse’ should cover not only the dissemination of material previously detected and confirmed as constituting child sexual abuse material (‘known’ material), but also of material not previously detected but since confirmed as such (‘new’ material), as well as activities constituting the solicitation of children (‘grooming’). That is needed in order to address not only past abuse, the re-victimisation and violation of the victims’ rights it entails, such as those to privacy and protection of personal data, but to also address recent, ongoing and imminent abuse, so as to prevent it as much as possible, to effectively protect children and to increase the likelihood of rescuing victims and stopping perpetrators.
2023/05/08
Committee: FEMM
Amendment 83 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation. Providers should also assess the reasonably foreseeable negative impacts of proposed mitigation measures, and whether they would disproportionately affect a group of people on the basis of sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender or sexual orientation. Particular care should be taken to assess the impact on girls, who are at a greater risk of being subject to child sexual abuse.
2023/05/08
Committee: FEMM
Amendment 84 #
Proposal for a regulation
Recital 16 a (new)
(16a) Parental controls that allow parents to access children’s private correspondence without their consent pose a significant risk to children’s privacy, but could also put their safety at risk, in particular in the cases of children who are being abused and who are trying to seek help, and LGBTQI+ children in hostile households. Therefore, no provision in this legislation should enable or facilitate intrusions on children’s privacy.
2023/05/08
Committee: FEMM
Amendment 86 #
Proposal for a regulation
Recital 17
(17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory mitigation measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk identified and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect or prevent online child sexual abuse in their services, and to indicate, as part of the risk reporting, their willingness and preparedness to eventually be issued a detection order under this Regulation, if deemed necessary by the competent national authority.
2023/05/08
Committee: FEMM
Amendment 89 #
Proposal for a regulation
Recital 19
(19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should provide parents with information on the features in applications that present a risk to children, as well as age and gender-sensitive guidance on how to discuss those risks with children.
2023/05/08
Committee: FEMM
Amendment 91 #
Proposal for a regulation
Recital 19 a (new)
(19a) Regulation (EU) 2022/1925 (the Digital Markets Act) sets out provisions to ensure competition in mobile device ecosystems, which would allow citizens to install software on their mobile devices directly, without using software application stores, bypassing age verification at the level of software application stores. Therefore, manufacturers of operating systems deemed as gatekeepers under the Digital Markets Act should provide an application programming interface through which applications can request age verification, either through the European Digital Identity Wallet as defined in Regulation (EU) No XXX/2023 establishing a framework for a European Digital Identity, or through a third-party service. Manufacturers of operating systems deemed as gatekeepers should also provide a service to process age-verification requests in a manner that respects the privacy of the user and does not store a record of the services they accessed.
2023/05/08
Committee: FEMM
Amendment 93 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards, and targeted only at individuals suspected of child sexual abuse. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services.
2023/05/08
Committee: FEMM
Amendment 96 #
Proposal for a regulation
Recital 22
(22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighed, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the likelihood and seriousness of any potential negative consequences for other parties affected, in particular girls, and ethnic and sexual minorities. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
2023/05/08
Committee: FEMM
Amendment 97 #
Proposal for a regulation
Recital 24
(24) The competent judicial authority or the competent independent administrative authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should be in a position to take a well-informed decision on requests for the issuance of detection orders. That is of particular importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach, especially in connection with detection orders concerning the solicitation of children. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU Centre’) and, where so provided in this Regulation, the competent data protection authority designated under Regulation (EU) 2016/679 to provide their views on the measures in question. They should do so as soon as possible, having regard to the important public policy objective at stake and the need to act without undue delay to protect children. In particular, data protection authorities should do their utmost to avoid extending the time period set out in Regulation (EU) 2016/679 for providing their opinions in response to a prior consultation. Furthermore, they should normally be able to provide their opinion as quickly as possible, and well within that time period, in situations where the European Data Protection Board has already issued guidelines regarding the technologies that a provider envisages deploying and operating to execute a detection order addressed to it under this Regulation.
2023/05/08
Committee: FEMM
Amendment 101 #
Proposal for a regulation
Recital 26 a (new)
(26a) Detection of child sexual abuse in end-to-end encrypted communications is only possible by scanning those communications before they leave the abuser's device; however, this would allow abusers to interfere with the scanning process. Abusers often work in groups, allowing for rapid proliferation of technology to bypass scanning, rendering such scanning ineffective. Therefore, taking into account the limited efficacy and the negative impact on citizens' fundamental rights, detection orders should not be applicable to end-to-end encrypted communications.
2023/05/08
Committee: FEMM
Amendment 103 #
Proposal for a regulation
Recital 27
(27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to providers detection technologies that they may choose to use, on a free-of-charge basis, for the sole purpose of executing the detection orders addressed to them. The European Data Protection Board should be consulted on those technologies and the ways in which they should be best deployed to ensure compliance with applicable rules of Union law on the protection of personal data. The advice of the European Data Protection Board should be taken into account by the EU Centre when compiling the lists of available technologies and also by the Commission when preparing guidelines regarding the application of the detection obligations. The providers may operate the technologies made available by the EU Centre or by others or technologies that they developed themselves, as long as they meet the requirements of this Regulation.
2023/05/08
Committee: FEMM
Amendment 106 #
Proposal for a regulation
Recital 30
(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the ongoing investigation and prosecution of specific child sexual abuse offences.
2023/05/08
Committee: FEMM
Amendment 107 #
Proposal for a regulation
Recital 32
(32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited. For legal and practical reasons, it may not be reasonably possible to have those providers remove or disable access to the material, not even through cooperation with the competent authorities of the third country where they are established. Therefore, in line with existing practices in several Member States, it should be possible to require providers of internet access services to take reasonable measures to block the access of users in the Union to the material. However, blocking measures are easily bypassed, and do not prevent access from outside of the Union, meaning victims have to live knowing that abuse material depicting them remains online. Therefore, every effort should be made to remove material, even outside of the jurisdiction of the Union, before resorting to blocking.
2023/05/08
Committee: FEMM
Amendment 109 #
Proposal for a regulation
Recital 33
(33) In the interest of consistency, efficiency and effectiveness and to minimise the risk of circumvention, such blocking orders should be based on the list of uniform resource locators, leading to specific items of verified child sexual abuse, compiled and provided centrally by the EU Centre on the basis of diligently verified submissions by the relevant authorities of the Member States. In order to avoid the taking of unjustified or disproportionate measures, especially those that would unduly affect the fundamental rights at stake, notably, in addition to the rights of the children, the users’ freedom of expression and information and the providers’ freedom to conduct a business, appropriate limits and safeguards should be provided for. In particular, it should be ensured that the burdens imposed on the providers of internet access services concerned are not unreasonable, that the need for and proportionality of the blocking orders is diligently assessed also after their issuance and that both the providers and the users affected have effective means of judicial as well as non-judicial redress. Blocking by uniform resource locator is not technically possible, and most blocking is implemented at the level of the web domain or Internet Protocol address, which often results in significant overblocking. Therefore, the EU Centre should evaluate the risk and impact of overblocking before making a final decision on blocking.
2023/05/08
Committee: FEMM
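An illustrative aside on the overblocking concern in Amendment 109: a minimal Python sketch, using hypothetical URLs, of the difference between the URL-level list the proposal foresees and the domain-level enforcement that internet access providers can typically implement.

```python
# Illustrative sketch (not part of the amendment): why domain-level blocking
# overblocks relative to the URL-level list foreseen in Recital 33.
# All URLs below are hypothetical placeholders.
from urllib.parse import urlparse

# URL-level list, as compiled centrally by the EU Centre in the proposal.
blocked_urls = {"https://example.org/abuse/item123"}

# What providers can typically enforce in practice: the whole domain
# (or IP address) that hosts a listed URL.
blocked_domains = {urlparse(u).hostname for u in blocked_urls}

def blocked_by_url(url: str) -> bool:
    return url in blocked_urls

def blocked_by_domain(url: str) -> bool:
    return urlparse(url).hostname in blocked_domains

legitimate = "https://example.org/blog/post456"  # unrelated page, same host
print(blocked_by_url(legitimate))     # False - only the listed item is blocked
print(blocked_by_domain(legitimate))  # True  - lawful content is overblocked
```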
Amendment 118 #
Proposal for a regulation
Recital 37
(37) To ensure the efficient management of such victim support functions, victims should be allowed to contact and rely on the Coordinating Authority that is most accessible to them, which should channel all communications between victims and the EU Centre. Coordinating Authorities should provide gender- and age-sensitive support to victims, as well as psychological support. Under no circumstances should victims be blamed for what has happened to them.
2023/05/08
Committee: FEMM
Amendment 128 #
Proposal for a regulation
Recital 61
(61) The EU Centre should provide reliable information on which activities can reasonably be considered to constitute online child sexual abuse, so as to enable the detection and blocking thereof in accordance with this Regulation. Given the nature of child sexual abuse material, that reliable information needs to be provided without sharing the material itself. Therefore, the EU Centre should generate accurate and reliable indicators, based on identified child sexual abuse material and solicitation of children submitted to it by Coordinating Authorities in accordance with the relevant provisions of this Regulation. These indicators should allow technologies to detect the dissemination of either the same material (known material) or of different child sexual abuse material (new material), or the solicitation of children, as applicable.
2023/05/08
Committee: FEMM
Amendment 144 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(ba) 'safety assistant' means a tool integrated into interpersonal communications services either voluntarily or following a preventative detection order, and active only for child users of the service, which assists children in learning about, identifying and avoiding risks online, including but not limited to self-generated abuse material and solicitation;
2023/05/08
Committee: FEMM
Amendment 157 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- the integration of tools such as safety assistants to prevent child sexual abuse online;
2023/05/08
Committee: FEMM
Amendment 166 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii a (new)
(iiia) the existing measures to mitigate risks that functionalities of the application will be used for the solicitation of children, or for the sharing of abuse material, including but not limited to safety assistants, and safe defaults for visibility and reachability of children on the platform;
2023/05/08
Committee: FEMM
Amendment 180 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) providing users of interpersonal communications services, in particular children, with tools to help them learn about, identify and avoid online risks, notably through the integration of safety assistants.
2023/05/08
Committee: FEMM
Amendment 187 #
Proposal for a regulation
Article 4 – paragraph 2 – point d a (new)
(da) only introduced following an assessment of the risks the mitigating measures themselves pose for users, in particular if these risks would disproportionately negatively affect persons on the basis of sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender or sexual orientation;
2023/05/08
Committee: FEMM
Amendment 188 #
Proposal for a regulation
Article 4 – paragraph 2 – point d b (new)
(db) developed in cooperation with children who use the service;
2023/05/08
Committee: FEMM
Amendment 196 #
Proposal for a regulation
Article 4 – paragraph 5 a (new)
5a. To complement the risk mitigation measures taken by the providers, gender-sensitive and child-friendly education and prevention measures shall be implemented.
2023/05/08
Committee: FEMM
Amendment 197 #
Proposal for a regulation
Article 6
Obligations for software application stores
1. Providers of software application stores shall: (a) make reasonable efforts to assess, where possible together with the providers of software applications, whether each service offered through the software applications that they intermediate presents a risk of being used for the purpose of the solicitation of children; (b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children; (c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).
2. In assessing the risk referred to in paragraph 1, the provider shall take into account all the available information, including the results of the risk assessment conducted or updated pursuant to Article 3.
3. Providers of software application stores shall make publicly available information describing the process and criteria used to assess the risk and describing the measures referred to in paragraph 1. That description shall not include information that may reduce the effectiveness of the assessment of those measures.
4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
Article 6 deleted
2023/05/08
Committee: FEMM
Amendment 207 #
Proposal for a regulation
Article 6 a (new)
Article 6a
Obligations concerning age verification and for software application stores
1. Providers of software application stores considered as gatekeepers under the Digital Markets Act (EU) 2022/1925 shall: (a) indicate if applications contain features that could pose a risk to children; (b) indicate if measures have been taken to mitigate risks for children, and which measures have been taken; (c) provide guidance for parents on how to discuss risks with their children; (d) provide application developers with an open-source software library that enables age verification requests from inside applications both to European Digital Identity Wallets and third-party services; (e) provide, free of charge, an age-verification service that can respond to age verification requests from inside applications.
2. Providers of European Digital Identity Wallets under the Regulation (EU) No XXX/2023 establishing a framework for a European Digital Identity shall ensure European Digital Identity Wallets can respond to age verification requests from applications without revealing the identity of the user.
3. Third-party age verification services used to fulfil the obligations of this Article shall: (a) only retain user personal data for the purpose of fulfilling future requests, and with the explicit consent of the user; (b) only retain the data vital to process future verification requests, namely: i. a pseudonymous means of authenticating the user; and ii. the user's previously verified date of birth; (c) only use this data for the purpose of age verification; (d) fulfil requests for the deletion of this data pursuant to the GDPR.
4. Where developers of applications have identified a significant risk of use of the service concerned for the purpose of the solicitation of children, they shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to put in place safeguards, namely: (a) take reasonable measures to mitigate the risk, such as adapting the services to children, integrating a safety assistant or modifying or adding safeguards limiting access to certain features; (b) provide children with guidance on risks that will help them identify dangers and make more informed decisions; (c) where the application is manifestly unsuitable for children and cannot be adapted, prevent access.
5. Age verification mechanisms set out in this Article shall not be used for the purposes of enabling or facilitating parental control technologies that give access to children’s private communications without their consent.
2023/05/08
Committee: FEMM
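As an illustration of the age-verification exchange Amendment 207 describes, a minimal Python sketch under stated assumptions: all class and method names are hypothetical, and a real implementation would sit behind a gatekeeper operating-system API or the European Digital Identity Wallet rather than an in-process class.

```python
# Hypothetical sketch of the Article 6a exchange: an application asks
# "is this user at least N years old?" and receives only a yes/no
# attestation, never the identity or exact date of birth.
import hashlib
from dataclasses import dataclass
from datetime import date

@dataclass
class AgeVerificationRequest:
    requesting_app: str   # identifier of the calling application
    minimum_age: int      # threshold to attest, e.g. 13, 16 or 18

@dataclass
class AgeVerificationResponse:
    over_threshold: bool  # the only fact disclosed to the application

class AgeVerificationService:
    """Stands in for the gatekeeper-provided service or a Wallet.
    Retains only a pseudonym and a verified date of birth, as in
    Article 6a(3)(b), and keeps no record of which app asked."""

    def __init__(self) -> None:
        self._records: dict[str, date] = {}  # pseudonym -> date of birth

    def enrol(self, user_secret: str, verified_dob: date) -> None:
        pseudonym = hashlib.sha256(user_secret.encode()).hexdigest()
        self._records[pseudonym] = verified_dob

    def verify(self, user_secret: str,
               req: AgeVerificationRequest) -> AgeVerificationResponse:
        pseudonym = hashlib.sha256(user_secret.encode()).hexdigest()
        dob = self._records[pseudonym]
        today = date.today()
        age = today.year - dob.year - (
            (today.month, today.day) < (dob.month, dob.day))
        # No record is kept of which service made the request.
        return AgeVerificationResponse(over_threshold=age >= req.minimum_age)
```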
Amendment 211 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
1a. The Coordinating Authority of establishment shall choose one of the following types of detection order: (a) proactive detection orders, which detect and report known child sexual abuse material under the measures specified in Article 10; (b) preventative detection orders, which detect solicitation and attempts by children to share self-generated abuse material, and assist them in avoiding risks, under the measures specified in Article 10;
2023/05/08
Committee: FEMM
Amendment 236 #
Proposal for a regulation
Article 7 – paragraph 6
6. As regards detection orders concerning the dissemination of new child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the dissemination of new child sexual abuse material; (b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the dissemination of new child sexual abuse material; (c) for services other than those enabling the live transmission of pornographic performances as defined in Article 2, point (e), of Directive 2011/93/EU: (1) a detection order concerning the dissemination of known child sexual abuse material has been issued in respect of the service; (2) the provider submitted a significant number of reports concerning known child sexual abuse material, detected through the measures taken to execute the detection order referred to in point (1), pursuant to Article 12.
deleted
2023/05/08
Committee: FEMM
Amendment 242 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1 – introductory part
As regards preventative detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met:
2023/05/08
Committee: FEMM
Amendment 262 #
Proposal for a regulation
Article 8 – paragraph 1 – point d a (new)
(da) the type of detection order;
2023/05/08
Committee: FEMM
Amendment 280 #
Proposal for a regulation
The European Parliament rejects the Commission proposal (COM(2022)0209).
2023/07/28
Committee: LIBE
Amendment 283 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communication services that are not end-to-end encrypted, and that have received a proactive detection order, shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
2023/05/08
Committee: FEMM
Amendment 284 #
Proposal for a regulation
Article 10 – paragraph 1 a (new)
1a. Providers of hosting services and providers of interpersonal communication services that have received a preventative detection order shall execute it by integrating technologies into the software used to access their services in order to:
(a) detect when children attempt to use their services to send intimate images, and provide: (i) guidance on the risks of sharing intimate images, in particular with strangers, and a design that discourages sharing; (ii) a disclaimer about the potential illegality of sharing intimate images, even with partners, when they are under 18; (iii) specific measures to reduce the likelihood that images will be reshared, such as preventing images from leaving the software application, disallowing forwarding of images, and preventing screenshots; (iv) a “help” button, displayed prominently on any intimate image sent, which the sender can use to retract the image, to seek help and advice, or to request that the image hash be sent to a takedown service;
(b) detect potential attempted solicitation of children using their services, and provide: (i) an age-appropriate warning about the conversation, strongly advising against continuing the conversation, and in particular against sharing photos and personal information, and encouraging the child user to speak to a trusted adult; (ii) guidance for trusted adults in discussing the attempted solicitation, with an emphasis on building a relationship of trust between the child and the trusted adult; (iii) where the child does not feel comfortable sharing the conversation with a trusted adult, an option to ask moderators of the service for their advice; (iv) an option to block or report the user.
2023/05/08
Committee: FEMM
Amendment 285 #
Proposal for a regulation
Article 10 – paragraph 1 b (new)
1b. Technologies used in preventative detection orders to detect grooming shall only report detection in cases where the potential victim, trusted adult, or moderator explicitly chooses to report. Where end-to-end encryption is used, the detection should be done entirely on the user’s device.
2023/05/08
Committee: FEMM
Amendment 286 #
Proposal for a regulation
Article 10 – paragraph 1 c (new)
1c. Technologies used in preventative detection orders to detect when children attempt to use their services to send intimate images shall not report these users in any way. Where end-to-end encryption is used, the detection should be done entirely on the user’s device.
2023/05/08
Committee: FEMM
Amendment 287 #
Proposal for a regulation
Article 10 – paragraph 1 d (new)
1d. The Coordinating Authority shall be empowered to request that services take further preventative measures, so long as those measures do not involve reporting, and only after approval by the relevant Data Protection Authority.
2023/05/08
Committee: FEMM
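A minimal sketch of the consent gate set out in Amendments 284 to 286: detection runs on the device and only ever triggers local guidance, and a report leaves the device only on an explicit choice by the potential victim, a trusted adult, or a moderator. The classifier placeholder and function names are hypothetical.

```python
# Illustrative only: consent-gated, on-device handling of a suspected
# solicitation, following Article 10(1a)-(1c) as amended above.
from enum import Enum

class Action(Enum):
    SHOW_WARNING = 1      # age-appropriate guidance, shown locally
    REPORT = 2            # transmitted only on an explicit choice

def handle_suspected_solicitation(message: str,
                                  user_opts_to_report: bool) -> list[Action]:
    actions: list[Action] = []
    if looks_like_solicitation(message):  # runs entirely on the device
        actions.append(Action.SHOW_WARNING)
        if user_opts_to_report:           # never automatic (Art. 10(1b))
            actions.append(Action.REPORT)
    return actions

def looks_like_solicitation(message: str) -> bool:
    # Placeholder for an on-device classifier; a real system would use a
    # trained model, not keywords, and can tolerate false positives
    # because no automatic reporting follows from a match.
    return "send me a photo" in message.lower()
```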
Amendment 322 #
Proposal for a regulation
Article 15 – paragraph 4 – subparagraph 1
The Coordinating Authority of establishment may request, when requesting the judicial authority or independent administrative authority issuing the removal order, and after having consulted with relevant public authorities, that the provider is not to disclose any information regarding the removal of or disabling of access to the child sexual abuse material, where and to the extent necessary to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
2023/05/08
Committee: FEMM
Amendment 331 #
Proposal for a regulation
Article 16 – paragraph 1
1. Once all other means available to remove abuse material have been exhausted, the Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or an independent administrative authority of that Member State to issue a blocking order requiring a provider of internet access services under the jurisdiction of that Member State to take reasonable measures to prevent users from accessing known child sexual abuse material indicated by all uniform resource locators on the list of uniform resource locators included in the database of indicators, in accordance with Article 44(2), point (b), and provided by the EU Centre.
2023/05/08
Committee: FEMM
Amendment 338 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 2 – point d a (new)
(da) the content is hosted outside of the European Union or territories under the jurisdiction of its Member States, by an entity that has no legal representative in the European Union or territories under the jurisdiction of its Member States;
2023/05/08
Committee: FEMM
Amendment 339 #
Proposal for a regulation
Article 16 – paragraph 2 – subparagraph 2 – point d b (new)
(db) the Coordinating Authority, the EU Centre and national law enforcement organisations have taken all possible measures to have the content removed, including: i. contacting the hosting service where the material is stored in order to request removal; ii. contacting law enforcement in the country where the content is hosted to request their assistance in removing the material;
2023/05/08
Committee: FEMM
Amendment 352 #
Proposal for a regulation
Article 16 – paragraph 6 – subparagraph 2
The period of application of blocking orders shall not exceed two years. Should the Coordinating Authority wish to renew or extend blocking, it must show that renewed attempts have been made to have the content removed within four months of the renewal or extension.
2023/05/08
Committee: FEMM
Amendment 414 #
Proposal for a regulation
Article 21 – paragraph 4 a (new)
4a. The EU Centre shall provide a “Take it Down” service which allows victims to flag abuse material depicting them and store a fingerprint of that material in a database, and which allows participating interpersonal communications services and hosting services, including social networks, to voluntarily check images uploaded to their platforms against this database. Participating services shall:
(a) take the following measures when a match is found: (i) inform the uploader that the image they are attempting to upload has been identified as child sexual abuse material, and prevent upload; (ii) give the uploader the option to contest the flagging, forwarding the image and fingerprint on to the EU Centre for further analysis; (iii) allow the uploader to provide further information to the EU Centre on the origin of the image;
(b) state clearly that uploads are checked against a database of known abuse material;
(c) provide anonymised statistics to the EU Centre on the number of times an upload of an image with a certain hash was attempted.
2023/05/08
Committee: FEMM
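A minimal sketch of the fingerprint check behind the “Take it Down” service in Amendment 414. A real deployment would use a perceptual hash that survives re-encoding; SHA-256 appears here only to keep the sketch self-contained, and all names are illustrative.

```python
# Illustrative only: the voluntary check against a victim-populated
# fingerprint database, as in Article 21(4a) above.
import hashlib

takedown_db: set[str] = set()  # fingerprints of material flagged by victims

def flag_image(image_bytes: bytes) -> None:
    """Victim side: store only the fingerprint, never the image itself."""
    takedown_db.add(hashlib.sha256(image_bytes).hexdigest())

def check_upload(image_bytes: bytes) -> bool:
    """Service side, called on upload. True means: block the upload,
    inform the uploader, offer the option to contest, and count the
    attempt in the anonymised statistics sent to the EU Centre."""
    return hashlib.sha256(image_bytes).hexdigest() in takedown_db
```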
Amendment 445 #
Proposal for a regulation
Article 34 – paragraph 3 – subparagraph 1 a (new)
Users shall have the right to be informed of the outcome of the complaint.
2023/05/08
Committee: FEMM
Amendment 477 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 a (new)
(6a) facilitate and coordinate cooperation, including information sharing, with international law enforcement organisations and law enforcement authorities in third countries, in full respect of data protection rules;
2023/05/08
Committee: FEMM
Amendment 478 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1). This shall include technologies both for preventative and proactive detection orders.
2023/05/08
Committee: FEMM
Amendment 496 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse in the internal market by persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
2023/07/28
Committee: LIBE
Amendment 499 #
Proposal for a regulation
Article 56 – paragraph 1
1. The Management Board shall be composed of one representative from each Member State, two representatives of the Commission, a representative of the European Parliament, and a representative from the European Data Protection Board, all as members with voting rights.
2023/05/08
Committee: FEMM
Amendment 502 #
Proposal for a regulation
Article 56 – paragraph 2 – subparagraph 1
The Management Board shall also include one independent expert observer designated by the European Parliament, without the right to vote.
deleted
2023/05/08
Committee: FEMM
Amendment 504 #
Proposal for a regulation
Article 56 – paragraph 2 – subparagraph 2
Europol shall designate a representative to attend the meetings of the Management Board as an observer on matters involving Europol, at the request of the Chairperson of the Management Board.
2023/05/08
Committee: FEMM
Amendment 524 #
Proposal for a regulation
Article 66 – paragraph 1
1. The Technology Committee shall consist of technical experts appointed by the Management Board in view of their excellence and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union, as well as a representative from the European Data Protection Board.
2023/05/08
Committee: FEMM
Amendment 594 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) “person suspected of being involved in child sexual abuse” means an identified individual person about whom verifiable adequate evidence exists, which gives rise to the suspicion that that person has committed a child sexual abuse offence, attempted to commit a child sexual abuse offence, or prepared, by committing a criminal offence, to commit a child sexual abuse offence;
2023/07/28
Committee: LIBE
Amendment 596 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) 'person disqualified from exercising activities involving children' means an identified individual person who, in line with Article 10 of Directive 2011/93/EU, is temporarily or permanently disqualified from exercising activities involving direct and regular contacts with children;
2023/07/28
Committee: LIBE
Amendment 807 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take reasonable and proportionate mitigation measures.
2023/07/28
Committee: LIBE
Amendment 868 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
(c) take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the measures referred to in point (b).
deleted
2023/07/28
Committee: LIBE
Amendment 1128 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communication services that have received a detection order concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
2023/07/28
Committee: LIBE
Amendment 1266 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States of one or more specific items of material that, after a diligent assessment, courts identified as constituting child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1294 #
Proposal for a regulation
Chapter II – Section 5
[...]
deleted
2023/07/28
Committee: LIBE
Amendment 1332 #
Proposal for a regulation
Article 19 a (new)
Article 19a
Respect for Privacy
Nothing in this Regulation shall be interpreted as a requirement to: 1. break cryptography; 2. scan content on users’ devices; 3. restrict anonymous access to online services and software applications.
2023/07/28
Committee: LIBE
Amendment 1698 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1) concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
2023/07/28
Committee: LIBE
Amendment 1701 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
To that aim, the EU Centre shall compile lists of such technologies, having regard to the requirements of this Regulation and in particular those of Article 10(2) and Article 19a (new).
2023/07/28
Committee: LIBE