Activities of Jadwiga WIŚNIEWSKA related to 2022/0155(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
2023/06/28
Committee: FEMM
Dossiers: 2022/0155(COD)
Documents: PDF(331 KB) DOC(193 KB)
Authors: Heléne FRITZON (MEP ID 197391)

Amendments (65)

Amendment 60 #
Proposal for a regulation
Recital 2 a (new)
(2a) Abuse, exploitation and sexual violence against children are becoming more and more commonplace due to fast and advanced ICT (such as webcams, live streaming, social media platforms or computer games); there is a differentiation between genders in terms of online child sexual abuse, where girls are the main target group and are two to three times more vulnerable to sexual abuse than boys; we should note that statistics on the abuse of boys are often underestimated and such cases are less frequently reported;
2023/05/08
Committee: FEMM
Amendment 65 #
Proposal for a regulation
Recital 4
(4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in a technology-neutral and future-proof manner, so as not to hamper innovation.
2023/05/08
Committee: FEMM
Amendment 289 #
Proposal for a regulation
Recital 2
(2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services. Considering the importance of the right to privacy, including the protection of personal data, as guaranteed by the Charter of Fundamental Rights, nothing in this Regulation should be interpreted in a way that would enable future broad-based mass surveillance.
2023/07/28
Committee: LIBE
Amendment 323 #
Proposal for a regulation
Recital 13 a (new)
(13a) In order to protect children, this Regulation should take into account the concerning hypersexualized use of children's images in advertising campaigns and the increasing spread of cultural pseudo-pedophilia also fuelled by fundraising campaigns.
2023/07/28
Committee: LIBE
Amendment 335 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification, parental control tools and functionalities enabling self-reporting by children, their parents or legal guardians, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
2023/07/28
Committee: LIBE
Amendment 384 #
Proposal for a regulation
Article 19 – paragraph 1
Providers of relevant information society services shall not be legally liable for child sexual abuse offences if, in good faith and with due diligence, they carry out the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocking or reporting online child sexual abuse in accordance with those requirements.
2023/05/08
Committee: FEMM
Amendment 384 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Nothing in this Regulation should therefore be interpreted as prohibiting end-to-end encryption or making it impossible. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users, while ensuring the effective detection of online child sexual abuse and the balance of all the fundamental rights at stake.
2023/07/28
Committee: LIBE
Amendment 385 #
Proposal for a regulation
Article 20 – title
Victims’ right to information and access to support
2023/05/08
Committee: FEMM
Amendment 386 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Nothing in this Regulation should therefore be interpreted as prohibiting end-to-end encryption or making it impossible or leading to any form of general monitoring. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users. Under no circumstances should this Regulation be interpreted or used as an instrument of mass surveillance and monitoring.
2023/07/28
Committee: LIBE
Amendment 389 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1
Persons residing in the Union shall have the right to receive, upon their request or on the request of their legal guardian or legal representative, from the Coordinating Authority designated by the Member State where they reside, age-appropriate information regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12. Persons with disabilities shall have the right to ask and receive such an information in a manner accessible to them.
2023/05/08
Committee: FEMM
Amendment 391 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1 a (new)
Persons residing in the Union shall have the right to receive, upon their request or on the request of their legal guardian or legal representative, from the Coordinating Authority designated by the Member State where they reside, information about universal services and victim support services, taking into consideration their age and gender. Persons with disabilities shall have the right to request such information and receive it in a manner which is accessible to them.
2023/05/08
Committee: FEMM
Amendment 404 #
Proposal for a regulation
Recital 28
(28) With a view to constantly assess the performance of the detection technologies and ensure that they are sufficiently reliable, do not produce too many false positives, identifying the reasons for their appearance, and avoid to the extent possible erroneous reporting to the EU Centre, providers should ensure stringent human oversight and, where necessary and required to uphold the highest possible standards, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular and independent assessment of the rates of false negatives and positives generated by the technologies, based on an analysis of anonymised representative data samples. In particular where the detection of the solicitation of children in publicly available interpersonal communications is concerned, service providers should ensure regular, specific and detailed human oversight and human verification of conversations identified by the technologies as involving potential solicitation of children.
2023/07/28
Committee: LIBE
Amendment 405 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of hosting services shall provide assistance to persons residing in the Union, upon their request or on the request of their legal guardian or legal representative, when they seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
2023/05/08
Committee: FEMM
Amendment 407 #
Proposal for a regulation
Recital 29
(29) Providers of hosting services and providers of publicly available interpersonal communications services are uniquely positioned to detect potential online child sexual abuse involving their services. The information that they may obtain when offering their services is often indispensable to effectively investigate and prosecute child sexual abuse offences. Therefore, they should be required to report on potential online child sexual abuse on their services, whenever they become aware of it, that is, when there are reasonable grounds to believe that a particular activity may constitute online child sexual abuse. In such a case, hosting providers and providers of publicly available interpersonal communication services should be required to secure the disclosed child sexual abuse material and any metadata they hold about that material, including metadata which may indicate the author of the file, the time and circumstances of its creation and the modifications made. Where such reasonable grounds exist, doubts about the potential victim’s age should not prevent those providers from submitting reports. In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. Such awareness could, for example, be obtained through the execution of detection orders, information flagged by users or organisations acting in the public interest against child sexual abuse, or activities conducted on the providers’ own initiative. Those providers should report a minimum of information, as specified in this Regulation, for competent law enforcement authorities to be able to assess whether to initiate an investigation, where relevant, and should ensure that the reports are as complete as possible before submitting them. In the event of an investigation, providers should provide any electronic evidence in their possession, as indicated above, upon request by law enforcement authorities.
2023/07/28
Committee: LIBE
Amendment 409 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Persons residing in the Union shall have the right to receive, upon their request or on the request of their legal guardian or legal representative, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask and receive any information relating to such support in a manner accessible to them.
2023/05/08
Committee: FEMM
Amendment 411 #
Proposal for a regulation
Recital 29
(29) Providers of hosting services and providers of publicly available interpersonal communications services are uniquely positioned to detect potential online child sexual abuse involving their services. The information that they may obtain when offering their services is often indispensable to effectively investigate and prosecute child sexual abuse offences. Therefore, they should be required to report on potential online child sexual abuse on their services, whenever they become aware of it, that is, when there are reasonable grounds to believe that a particular activity may constitute online child sexual abuse. Where such reasonable grounds exist, doubts about the potential victim’s age should not prevent those providers from submitting reports. In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. The providers can obtain such actual knowledge or awareness, inter alia, through their own-initiative investigations, as well as through information flagged or notified by users, self-reported by victims or organizations, such as hotlines, acting in the public interest against child sexual abuse. Those providers should report a minimum of information, as specified in this Regulation, for competent law enforcement authorities to be able to assess whether to initiate an investigation, where relevant, and should ensure that the reports are as complete as possible before submitting them so that competent law enforcement authorities can focus on reports that are most likely to lead to recovery of a child, the arrest of an offender, or both.
2023/07/28
Committee: LIBE
Amendment 412 #
Proposal for a regulation
Recital 29 a (new)
(29a) It is also crucial that hosting providers and providers of publicly available interpersonal communication services cooperate with law enforcement in relation to the detection of potential online child abuse and the possession of key electronic evidence necessary for the proper prosecution of child sexual abuse cases. Therefore, in order to ensure the effective use of secured child sexual abuse material, it is necessary to legally ensure that providers secure not only the media files and instant messaging content themselves, but also their metadata. Metadata is information about documents and files relating to their content and their technical and physical parameters. It also includes information such as the time and place of their creation, information about the devices used in their creation, and about the modifications made to the files. It is reasonable to expect service providers, in the event of the disclosure of child sexual abuse content, to secure it and then hand over, at the request of law enforcement authorities, any data indicated above that constitute electronic evidence in the case. It should be stressed that metadata can constitute important evidence, which will be important for law enforcement in the course of an investigation, and its ephemeral and easily modifiable nature requires it to be secured immediately, as it can contribute to the identification not only of the perpetrator and other persons linked to the uploaded content, but also of the victims.
2023/07/28
Committee: LIBE
Amendment 413 #
Proposal for a regulation
Recital 30
(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, any removal or disabling of access should respect the fundamental rights of the users of the service, including the right to freedom of expression and of information. Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
2023/07/28
Committee: LIBE
Amendment 415 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Providers of hosting services and providers of interpersonal communications services shall preserve the necessary content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
2023/05/08
Committee: FEMM
Amendment 416 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 – point b
(b) reporting information concerning potential online child sexual abuse to the EU Centre pursuant to Article 12;
2023/05/08
Committee: FEMM
Amendment 424 #
Proposal for a regulation
Article 25 – paragraph 5 a (new)
5a. Each Member State shall ensure that a section or department is designated or established within the Coordinating Authority’s office, responsible for creating and disseminating information campaigns aimed at raising awareness amongst the public, especially children, with particular consideration of the gender and age of the potential recipients. The creation and dissemination process should be implemented in consultation and collaboration with the appropriate and competent national bodies.
2023/05/08
Committee: FEMM
Amendment 426 #
Proposal for a regulation
Article 25 – paragraph 7 – point a a (new)
(aa) providing information in terms of expertise and developed techniques for preventing online child abuse and the online dissemination of materials depicting child sexual abuse, with particular consideration of age and gender;
2023/05/08
Committee: FEMM
Amendment 428 #
Proposal for a regulation
Article 25 – paragraph 7 – point a b (new)
(ab) providing support in developing preventative measures, including public awareness-raising campaigns, programmes to improve digital skills and skills in terms of using the Internet and online safety, and ensuring support and access to specialist services and support services for child sexual abuse victims and children in difficult situations.
2023/05/08
Committee: FEMM
Amendment 434 #
Proposal for a regulation
Article 25 – paragraph 9 a (new)
9a. When communicating with or making decisions affecting the victims or persons in high-risk groups, the coordinating body should fully respect human and civil rights of dignity and privacy, as well as take into consideration the gender and age of the victim or party involved.
2023/05/08
Committee: FEMM
Amendment 464 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – introductory part
(6) facilitate the generation of knowledge, the development of tools and techniques and their sharing with and between other Union institutions, bodies, offices and agencies, Coordinating Authorities or other relevant authorities of the Member States to contribute to the achievement of the objective of this Regulation, by:
2023/05/08
Committee: FEMM
Amendment 466 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point a
(a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise, with particular consideration of age and gender, on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51;
2023/05/08
Committee: FEMM
Amendment 468 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b
(b) supporting the development and dissemination of research and expertise, with particular consideration of age and gender, on matters concerning the prevention and combating of online child sexual abuse and on assistance to victims, including by serving as a hub of expertise to support evidence-based policy;
2023/05/08
Committee: FEMM
Amendment 475 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 a (new)
6a) When communicating with or making decisions affecting the victims or persons in high-risk groups, the EU Centre should fully respect human and civil rights of dignity and privacy, as well as take into consideration the gender and age of the victim or party involved.
2023/05/08
Committee: FEMM
Amendment 483 #
Proposal for a regulation
Article 50 – paragraph 2 – introductory part
2. The EU Centre shall collect, record, analyse and make available to the competent national and EU bodies relevant, anonymous, objective, reliable and comparable information, taking into consideration age and gender, on matters related to the prevention and combating of child sexual abuse, in particular:
2023/05/08
Committee: FEMM
Amendment 488 #
Proposal for a regulation
Article 50 – paragraph 3
3. Where necessary for the performance of its tasks under this Regulation, the EU Centre shall carry out, participate in or encourage and support research, surveys and studies, taking into consideration age and gender, either on its own initiative or, where appropriate and compatible with its priorities and its annual work programme, at the request of the European Parliament, the Council or the Commission.
2023/05/08
Committee: FEMM
Amendment 492 #
Proposal for a regulation
Article 50 – paragraph 5
5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. All EU Centre actions should take particular account of age, should be adapted to the specifics and variable requirements of the gender of the recipients, and should fully respect personal dignity and privacy.
2023/05/08
Committee: FEMM
Amendment 495 #
Proposal for a regulation
Article 54 – paragraph 1
1. Where necessary for the performance of its tasks under this Regulation, the EU Centre should cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations and semi-public organisations.
2023/05/08
Committee: FEMM
Amendment 497 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to prevent and address the misuse of relevant information society services for online child sexual abuse in the internal market.
2023/07/28
Committee: LIBE
Amendment 498 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to prevent and address the misuse of relevant information society services for online child sexual abuse in the internal market.
2023/07/28
Committee: LIBE
Amendment 508 #
Proposal for a regulation
Article 56 – paragraph 4
4. Members of the Management Board and their alternates shall be appointed in the light of their knowledge in the field of combating child sexual abuse, taking into account objective and sex-neutral criteria and relevant managerial, administrative and budgetary skills. Member States shall appoint a representative of their Coordinating Authority within four months of [date of entry into force of this Regulation]. All parties represented in the Management Board shall make efforts to limit turnover of their representatives, in order to ensure continuity of its work. All parties shall aim to achieve a balanced representation between men and women on the Management Board.
2023/05/08
Committee: FEMM
Amendment 516 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d a (new)
(da) obligations on providers of online search engines to delist websites which were determined to host child sexual abuse material;
2023/07/28
Committee: LIBE
Amendment 521 #
Proposal for a regulation
Article 64 – paragraph 4 – point p
(p) fostering recruitment of appropriately skilled and experienced EU Centre staff, based on objective and sex-neutral criteria;
2023/05/08
Committee: FEMM
Amendment 523 #
Proposal for a regulation
Article 66 – paragraph 1
1. The Technology Committee shall consist of technical experts appointed by the Management Board in view of their expertise and skills, taking into account objective and sex-neutral criteria and in view of their excellence and their independence, following the publication of a call for expressions of interest in the Official Journal of the European Union.
2023/05/08
Committee: FEMM
Amendment 523 #
Proposal for a regulation
Article 1 – paragraph 3 – point b a (new)
(ba) Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online;
2023/07/28
Committee: LIBE
Amendment 527 #
Proposal for a regulation
Article 1 – paragraph 3 – point d a (new)
(da) Directive (EU) 2022/2555 of the European Parliament and of the Council of 14 December 2022 on measures for a high common level of cybersecurity across the Union, amending Regulation (EU) No 910/2014 and Directive (EU) 2018/1972, and repealing Directive (EU) 2016/1148 (NIS 2 Directive);
2023/07/28
Committee: LIBE
Amendment 528 #
Proposal for a regulation
Article 1 – paragraph 3 – point d a (new)
(da) Directive (EU) 2022/2555 of the European Parliament and of the Council of 14 December 2022 on measures for a high common level of cybersecurity across the Union, amending Regulation (EU) No 910/2014 and Directive (EU) 2018/1972 and repealing Directive (EU) 2016/1148 (NIS 2 Directive);
2023/07/28
Committee: LIBE
Amendment 531 #
Proposal for a regulation
Article 1 – paragraph 3 a (new)
3a. This Regulation shall not have the effect of modifying the obligation to respect the rights, freedoms and principles referred to in Article 6 TEU and shall apply without prejudice to fundamental principles relating to the right to private life and family life and to freedom of expression and information;
2023/07/28
Committee: LIBE
Amendment 535 #
Proposal for a regulation
Article 1 – paragraph 3 b (new)
3b. Nothing in this Regulation shall be interpreted as prohibiting or weakening end-to-end encryption.
2023/07/28
Committee: LIBE
Amendment 536 #
Proposal for a regulation
Article 83 – paragraph 2 – point a – indent 2
– where the report led to the launch of a criminal investigation or contributed to an ongoing investigation, the state of play or outcome of the investigation, including whether the case was closed at pre-trial stage, whether the case led to the imposition of penalties, whether victims were identified and rescued and if so their numbers differentiating by sex and age, and whether any suspects were arrested and any perpetrators were convicted and if so their numbers;
2023/05/08
Committee: FEMM
Amendment 575 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 17 years; (deleted)
2023/07/28
Committee: LIBE
Amendment 576 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 17 years; (deleted)
2023/07/28
Committee: LIBE
Amendment 606 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(wa) "online search engine" means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
2023/07/28
Committee: LIBE
Amendment 607 #
Proposal for a regulation
Article 2 – paragraph 1 – point w b (new)
(wb) 'hotline' means an organization recognized by its Member State of establishment, which provides either a reporting channel provided by law enforcement authorities or a service for receiving anonymous complaints from victims and the public about alleged child sexual abuse online.
2023/07/28
Committee: LIBE
Amendment 631 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 1 a (new)
- the availability of appropriate technical measures - such as parental control tools - to prevent underage access and exposure to inappropriate content or services;
2023/07/28
Committee: LIBE
Amendment 643 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag or notify online child sexual abuse to the provider through tools that are easily accessible and age-appropriate, including already available anonymous reporting channels as provided by Directive (EU) 2019/1937;
2023/07/28
Committee: LIBE
Amendment 652 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- functionalities enabling self-reporting by children, their parents or legal guardians.
2023/07/28
Committee: LIBE
Amendment 740 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) adapting the design, features and functions of their service in order to ensure the highest level of privacy, safety and security by design and by default, in particular, to protect children;
2023/07/28
Committee: LIBE
Amendment 746 #
Proposal for a regulation
Article 4 – paragraph 1 – point a b (new)
(ab) employing appropriate age measures - such as parental control tools - to prevent underage access and exposure to inappropriate content or services;
2023/07/28
Committee: LIBE
Amendment 766 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) enabling users to flag or notify online child sexual abuse to the provider through tools that are easily accessible and age-appropriate, including already available anonymous reporting channels;
2023/07/28
Committee: LIBE
Amendment 771 #
Proposal for a regulation
Article 4 – paragraph 1 – point c b (new)
(cb) enabling safe self-reporting capabilities for children, their parents or legal guardians.
2023/07/28
Committee: LIBE
Amendment 860 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of exploiting children, or where the developer of the software application has informed the software application store that its terms of use do not allow child users, the software application has an appropriate age rating model in place, or the developer of the software application has requested the software application store not to allow child users to download its software applications.
2023/07/28
Committee: LIBE
Amendment 905 #
Proposal for a regulation
Article 7 – paragraph 2 a (new)
2a. The grounds for issuing the order shall outweigh the negative consequences for the rights and legitimate interests of all the parties concerned, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties. The order shall be a measure of last resort and shall be issued on the basis of a case-by-case analysis.
2023/07/28
Committee: LIBE
Amendment 1160 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) not able to weaken end-to-end encryption or to lead to a general monitoring of private communications.
2023/07/28
Committee: LIBE
Amendment 1188 #
Proposal for a regulation
Article 10 – paragraph 4 – point f a (new)
(fa) ensure privacy without hampering the integrity of encryption and without leading to a general monitoring of private communications.
2023/07/28
Committee: LIBE
Amendment 1230 #
Proposal for a regulation
Article 12 – paragraph 3
3. The provider shall establish and operate an accessible, age-appropriate, child-friendly and user-friendly mechanism, including self-reporting tools, that allows users to flag or notify to the provider potential online child sexual abuse on the services. Those mechanisms shall allow for anonymous reporting, including through anonymous reporting channels already available as defined by Directive (EU) 2019/1937.
2023/07/28
Committee: LIBE
Amendment 1247 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) other available data other than content data related to the reported potential online child sexual abuse, including unique identifiers of the user and metadata related to media files and communications;
2023/07/28
Committee: LIBE
Amendment 1259 #
Proposal for a regulation
Article 13 – paragraph 1 – point g a (new)
(ga) whether the provider considers that the report involves an imminent threat to the life or safety of a child or requires urgent action;
2023/07/28
Committee: LIBE
Amendment 1395 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 1
Where Member States designate more than one competent authority, they shall appoint one of those competent authorities as their Coordinating Authority for child sexual abuse issues ('Coordinating Authority'). Where they designate only one competent authority, that competent authority shall be the Coordinating Authority.
2023/07/28
Committee: LIBE
Amendment 1423 #
Proposal for a regulation
Article 26 – paragraph 2 – point c
(c) are free from any undue external influence, whether direct or indirect, in line with their national legislation;
2023/07/28
Committee: LIBE
Amendment 1743 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 1
Europol and the EU Centre shall provide each other with the fullest possible access to relevant information and information systems, where necessary for the performance of their respective tasks and in accordance with the acts of Union law regulating such access. Any access to personal data processed in Europol's information systems, where deemed strictly necessary for the performance of the EU Centre's tasks, shall be granted only on a case-by-case basis, upon submission of an explicit request indicating the specific purpose and justification. Europol shall be required to diligently assess those requests and only transmit personal data to the EU Centre where strictly necessary and proportionate to the required purpose.
2023/07/28
Committee: LIBE