
12 Amendments of Carles PUIGDEMONT I CASAMAJÓ related to 2022/0155(COD)

Amendment 297 #
Proposal for a regulation
Recital 3
(3) Member States and regional authorities are increasingly introducing, or are considering introducing, national and regional laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, may have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.
2023/07/28
Committee: LIBE
Amendment 332 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) 2022/2065 may consider to which extent mitigation measures adopted to comply with that obligation. Mitigation measures may include designing their online interfaces or parts thereof with the highest level of privacy, safety and security for children by default or adopting standards for protection of children, or participating in codes of conduct for protecting children, targeted measures to protect the rights of the child, including functionalities enabling age assurance and age scoring, and age-appropriate parental control tools. Enabling flagging and/or notifying mechanisms and self-reporting functionalities may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
Communication service providers (CSPs) may take voluntary measures to detect and remove child sexual abuse material (CSAM) from their services, provided that such measures are proportionate, necessary, and respectful of users' privacy and other fundamental rights. CSPs that take such measures shall comply with the following requirements:
(a) Any measures taken to detect or remove CSAM must be based on clear, transparent, and publicly available criteria, which should be regularly reviewed and updated as necessary to reflect changes in technology and legal developments.
(b) CSPs shall inform their users about the nature and extent of the measures taken to detect and remove CSAM, including any impact on users' privacy and other rights.
(c) CSPs shall ensure that any measures taken to detect and remove CSAM are subject to appropriate oversight and accountability mechanisms, which should be designed to ensure that the measures are effective, proportionate, and respectful of users' rights.
(d) CSPs shall cooperate with relevant competent authorities, including law enforcement authorities, to prevent and combat CSAM, and to support the identification and rescue of victims of child sexual exploitation and abuse.
2023/07/28
Committee: LIBE
Amendment 387 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection with the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
2023/07/28
Committee: LIBE
Amendment 390 #
Proposal for a regulation
Recital 26 a (new)
(26a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of the end-to-end encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or weakening end-to-end encryption. However, to the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, providers should be authorised by the competent judicial authority or another independent administrative authority to process metadata that can detect suspicious patterns of behaviour without having access to the content of the encrypted communication.
2023/07/28
Committee: LIBE
Amendment 440 #
Proposal for a regulation
Recital 49
(49) In order to verify that the rules of this Regulation, in particular those on mitigation measures and on the execution of voluntary detection, removal, blocking or delisting orders that it issued, are effectively complied with in practice, each Coordinating Authority should be able to carry out searches, using the relevant indicators provided by the EU Centre, to detect the dissemination of known or new child sexual abuse material through publicly available material in the hosting services of the providers concerned.
2023/07/28
Committee: LIBE
Amendment 441 #
Proposal for a regulation
Recital 49 a (new)
(49a) Detection orders, which would require communication service providers to monitor their users' online activities for the purpose of detecting child sexual abuse material (CSAM), should only be imposed as a last resort in cases where a provider is found to be acting in bad faith and failing to cooperate with competent authorities. The use of detection orders should be proportionate, necessary, and subject to strict safeguards, and should only be authorized by a judicial authority or other independent oversight body. In any case, users should not be punished for merely using a communication service, and any measures taken to detect or remove CSAM should be implemented in a manner that respects users' privacy and other fundamental rights.
2023/07/28
Committee: LIBE
Amendment 874 #
Proposal for a regulation
Article 6 a (new)
Article 6a
Encrypted services and metadata processing
1. Nothing in this Regulation shall be interpreted as prohibiting or weakening end-to-end encryption.
2. On the basis of the risk assessment submitted and, where applicable, further information, the Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to authorise a provider of hosting services or a provider of interpersonal communications services to process metadata to the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse. When assessing whether to request the processing of metadata, the Coordinating Authority shall take into account any interference with the rights to privacy and data protection of the users of the service that such processing entails and determine whether, in that case, the processing of metadata would be effective in mitigating the risk of use of the service for the purpose of child sexual abuse, and that it is strictly necessary and proportionate.
3. Without prejudice to Regulation (EU) 2016/679, providers shall inform the users of such processing in their terms and conditions, including information on the possibility to submit complaints to the competent data protection authorities concerning the relevant processing and on the avenues for judicial redress.
2023/07/28
Committee: LIBE
Amendment 1163 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) not able to weaken end-to-end encryption.
2023/07/28
Committee: LIBE
Amendment 1189 #
Proposal for a regulation
Article 10 – paragraph 4 – point f a (new)
(fa) ensure privacy by design and by default and, where applicable, without hampering the integrity of encryption.
2023/07/28
Committee: LIBE
Amendment 1341 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1
Victims shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where they reside, information regarding any instances where the dissemination of known child sexual abuse material depicting them is reported to the EU Centre pursuant to Article 12. Persons with disabilities shall have the right to ask for and receive such information in a manner accessible to them.
2023/07/28
Committee: LIBE
Amendment 1477 #
Proposal for a regulation
Article 35 – paragraph 4 a (new)
4a. Member States shall ensure that penalties imposed for the infringement of this Regulation do not encourage the over-reporting or the removal of material which does not constitute child sexual abuse material.
2023/07/28
Committee: LIBE
Amendment 1649 #
Proposal for a regulation
Article 46 – paragraph 6 – subparagraph 2
The EU Centre shall diligently assess those requests and only grant access where it considers that the requested access is necessary for and proportionate to the specified purpose, and in accordance with Union law.
2023/07/28
Committee: LIBE